Categories
The Industry

Unwinding the PSTN

This blog is aimed mostly at telephone companies and the various CLECs that have been operating on the legacy Public Switched Telephone Network (PSTN). This network has been used to interconnect with local incumbent offices and tandem switches and to connect to 911 centers, operator services, cellular carriers, and other neighboring carriers.

At CCG, we are finally starting to see that network being shut down, route by route and piece by piece. But like everything related to operating in the regulated legacy world, it’s not easy to disconnect the PSTN connections called trunks. The big incumbent telcos like AT&T, Verizon, CenturyLink, and others will continue to bill for these connections long after they stop being functional.

I don’t use this blog to make many pitches for my consulting practice, but I think we’re one of the few consultants left in the industry that can help unwind and stop the billing for the old PSTN network arrangements. We spent many years helping ILECs and CLECs order these connections in the first place. The ordering process for the PSTN has always been complicated and cryptic, and carriers need to go through those same systems to disconnect a circuit. You often can’t stop the billing by calling or writing to the incumbents – network arrangements need to be unwound in the reverse order in which they were built.

It’s not surprising that this is hard to do. The ordering systems were made difficult on purpose after the big telcos decided they didn’t like the requirement of the Telecommunications Act of 1996 that they share their networks with other carriers. After the FCC orders implementing the Act, the big telcos purposefully made it hard to initiate a connection with them – and now it’s just as hard to disconnect one. The big telcos will be glad to continue to bill for circuits for years after they no longer work.

I have no idea how long it’s going to take the PSTN to die, but it’s finally starting to be disassembled, piece by piece. In some ways, it’s a shame to see this network die because it was the first nationwide communications network. It was built right, and it was reliable. Outages came from the same issues that still plague networks today – a fiber cut has always been able to isolate a town or a region from the PSTN.

Sadly, the big telcos never spent the money to create route redundancy. Folks like me have shouted for decades that there was no way to justify multi-day rural network outages when we know how to solve the problem. These outages are still happening today – and the fibers that carry the PSTN are often the same fiber routes that act as the only broadband backbone route into a rural area.

I remember twenty years ago when I had a few small telephone company clients who were willing to solve the redundancy problem by building a new fiber route. We were shocked when Verizon and AT&T refused to connect the new routes into the PSTN. Apparently, the big telcos were more worried about being bypassed than they were about having a more reliable network.

Over time, and as a result of some orders from State regulators, the big telcos allowed route redundancy when somebody else paid for it. Today, large carriers like Level 3, Zayo, and many others cross the country with alternate transport routes, but unfortunately, there are still a lot of rural places where the only available fiber comes from the incumbents.

If you are having problems disconnecting or rearranging connections with other carriers, give us a shout. This could be connections with a large telco, with cellular towers, or with other local carriers. You can contact Derrel Duplechin at CCG at dduplechin@ccgcomm.com. We hate to see the PSTN starting to go. But even more, we hate to see folks who can’t figure out how to get a divorce from the big telcos.

Categories
The Industry

Starlink’s New Business Broadband

Starlink has quietly updated its business broadband offerings. The original plan for businesses was $500 per month with a two-terabyte data cap. If a customer exceeded the data cap, speeds were reduced to 1 Mbps for the remainder of the month unless the customer bought additional broadband at $1 per gigabyte. Starlink business comes with a premium high-performance (HP) antenna at a one-time cost of $2,500.

The new plans are:

  • 1 TB Data Cap. $250/month plus $2,500 equipment costs.
  • 2 TB Data Cap. $500/month plus $2,500 equipment costs.
  • 6 TB Data Cap. $1,500/month plus $2,500 equipment costs.

Extra data now costs $0.50 per additional gigabyte.
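As a quick illustration of how the overage pricing interacts with the caps, here is a minimal sketch using only the prices listed above; the usage figure is made up:

```python
# Illustrative Starlink business bill with overage charges.
PLANS = {1000: 250, 2000: 500, 6000: 1500}   # cap in GB -> monthly price in $
OVERAGE_PER_GB = 0.50

def monthly_cost(cap_gb, usage_gb):
    """Base plan price plus $0.50 for each GB over the cap."""
    overage_gb = max(0, usage_gb - cap_gb)
    return PLANS[cap_gb] + overage_gb * OVERAGE_PER_GB

# A business on the 1 TB plan that uses 1.5 TB in a month:
print(monthly_cost(1000, 1500))   # 250 + 500 * 0.50 = 500.0
```

Note that at that usage level, the 1 TB plan with overages costs exactly as much as the 2 TB plan – presumably the point of the tiering.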

Starlink promises faster speeds for businesses with the HP business antenna. This antenna has a 35% better field of view, is less sensitive to hot weather, handles rain better, and melts snow faster. The company now claims the following speeds on its website:

              Download          Upload
Residential   20 – 100 Mbps     5 – 15 Mbps
Business      40 – 220 Mbps     8 – 25 Mbps
RV            5 – 50 Mbps       2 – 10 Mbps

Interestingly, the speed claims above from the Starlink website are a lot slower than what was promised as recently as September 2022. For example, residential customers were told in 2022 that download speeds would be between 50 and 200 Mbps, with upload speeds of 10 to 20 Mbps. Customers have been saying online that speeds are getting slower – something that has been validated by Ookla speed tests.

In the most recent FCC maps, Starlink claims speeds up to 350/40 Mbps. That matches the maximum speeds that Starlink promised to business customers in September 2022. We’ll have to see if the company drops the speeds claimed to the FCC now that it has dropped its maximum claimed speed down to 220/25 Mbps.

On May 9, Starlink notified customers that it would no longer deprioritize traffic after a customer hits the monthly data cap – customers had been throttled to speeds as slow as 1 Mbps. Now, customers can sign up to be billed automatically for excess data usage.

To some degree, the business offering is going to be a concern for some residential customers since business customers will get bandwidth priority. That might make a difference in neighborhoods with multiple business customers.

It will be interesting to see how Starlink performs over the long run. The company still has plans to add many thousands of satellites. But the company still has a waiting list of customers – and the company admits that it can easily get oversubscribed in a neighborhood.


In 2021, Elon Musk said that he foresaw a future where Starlink could provide backhaul bandwidth to rural cell towers. That may still be coming, but not with the current constellation. The speeds above are not nearly what a cell tower owner wants to buy. Even the most rural small cell site is going to want a steady 500 Mbps of bandwidth, with a more typical requirement of 1 Gbps. I would think that residential subscribers have to hope the company never sells backhaul to cell towers, or coverage at peak times could suffer.

Categories
Regulation - What is it Good For? The Industry

The Rural Cellular Crisis

Over the last few years, I have helped dozens of counties get ready for the upcoming giant broadband grants. We’ve been very successful in helping counties identify the places in their county that don’t have broadband today – a picture that is often drastically different from what is shown by the FCC maps. We then help county governments reach out to the ISPs in the region and open a dialog with the goal of making sure that all rural locations get better broadband. This takes a lot of work – but it’s satisfying to see counties that are on the way to finding a total broadband solution.

In working with these counties, one thing has become clear to me: some of these counties have a bigger cellular coverage problem than a broadband problem. There are often many more homes in a county that lack adequate cellular coverage than homes that can’t buy broadband.

The counties I’ve helped have reached out to me – either directly or through an RFP looking for a consultant. Only a tiny number of these counties identified their cellular problem up front when they hired me. Yet, when I talk to residents and businesses in a county, I hear more horror stories about poor cellular coverage than about poor broadband coverage.

I always knew that the cellular coverage maps published by the big cellular carriers were overstated. You might recall back before cellular advertising was all about 5G that the cellular carriers would all claim to have the best cellular coverage. They would proudly show their coverage map in the background on ads and on their websites to show how they covered most of the country.

I’ve come to learn that those maps were pure garbage. They weren’t just exaggerations – when you drilled down to look at specific counties, they were outright fabrications. I’ve worked recently with two counties that are home to major universities and one that contains a state capital. In all three of these counties, cellular coverage dies soon after people leave the biggest urban center.

If anything, I think that cellular coverage has gotten worse with the introduction of the spectrum that the carriers are all claiming as 5G. These are new frequency bands that have been introduced in the last few years to relieve the pressure on the 4G LTE networks. It makes sense that coverage would be reduced with the higher frequencies because one of the first rules of wireless technology is that higher frequencies tend to dissipate more quickly than lower frequencies. When I hear the complaints in these counties, I have to think that the 5G spectrum is not carrying as far into the rural areas.
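The physics behind that intuition is textbook radio engineering rather than anything carrier-specific: free-space path loss grows with frequency, so a mid-band “5G” carrier arrives weaker than a low-band LTE carrier over the same distance. A minimal sketch using the standard FSPL formula, with example frequencies chosen for illustration:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Low-band LTE at 600 MHz vs. mid-band "5G" at 3.5 GHz, both at 10 km.
print(round(fspl_db(10, 600), 1))    # ~108.0 dB
print(round(fspl_db(10, 3500), 1))   # ~123.3 dB, about 15 dB more loss
```

That 15 dB difference alone eats a large chunk of a rural cell’s link budget before terrain, foliage, and building losses even enter the picture.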

This is a problem that is well-known to everybody in the industry, including the FCC. Back before the pandemic, the FCC came up with a plan to spend $9 billion from the Universal Service Fund to build and equip new rural cellular towers – using a reverse auction much like RDOF. The process derailed quickly when the biggest cellular companies produced bogus maps that showed decent coverage in rural areas close to some of the smaller cellular carriers. The FCC was so disgusted by the lousy maps that it tabled the subsidy plan.

The FCC finally reconsidered this idea in 2021. Cellular carriers are now required to produce maps every six months at the same time that ISPs report broadband coverage. If you haven’t noticed, you can see claimed cellular coverage on the same dashboard that shows the broadband map results. I haven’t spent much time digesting the new cellular maps since all of my clients are so focused on broadband. But I checked the maps in the region around where I live, and they still seem to exaggerate coverage. This should get better once wireless carriers are required to file heat maps of the coverage around each transmitter – we’ll have to see what that does to the claimed coverage. It’s going to get harder for a wireless carrier to claim to cover large swaths of a county when it’s only on a tiny handful of towers.

There is a supposed way for folks to help fix the cellular maps. The FCC has a challenge process that requires taking a speed test using the FCC cellular speed test app. Unfortunately, this app requires a lot of speed tests in a given neighborhood before the FCC will even consider the results. I’m doubtful that most rural folks know of this app or are motivated enough to stop along the side of the road and repeatedly take the speed tests. And frankly, who knows if it will make any real difference even if they do?

The big cellular companies have clearly not invested in many new rural cell towers over the last decade because they’d rather have the FCC fork out the funding. I haven’t the slightest idea if $9 billion is enough money to solve the problem or even put a dent in it. No doubt, the FCC will saddle the program with rules that add to the cost and result in fewer towers being built. But whatever is going to happen needs to start happening soon. We are now a mobile society, and it’s outrageous that a lot of people can’t make a call to 911, let alone use all of the features that are now controlled by our cell phones.

Categories
Technology

Using Fiber as a Sensor

I am an admitted science nerd. I love to spend time crawling through scientific research papers. I don’t always understand the nuances since scientific papers are often written in dense jargon, but I’ve always been fascinated by scientific research because it presages the technology of a few decades from now.

I ran across research by Nokia Bell Labs concerning using fiber as a sensor. Scientists there have been exploring ways to interpret the subtle changes that happen in a long strand of fiber. The world is suddenly full of fiber strands, and scientists want to know if they can discern usable real-life data from measuring changes in the fiber.

They are not looking at the data transmitted by the light inside the fiber. Fiber electronics have been designed to isolate the light signal from external stimuli – we don’t get a degraded signal when a fiber cable is swaying in the wind. We probably don’t marvel enough at the steady and predictable nature of a fiber light signal.

The research is exploring whether the physical attributes of the fiber can be used to predict problems in the network before they occur. If a network operator knows that a certain stretch of fiber is under duress, steps can be taken to address the issue long before there is a fiber outage. Developing ways to interpret the stresses on fiber would alone justify the research many times over.

But scientists can foresee a much wider range of sensor capabilities. Consider a fiber strung across a bridge. It’s hard to measure tiny shifts in the steel infrastructure of a bridge. However, a fiber cable across the bridge can sense and measure subtle changes in tension and might reveal the way the bridge is shifting long before it becomes physically obvious.
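It’s possible to get an intuition for the signal-processing side of this kind of monitoring with a toy example. The sketch below is purely illustrative and is not the Bell Labs method – it simply flags readings that deviate sharply from a recent baseline, the way a monitoring system might flag a sudden change in strain on a fiber:

```python
import statistics

def find_anomalies(readings, window=20, z_thresh=4.0):
    """Flag samples that deviate sharply from the recent baseline."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1e-9   # guard against zero
        z = abs(readings[i] - mean) / stdev
        if z > z_thresh:
            alerts.append((i, readings[i], round(z, 1)))
    return alerts

# Simulated strain readings with a sudden shift at sample 30.
readings = [1.00 + 0.01 * (i % 3) for i in range(30)] + [1.50] + [1.01] * 10
print(find_anomalies(readings))   # flags the jump at index 30
```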

There is already some physical sensing used to monitor undersea fibers – but more can be done. Fiber can possibly measure changes in temperature, current flows, and seismic activity along the full length of these long routes. Scientists have developed decent sensors for measuring underground faults on land, but it’s much harder to do in the depths of the open ocean.

To test the capability to measure and interpret changes to fiber, Bell Labs scientists built a 524-kilometer fiber route between Gothenburg and Karlstad in Sweden as the first test bed for the technology. This will allow them to measure a wide range of environmental data to see what can and cannot be done with the sensing technology.

It’s hard to know where this research might go, which is always the case with pure research. It’s not hard to imagine uses if the technology works as hoped. Fiber might be able to identify and pinpoint small forest fires before they spread and grow larger. Fibers might serve as an early warning system for earthquakes long before we’d know about them in the traditional way. The sensing might also be useful as a way to identify minor damage to fiber – we know about fiber cuts, but there is often no feedback today from lesser damage that can grow over time and finally result in an outage.

Categories
Regulation - What is it Good For?

Will Congress Fund the ACP?

The clock is ticking on the Affordable Connectivity Program (ACP). Current estimates show the program may run out of funding as soon as the end of the first quarter of 2024, ten months from now. The ACP provides a $30 monthly discount to eligible households and up to a $75 monthly discount to households on Tribal lands. The program had a little over 9 million enrolled households at the start of 2022 and was up to over 18 million by March 2023. You can see the enrollment statistics on this website.
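A back-of-the-envelope burn-rate calculation shows why the money runs out so quickly. This is a rough sketch – the enrollment figure comes from the statistics above, while the remaining-balance figure is an assumption for illustration, not an official number:

```python
# Rough ACP runway estimate - illustrative only.
enrollees = 18_000_000    # enrolled households as of March 2023
benefit = 30              # standard monthly discount in dollars

monthly_burn = enrollees * benefit    # ~$540M per month
assumed_balance = 5.4e9               # hypothetical remaining funds

print(f"Monthly burn: ${monthly_burn / 1e6:.0f}M")
print(f"Runway: {assumed_balance / monthly_burn:.0f} months")  # ~10 months
```

At a burn rate north of half a billion dollars per month, even small changes in enrollment move the exhaustion date by months, which is why the estimates keep shifting.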

The only solution for keeping ACP operating is for Congress to refill the ACP funding bucket somehow. This topic was discussed at the recent House oversight hearings on broadband. Angela Siefer of NDIA (National Digital Inclusion Alliance) testified at that hearing and said that reauthorizing ACP was one of the biggest broadband issues on the plate for Congress. She talked about the many gains that have been made in getting broadband to low-income homes.

ACP was not created through a normal budget appropriations bill but was funded with $14.2 billion from the Infrastructure Investment and Jobs Act (IIJA). There was also rollover funding of $2.2 billion from the previous Emergency Broadband Benefit program, which had been funded by the Consolidated Appropriations Act, 2021. That was one-time funding, which means specific legislation is needed to keep the program running.

There has been talk of moving responsibility for the ACP to the FCC’s Universal Service Fund. But that would mean the agency would have to find a new way to pay for it – the current fees levied on interstate telecommunications are not nearly large enough to absorb the ACP obligation. Congress has already been considering ways to eliminate the FCC’s Lifeline fund, so the FCC might not be a politically viable solution.

Big ISPs are in favor of the ACP. The largest recipient of the funding is Charter, and Comcast is the fourth largest. One of the things that makes it harder to continue the funding for ACP is that eleven of the top fifteen recipients of ACP are wireless carriers. There is some concern that there is fraud embedded in the claims of some of these companies, which gives ammunition to those who don’t want to see the subsidy continue.

For ISPs, one of the biggest issues that will arise from the end of the ACP is that the upcoming BEAD grants require ISPs to have a low-income plan. Most ISPs have been pointing to the ACP as their low-income solution. But if the ACP expires, ISPs will have to develop a self-funded discount plan in order to win grant funding.

Anybody who has been watching Congress this year understands the challenge of getting a divided Congress to agree to continue funding a subsidy program. Many DC pundits are convinced that there will be very little bipartisan legislation passed in 2023 and 2024. There has been a lot of recent effort aimed at getting more folks enrolled in ACP – but that effort will mean very little in the long run if the program runs out of money.

Categories
Regulation - What is it Good For?

Our Uneven Regulatory System

The Florida Legislature recently passed a bill that brings poles under state jurisdiction for any electric cooperative that elects to enter the broadband business. That would be a change from current rules that exempt cooperatives and municipally-owned electric companies from federal and state pole regulation.

This blog isn’t going to debate the pros and cons of this specific legislation but will instead point to our uneven regulatory environment in the U.S. Over the years, State legislatures have passed laws that create regulatory rules that apply only to specific entities and not to everybody.

Broadband laws affecting electric cooperatives are a good example. Over the last several decades, a lot of states passed laws that prohibited electric cooperatives from entering the broadband business. These laws were clearly prompted by telephone and cable companies that didn’t want a new competitor. In the last few years, many legislatures reversed these laws due to public pressure to allow local electric cooperatives to bring fiber broadband. These new laws have been effective – a Google search tells me that 250 of the 900 electric cooperatives are either offering broadband or have plans to do so.

Most of the legislation that created these exceptions has been aimed at squelching broadband competition. A lot of states have laws that either prohibit or restrict municipalities from entering the broadband business. It’s hard to get a precise count these days because the laws against municipal broadband range from outright bans to rules that create hurdles for a municipality that other competitors don’t face.

Most of the restrictions against municipal broadband are written in a way that makes it seem like there is a path for cities to provide broadband. But such laws often have a kicker that makes it extremely difficult for a municipality to comply.

As an example, North Carolina imposes a long list of requirements on a municipality that wants to offer broadband. A public entity must impute phantom costs into their rates, conduct a referendum before initiating service, forego popular financing mechanisms like bond financing, refrain from using typical industry pricing plans, and must make their commercially sensitive information available to their incumbent competitors. These rules collectively make it nearly impossible for a municipality to launch a broadband business.

As the new Florida law shows, it’s a never-ending battle, with incumbents trying to legislate away competition. Last year there was a major push in Ohio for new legislation that would prohibit municipalities from providing broadband. The bill was ultimately defeated, but several laws that create hurdles to market entry seem to be introduced around the country every year.

Because of the push to get broadband to everybody, some restrictions have been relaxed in the last few years. It’s now a bit easier for municipalities in Arkansas to offer broadband. In Washington, there was a law prohibiting the municipal Public Utility Districts (county-wide electric companies) from offering any broadband except open-access. Last year, in a bizarre legal move, the Washington legislature passed two conflicting laws that allow PUDs to offer retail broadband. The Governor simultaneously signed the two bills rather than choose one of them, and now it’s unclear what the law is. Colorado just lifted the requirement that municipalities must hold a referendum before entering the broadband business.

It’s not surprising that there are laws that restrict some entities from becoming ISPs. Every textbook about monopoly power and abuse predicts that monopolies can’t resist the temptation to quash competitors.

Categories
Improving Your Business The Industry

Community-Wide WiFi

Somebody sent me an article from BocaNewsNow that talks about the trend that Hotwire is seeing in communities that want broadband everywhere. Residential communities in Florida are investing in outdoor WiFi networks that allow residents to connect to broadband from everywhere on a property, including tennis courts, lakefronts, and common community areas.

Communities are advertising ubiquitous broadband as an attractive amenity, and homeowners associations are investing in the technology at the prompting of residents.

It’s an interesting idea, but not a new one. Folks might remember the municipal WiFi craze of twenty years ago when cities everywhere were considering installing massive outdoor WiFi networks as a way to provide broadband to everybody. This was such a hot topic that there was even a magazine for municipal WiFi and conventions where folks came to learn about it. The largest such experiment was in Philadelphia, but there were many other cities that tried this on a smaller scale.

All of the early attempts to create massive outdoor WiFi networks failed. The main reason for the failure was technical. The technology required deploying large numbers of pole- or building-mounted radios that operated in a mesh network. The radios were mounted fairly close together so that there was a radio every few blocks in all directions. The advantage of a giant mesh network was that a customer walking around a community never left the network and didn’t have to keep logging in to maintain the same connection.

But there was a giant downside that was never solved. The mesh radios constantly communicated with neighboring radios so the network could reconfigure to avoid a faulty or overloaded radio. It turns out that large early mesh networks spent more bandwidth communicating between neighboring radios than in providing bandwidth to users. The whole concept crumbled once a few cities tried this on any scale.
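The overhead problem also compounded with hop count. A common rule of thumb for the single-radio mesh gear of that era was that usable throughput roughly halved with each wireless hop, since every radio had to receive and retransmit on the same shared channel. A minimal sketch of that decay (the starting link rate is an assumed figure, not from the article):

```python
# Rule-of-thumb throughput decay in a single-radio mesh network.
# Each additional hop halves usable throughput because a radio must
# receive and retransmit on the same shared channel.

link_rate_mbps = 20.0   # assumed raw rate for an early outdoor radio

for hops in range(1, 6):
    usable = link_rate_mbps / (2 ** (hops - 1))
    print(f"{hops} hop(s): ~{usable:.2f} Mbps usable")
# Five hops out, users see ~1.25 Mbps - before counting any of the
# constant radio-to-radio management chatter described above.
```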

The other issue that killed the idea was that home broadband was improving drastically during this same time period. Speeds were climbing from cable companies and telcos, and folks were suddenly able to buy speeds of 6 Mbps to 12 Mbps, which quickly made the 1-2 Mbps speeds on wireless mesh networks feel glacial.

Over the years, outdoor WiFi technology has improved dramatically, like other technologies. Since the early days of the technology, the FCC has approved the 5 GHz and, more recently, the 6 GHz bands of spectrum for use in WiFi networks. Outdoor hotspots that are fed with significant backhaul can now easily deliver speeds adequate for most of the broadband uses expected outdoors – folks can watch videos, join Zoom calls, and use the outdoor WiFi network to stay connected.

Hotwire claims that the demand for outdoor WiFi has also grown due to people now working from home. It’s attractive for employees to take a laptop to the pool or a park rather than be tied to a desk all day.

I’ve talked to a lot of cities that have already expanded or are considering expanding public WiFi to parks and other public areas. The pandemic showed a lot of city officials that there are a lot of folks who need broadband access and don’t have it at home for some reason. It’s one of those amenities that, once you have it, you wonder how you lived without it.

Categories
Regulation - What is it Good For? The Industry

Should We Trust the Companies that Created the Digital Divide?

For those of you who don’t know Bruce Kushnick, he’s been tracking the promises made and broken by Verizon since the 1990s and has written extensively on the issue. His latest article is “NTIA: Require Every State Broadband Agency to Investigate Those Responsible for Creating the State’s Digital Divide.”

Bruce has been arguing eloquently for years that the big telcos like Verizon, AT&T, and CenturyLink caused the rural digital divide by extracting profits from the regulated telephone and broadband businesses in rural and low-income areas while neglecting maintenance and not using any of the profits to modernize the technology. According to Bruce, the only reason we need massive federal grant programs today is to make the investments that the big telcos refused to make for the last several decades.

He argues that the NTIA should require states to investigate how the digital divide was created in rural areas and center cities. He uses the two examples of New Jersey and Los Angeles to make his point. He’s been tracking the promises Verizon has made to the State of New Jersey for the last thirty years. Verizon repeatedly sought regulatory relief through deregulation, along with rate increases that were supposed to fund modernizing the network in the state – upgrades that were never made. When Verizon finally upgraded to fiber, it did so only in the neighborhoods with the lowest costs, avoiding rural areas and most low-income neighborhoods.

I’ve been tracking this issue during my career as well. Consider West Virginia. I remember when Verizon was looking for a buyer for the telco network there as far back as the early 1990s. When big companies are trying to sell a property, they do what valuation folks call “dressing up the pig” – cutting expenses to make the property look more profitable. The cuts are usually deep and drop maintenance below the level needed to keep up with routine repairs.

Verizon didn’t end up selling the West Virginia network until the sale to Frontier in 2010. By then, the network had been neglected for more than fifteen years. Frontier made only minimal upgrades to the properties it purchased – but it’s hard for an outsider to know if this was due to an intention to keep milking cash flow out of the acquired network, as Verizon had done, or due to a lack of capital and the weight of the heavy debt used to buy the property. In any case, the West Virginia network continued to degrade under Frontier’s ownership.

For years, Bruce has made the point that there has been no financial or regulatory cost to the big telcos for their bad behavior. They’ve repeatedly broken promises made to states. They’ve routinely milked profits out of networks while ignoring customers as the properties deteriorated.

In fact, we’ve seen the opposite of penalties. For example, the big telcos were rewarded with over $10 billion of CAF-II subsidies to support dying and neglected rural DSL networks. That money was supposed to be used to increase rural data speeds to 10/1 Mbps at a time when that speed was already obsolete. We’ve seen far too many places where even that basic upgrade was not made.

Bruce’s conclusion is that it would be ludicrous to give grant funding now to the companies that caused the digital divide in the first place. That would be using public money to upgrade the networks for these companies when profits should have been used over the decades to do so. He makes a solid argument that giving money to these same companies will not solve the digital divide since there is no reason to think the big telcos won’t turn around and do it all over again.

Categories
The Industry

Taking Aim at Junk Fees

Senator Richard Blumenthal of Connecticut introduced Senate Bill S.916, which takes aim at eliminating junk fees – fees that are not included in the advertised price of a product but get added on after a customer buys the product or service. These fees were attacked this year by President Biden in the State of the Union Address.

The bill language is clear: “A covered entity shall clearly and conspicuously display, in each advertisement and when a price is first shown to a consumer, the total price of the good or service provided by the covered entity, including any mandatory fees a consumer would incur during the transaction, which shall not change during the purchase process.” The legislation goes on to give authority to regulate junk fees to the Federal Trade Commission.

Telecom companies, particularly cable companies, are among the worst offenders, with hidden junk fees that are not included in advertising but are added to a customer’s first bill. But telecom isn’t the only such industry – the bill is also aimed at airlines, online ticket sellers, and other industries that routinely advertise prices lower than what a consumer is ultimately charged.

It’s clear why companies use junk fees since the practice gives them the ability to advertise super-low rates to attract customers. Consider the junk fees charged by Comcast. Comcast is not unusual, and the hidden fees of other large cable companies are similar.

Comcast routinely advertises low rates to attract new cable TV customers. A customer who buys at an advertised special rate will get a first bill with a lot of junk fees that are not included in the advertised price – or else buried deep in small-print footnotes.

  • Comcast has a broadcast fee of $28.70 per month. Comcast has accumulated annual increases in programming costs into this side fee instead of raising the basic price of cable.
  • Most markets have a regional sports fee. This fee varies by market and can range from $4 to $8. It represents the accumulated increases in sports programming costs that have not been added to the basic rate.
  • Comcast also charges $9.00 extra for each set-top box – a fee that is not included in the advertised price.

A first-time Comcast customer buying cable at an advertised $30 special rate could get a first bill for almost $75 – a startling difference.
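The arithmetic behind that estimate, using the fees above, is a minimal sketch – the regional sports fee varies by market, so a $6 midpoint is assumed here, along with a single set-top box:

```python
# Reconstructing a first Comcast cable bill from the fees listed above.
advertised_rate = 30.00        # advertised special rate
broadcast_fee = 28.70          # broadcast TV fee
regional_sports_fee = 6.00     # assumed midpoint of the $4-$8 range
settop_box_fee = 9.00          # one set-top box

first_bill = advertised_rate + broadcast_fee + regional_sports_fee + settop_box_fee
print(f"${first_bill:.2f}")    # $73.70 - close to $75 before taxes
```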

Comcast also has hidden fees for broadband. The company charges $15 per month for a WiFi modem. The biggest surprise for many new customers is the Comcast data cap on broadband. The company charges $10 for each 50 GB of data over the data cap limit.

Consumers hate hidden fees. Anybody who has signed up with one of the giant cable companies got a big surprise when they opened their first bill. But by then, most people are locked into a contract that came along with the low advertised rate.

There have been almost no repercussions for the practice. Occasionally, the states will pursue a company for deceptive advertising. For example, in early 2020, Comcast settled a dispute with the Minnesota Attorney General’s Office over false advertising related to sports fees. Comcast refunded $1.4 million to customers and paid a fine of $160,000 – a small penalty for a company that had 16.1 million cable customers at the end of last year, all paying similar fees.

Deceptively low special rates make it unfairly hard to compete against a cable company. A competitor can have prices that are lower than the cable company’s, but hidden fees let the cable company advertise a deceptively lower price. My clients with fiber networks tell me that customers routinely compliment them for not having hidden fees. People have gotten used to signing up for a low rate but paying a lot more – they become instantly loyal to a company that doesn’t play those games. This legislation would hopefully make the big ISPs more truthful – but I’ll believe it when I see it.

Categories
Technology

The Next Big Fiber Upgrade

CableLabs recently wrote a blog announcing the release of the specification for CPON (Coherent Passive Optical Networks), a new fiber technology that can deliver 100 gigabits of bandwidth to home and business nodes. The CPON working group that developed the specification includes seventeen optical electronics vendors, fourteen fiber network operators, CableLabs, and SCTE (the Society of Cable Telecommunications Engineers). For those interested, the new specification can be downloaded here.

The blog notes the evolution of PON from the first BPON technology that delivered 622 Mbps to today’s PON that can deliver 10 gigabits. It notes that current PON technology relies on Intensity-Modulation Direct-Detect (IM-DD) technology, which will reach its speed limit at about 25 gigabits.

The CPON specification instead relies on coherent optical technology, which is the basis for today’s backbone fiber networks that are delivering speeds up to 400 Gbps. The specification calls for delivering the higher bandwidth using a single wavelength of light, which is far more efficient and less complicated than a last-mile technology like NG-PON2 that balances multiple wavelengths on the customer path. This specification is the first step towards adapting our long-haul technology to serve multiple locations in a last-mile network.

There are a few aspects of the specification that ISPs are going to like.

  • The goal is to create CPON as an overlay that will coexist with existing PON technology. That will allow a CPON network to reside alongside an existing PON network and not require a flash cut to the new technology.
  • CPON will increase the effective reach of a PON network from 12 miles today to 50 miles. This would allow an OLT placed in a hut in a city to reach customers well into the surrounding rural areas.
  • CPON will allow up to 512 customers to share a neighborhood node (a quick split of that shared bandwidth is sketched after this list). That means more densely packed OLT cards that will need less power and cooling. On the downside, it also means that a lot of customers can be knocked out of service by a single card failure.
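Even in the worst case, a fully loaded CPON node leaves a healthy per-customer floor. A minimal sketch of the arithmetic, assuming all 512 customers transmit at once:

```python
# Worst-case bandwidth split on a fully loaded CPON node.
node_capacity_gbps = 100      # CPON capacity per the specification
customers_per_node = 512      # maximum customers per node

floor_mbps = node_capacity_gbps * 1000 / customers_per_node
print(f"~{floor_mbps:.0f} Mbps per customer")   # ~195 Mbps

# Customers rarely all transmit at once, so statistical multiplexing
# lets an operator sell speeds far above this worst-case floor.
```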

The blog touts the many benefits of having 100-gigabit broadband speeds in the last mile. CPON will be able to support applications like high-resolution interactive video, augmented reality, virtual reality, mixed reality, the metaverse, smart cities, and pervasive communications.

One of the things not mentioned by the blog is that last-mile fiber technology is advancing far faster than the technology of the devices used in the last mile. There aren’t a lot of devices in our homes and businesses today that can fully digest a 10-gigabit data pipe, and stretching to faster speeds means developing a new generation of chips for user devices. Releasing specifications like this one puts chipmakers on alert to begin contemplating those faster chips and devices.

There will be skeptics who say that we don’t need technology at these faster speeds. But in only twenty years, we’ve gone from dial-up to 10-gigabit broadband technology. None of these skeptics can envision the uses that newer technologies like CPON will enable over the next twenty years. If there is any lesson we’ve learned from the computer age, it’s that we always find a way to use faster technology within a short time after it’s developed.
