
How Many Households Have Broadband? – Part I

FCC Chairman Wheeler made a speech last week about the lack of broadband competition in the country. As part of the speech he released four bar charts showing the percentage of households that have competitive alternatives at download speeds of 4 Mbps, 10 Mbps, 25 Mbps and 50 Mbps. His conclusion was that a large portion of the households in the US can only buy broadband from one or two service providers. I was glad to hear him talking about this.

But unfortunately there is a lot of inaccuracy in the underlying data that he used to come to this conclusion, particularly in the charts showing the slower speeds. The data that the FCC relies on for measuring broadband is known as the National Broadband Map. While the data gathered for that effort results in a Map, it’s really a database, by census block, that shows the number of providers and the fastest data speed they offer in a given area.

A census block is the smallest area of population summarized by the US Census. It is generally bounded by streets and roads and will contain from 200 – 700 homes (with the more populated blocks generally found in urban areas with high-rise housing). A typical rural census block is going to have 200 – 400 homes. The National Broadband Map gathers data from carriers describing the broadband services they offer in each census block. As it turns out, self-reporting by carriers is a big problem when it comes to the accuracy of the Map. In tomorrow’s blog I will show a real-life example of how this affects new deployment of rural broadband.
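For readers who like to see the mechanics, here is a minimal sketch of what a census-block broadband database looks like conceptually. The field names and the records are my own invention, not the actual NTIA schema, but the structure – one row per provider per block, keyed by the block ID – is the essence of the Map.

```python
from dataclasses import dataclass

# Hypothetical record layout for a census-block broadband database.
# Field names and values are illustrative, not the actual NTIA schema.
@dataclass
class BlockRecord:
    census_block: str              # 15-digit census block ID
    provider: str
    technology: str                # e.g. "DSL", "cable", "fixed wireless"
    max_advertised_down_mbps: float

records = [
    BlockRecord("271234567890123", "RuralTelco", "DSL", 3.0),
    BlockRecord("271234567890123", "AcmeWISP", "fixed wireless", 6.0),
]

# "How many providers claim at least 4 Mbps broadband in this block?"
block = "271234567890123"
claiming = [r for r in records
            if r.census_block == block and r.max_advertised_down_mbps >= 4.0]
print(len(claiming))  # -> 1
```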

Broadband service providers don’t generally track their networks by census block, so part of the problem is that census blocks don’t match the physical way that broadband networks are deployed in a rural area. Anybody who lives in rural America understands how utilities work there. In every small town there is a very definite line where utilities like city water and cable TV stop. Those utilities get to the edge of the area where people live and they stop. That doesn’t match up well with census blocks, which tend to extend outward from many small towns to include rural areas. Rural census blocks are not going to conveniently stop where the utilities stop.

There are three widely used rural broadband technologies – cable modem, DSL and fixed wireless. Let’s look briefly at how each of these matches up with the broadband mapping effort. Cable is the easiest because every cable network has a discrete boundary. There is some customer at the end of every cable route, and the next house down the road cannot get cable. So it is not too likely that the cable companies are claiming to serve census blocks where they have no customers.

DSL and fixed wireless are a lot trickier. Both of these technologies share the characteristic that the bandwidth available with the technology drops quickly with distance. For example, DSL can transmit over a few miles of copper from the last DSLAM in the network. The household right next to that DSLAM can get the full speed offered by the specific brand of DSL while the last house at the end of the DSL signal gets only a small fraction of the speed, often with speeds that are not really any better than dial-up.
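To make that falloff concrete, here is a toy model of DSL speed versus loop length. The linear decay and the 15,000-foot reach are my own simplifying assumptions for illustration; real performance depends on wire gauge, age and noise.

```python
# Toy model: DSL speed decays with loop length from the DSLAM.
# The linear decay and 15,000-foot reach are illustrative assumptions.
def dsl_rate_mbps(loop_feet: float, max_rate_mbps: float = 20.0) -> float:
    reach_feet = 15000.0
    if loop_feet >= reach_feet:
        return 0.1                     # dial-up-like residual speed
    return max(0.1, max_rate_mbps * (1 - loop_feet / reach_feet))

for feet in (500, 5000, 10000, 14000):
    print(f"{feet:>6} ft -> {dsl_rate_mbps(feet):4.1f} Mbps")
# 500 ft gets nearly full speed; 14,000 ft gets barely more than dial-up
```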

The same thing happens with fixed wireless. A WISP will install a transmitter on a tower or other tall structure, usually in the small towns where people live, and the customers close to that tower will get decent broadband. But wireless broadband speeds drop rapidly with distance from the transmitter, and if you go more than a few miles from any tower there is barely any bandwidth.

Both telcos and WISPs input their coverage areas into the National Broadband Map database. And in doing so, it appears that they claim broadband anywhere they can provide service of any kind. But for DSL and fixed wireless, that service-of-any-kind area is much larger than the area where they can deliver actual broadband. Remember that broadband is currently defined as the ability to deliver a 4 Mbps download. Because of the nature of their technologies, a lot of the people who can buy something from them will get a product that is slower than 4 Mbps, and at the outer edges of their networks speeds are far slower than that.

I don’t necessarily want to say that the carriers inputting into the system are lying, because in a lot of cases customers can call and order broadband and a technician will show up and install a DSL modem or a wireless antenna. But if that customer is too far away from the network hub, then the product that gets delivered to them is not broadband. It is something slower than the FCC definition of broadband, though probably better than dial-up. But customers with slow connections can’t use the Internet to watch Netflix or do a lot of the basic things that require actual broadband. And as each year goes by, and more and more video is built into everything we do on the Internet, there are more and more web sites and services that are out of reach for such customers.

But unfortunately, there are also areas where it appears that the carriers have declared that they offer broadband where there isn’t any. If you were to draw something like a 5-mile circle around every rural DSLAM and every WISP transmitter you would see the sort of broadband coverage that many rural carriers are claiming. But the reality is that broadband can only be delivered for 2-3 miles, which means that the actual broadband coverage area is maybe only a fourth of what is shown on the Map. If you go door-to-door and talk to people outside of rural towns you will find a very different story than what is shown on the National Broadband Map. Unfortunately, the Chairman’s numbers are skewed by these weaknesses underlying the Map. There are a lot more rural Americans without broadband than the Map counts, and rural America has far fewer broadband options than the Chairman’s charts claim.
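The one-fourth figure is simple geometry: coverage scales with the square of the radius, so halving the honest radius quarters the honest area.

```python
import math

# A claimed 5-mile radius versus a real 2.5-mile broadband reach:
# area scales with radius squared, so the honest area is one fourth.
claimed_area = math.pi * 5.0 ** 2
actual_area = math.pi * 2.5 ** 2
print(actual_area / claimed_area)  # 0.25
```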

Tomorrow, a real-life example.


The FCC’s Data Collection Effort


The FCC just changed the way that they are going to gather data from carriers about voice and data usage in the US. To some degree they seem to be throwing in the towel.

I have blogged before about the massive inadequacies of the National Broadband Map. This is an FCC-sponsored effort to show the availability of broadband on a geographic basis. This sounds like a laudable goal, but the carriers decide what information they want to supply to the mapping process, and so the map is full of what can only be described as major lies from the largest carriers. They claim to have broadband where they don’t and at speeds far greater than they actually deliver.

The FCC announced new rules for their data collection process, which is done using FCC Form 477. This revised effort is going to make the FCC’s data gathering more like the process used to collect data for the National Broadband Map. They are no longer going to try to collect actual data speeds in tiers, but instead will collect only advertised speeds – the fastest advertised speed for landline providers and the slowest advertised speed for wireless providers. For the life of me I can’t imagine how this data can be of the slightest use to anybody.
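To see why this rule discards the useful information, here is a sketch of what the revised collection boils down to. The offers below are invented; the point is that actual delivered speeds never enter the calculation.

```python
# Sketch of the revised Form 477 rule as described above: keep only the
# fastest advertised landline speed and the slowest advertised wireless
# speed. The offers below are invented for illustration.
offers = [
    {"type": "landline", "advertised_down_mbps": 1.5},
    {"type": "landline", "advertised_down_mbps": 12.0},  # urban-style ad
    {"type": "wireless", "advertised_down_mbps": 4.0},
    {"type": "wireless", "advertised_down_mbps": 10.0},
]

landline = max(o["advertised_down_mbps"] for o in offers if o["type"] == "landline")
wireless = min(o["advertised_down_mbps"] for o in offers if o["type"] == "wireless")
print(landline, wireless)  # 12.0 4.0 -- delivered speeds never appear
```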

I just recently worked with a client in a small town in Oregon. The incumbent providers there are the biggest telephone company and cable company in the state. In both cases, they advertise the same speeds in this small town that they advertise in Portland. But in this town, as in most of rural America, the actual speeds delivered are far slower. My client reports that the fastest cable modem speeds in the town are 3 – 5 Mbps download and that the fastest DSL is not much over 1.5 Mbps. And yet both carriers advertise products at many times those speeds.

This would just be a big annoyance if it wasn’t for the fact that the FCC and others use the data gathered to talk about what a great job the carriers are doing in this country to supply broadband. I recently saw an announcement that 98% of households now have broadband availability. And since the FCC’s definition of broadband is now a download speed of 4 Mbps and an upload speed of 1 Mbps, this makes it sound like the country’s broadband problems are being solved. But announcements of this sort are based upon lies and exaggerations by the carriers.

And since the whole point of this data gathering effort is to formulate policies to spur the carriers to do better, letting the carriers self-report whatever they want is like asking the fox to count the eggs in the henhouse every morning. There is no practical penalty against a carrier advertising any speed they want or reporting falsely to the FCC. And as the Oregon example shows, it is a lot easier for the incumbent providers to gear all of their advertising in a state around the urban markets. I have no idea if those incumbents in Oregon can actually deliver the advertised speeds in Portland, but I know for a fact that they do not do so outside of Portland.

The FCC is also changing the way that they gather information about VoIP lines. But I think the days when they could gather any meaningful data about business phones in this country are over. There is such a proliferation of IP Centrex and other VoIP technologies that the carriers don’t even know what is being delivered. Consider this:

  • It’s now possible to use one number for a thousand lines in a call center, or instead to give a thousand numbers to one phone.
  • There is a proliferation of resellers in the market who buy numbers and 911 from larger carriers so that they don’t have to become a CLEC. These resellers can then deliver a wide variety of business voice services over anybody’s data connection. They will not be reporting what they are doing to the FCC because most of them are not certified as carriers, but instead rely on the certification of the CLEC that gave them numbers. Nobody in the FCC reporting chain is going to know about or report these kinds of customers and lines.
  • And it gets worse, because I now know of many cases of resellers of these resellers. Literally almost anybody can become a carrier overnight by reselling these services. It’s back to the wild west days we used to see with long distance resale. I’m expecting to go to a telecom convention soon and see the shark-skin suits again.

At Least We are Not Europe


In this country the FCC has undertaken various policy initiatives to promote broadband. However, except for some universal service funding that will bring broadband for the first time to tribal areas and very rural places, these initiatives come with no federal money. And so the real broadband policy in the country is to wait for the private sector to build the infrastructure. The FCC may make proclamations about creating gigabit cities, but it’s completely up to the private sector to make it happen.

And we all know how that is working out. We have a checkerboard of broadband coverage. At one end of the spectrum are the fiber networks – Google and a few others bringing gigabit fiber, Verizon with FiOS, and many smaller communities with fiber built by municipalities or independent telephone companies. In the middle, most metropolitan areas are served by decently fast cable modem service and ADSL2 DSL. And then there are a lot of smaller cities and rural communities where the DSL and the cable modems are a generation or more old and deliver far less bandwidth than advertised. And we still have many rural areas with no broadband at all.

But what we have, by and large, is still better than what has been happening in Europe. And this is because our regulatory policy for last-mile connectivity is mostly hands-off while the European markets are heavily regulated. After the European Union was formed the European regulators went for a solution that promoted low prices. They have required that all large networks be unbundled for the benefit of multiple service providers. This has turned out to be a short-term boon for consumers because it has brought down prices in every market where multiple providers are competing.

But there is a big catch, and the European policy is not going to work out well in the long run. Over the last five years the per capita spending on new telecom infrastructure in Europe has been less than half of what it is in the US, and this is directly due to the unbundling policy. Network owners have no particular incentive to build new networks or upgrade existing ones because any investment hands their competitors the same advantages they get.

In the long run, Europe is going to fall far behind everybody else in fiber deployment because nobody wants to invest in fiber to connect homes and businesses. There have been several major fiber initiatives in Europe in recent years, but these have largely been driven by large cities that are spending the money on the fiber infrastructure, much as is happening with some cities here. But the kinds of companies that ought to be investing in last-mile fiber in Europe, the cable companies and the telcos, are not doing so.

We tried something similar here for a few years. When the Telecommunications Act of 1996 was enacted, one of the major provisions was that the RBOCs (Bell companies) had to unbundle their networks, much as is being done in Europe. This was to spur competition by allowing new competitors to get a start in the business without having to invest in a new network. And this brought short-term benefits to consumers for a while. Companies were leasing RBOC unbundled loops and providing voice and data (DSL at the time) to businesses and residences all over the country.

But the FCC didn’t go the whole way as the regulators did in Europe; going the whole way would have meant also unbundling the large cable networks in this country. The unbundled telecom business plans broke apart after cable modem service began winning the bandwidth war. And of course, there was the telecom crash that killed the larger new competitors. There are still a few companies out there pursuing this unbundled business model, but for the most part it didn’t work. And the reason it didn’t work is that it is a form of arbitrage. The business plan only worked because federal regulators made the RBOCs unbundle their networks and state regulators then set the prices for the network elements low to spur competition. But the services the competitors were able to offer were no better than what the RBOCs could offer on the same networks.

It’s always been clear to me that you can’t build a solid business on arbitrage. A smart provider can take advantage of temporarily low prices to make a quick profit when they find arbitrage, but they must be ready to ditch the business and run when the regulatory rules that created the opportunity change.

And Europe is currently engaged in one gigantic arbitrage situation. There are multiple service providers who are benefiting from low network costs with no burden to make capital investments. Customers there are winning today because competition has lowered prices. But in the long run nobody wins. The same rules that are making prices low today are ensuring that nobody makes any serious investment in building new fiber networks. So the competitors will fight it out on older networks until the day the arbitrage opportunity dies, and then the competitors will all vanish like the wind. We know it will happen because it happened here. The CLECs in this country had tens of millions of customers, and they disappeared from the market and stranded those customers in a very short period of time.

The only policy that is really going to benefit consumers here, or in Europe, is one that fosters the building of state-of-the-art networks. The commercial providers have not stepped up nearly enough in this country and there is still not a lot of fiber built to residences. But in Europe it’s even worse. So, as much as I read about people criticizing the broadband policies in the US, I have to remind myself – at least we are not Europe.


The DSL TV Market


I find it surprising that DSL TV providers have been the fastest growing segment of the cable TV industry, because these companies are delivering TV over the smallest data pipe of any of the comparable technologies. Over the last year the companies using DSL and fiber to deliver cable TV have gained customers while the traditional cable companies have lost them.

Cable TV is delivered over DSL on a bonded pair of telephone wires using either ADSL2 or VDSL. In theory these technologies can deliver speeds up to about 40 Mbps. But depending upon the gauge, the age and the condition of the copper, many actual deployments are closer to 20 Mbps than the theoretical 40 Mbps. The bandwidth left over after the TV signal is what delivers voice and data.

The DSL providers make cable TV work by using a technology called IPTV, which only sends to the home the signals that the customer is asking to see. You can always tell that you are on an IPTV system because of the small pause that occurs every time you change channels.
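For the technically curious, on a multicast IPTV system a channel change is essentially the set-top box leaving one multicast group and joining another, and the pause is the time for the join to take effect and the next video keyframe to arrive. Here is a minimal sketch using standard socket options; the group addresses and port are illustrative, not a real channel lineup.

```python
import socket

# Minimal sketch of an IPTV channel change on a multicast network.
# Group addresses and the port are illustrative, not a real lineup.
def change_channel(sock: socket.socket, old_group: str, new_group: str) -> None:
    iface = socket.inet_aton("0.0.0.0")            # default interface
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP,
                    socket.inet_aton(old_group) + iface)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    socket.inet_aton(new_group) + iface)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", 5004))                              # typical RTP port
# Tune to "channel 1", then change to "channel 2"; the channel-change
# pause happens between the drop and the arrival of new video frames.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                socket.inet_aton("239.0.0.1") + socket.inet_aton("0.0.0.0"))
change_channel(sock, "239.0.0.1", "239.0.0.2")
```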

The DSL cable industry is composed of AT&T U-verse, CenturyLink Prism and a whole slew of smaller telephone companies. Not every telco has taken the bonded DSL path. For example, a number of the mid-sized telcos like Frontier, Fairpoint and TDS have elected to partner with a satellite provider in order to have a TV product in the bundle. But last year TDS ventured out into the DSL TV market in Madison, Wisconsin.

AT&T is by far the most successful DSL TV provider as one would expect from their large customer base. AT&T has made the product available to over 24 million homes. At the end of the first quarter of 2013 they reported having 5 million cable customers on U-verse and 9.1 million data customers.

The biggest problem with using DSL is the distance limitation. The speeds on DSL drop significantly with distance, so customers have to be on a relatively short copper path in order for it to work. The DSL that AT&T is using can support the U-verse product up to about 3,500 feet on a good single copper pair and up to 5,500 feet using two bonded copper pairs. And the key word in that description is good copper, because older copper and copper with problems will degrade the speed of the product significantly.
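Those reach figures are easy to encode as a quick eligibility check. The 3,500/5,500-foot limits come straight from the paragraph above; the penalty factor for degraded copper is my own placeholder.

```python
# Reach limits quoted above: ~3,500 ft on one good pair, ~5,500 ft on
# two bonded pairs. The 0.7 degraded-copper factor is a placeholder.
def uverse_eligible(loop_feet: float, bonded_pairs: int,
                    good_copper: bool = True) -> bool:
    limit_feet = 5500.0 if bonded_pairs >= 2 else 3500.0
    if not good_copper:
        limit_feet *= 0.7
    return loop_feet <= limit_feet

print(uverse_eligible(3000, 1))                     # True
print(uverse_eligible(5000, 2))                     # True
print(uverse_eligible(5000, 2, good_copper=False))  # False (~3,850 ft limit)
```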

I really don’t know who is in second place. CenturyLink announced that they had 120,000 TV customers on their Prism product at the end of the first quarter of 2013. There may be some other telcos out there with more DSL cable customers. But CenturyLink is fairly new to the product line, having launched it just a few years ago. They still only offer it in a few markets but are adding new markets all of the time. So if they are not in second place they soon will be.

In researching this article I came across some web sites that carry customer complaints about Prism. Look at the Yelp pages for CenturyLink in Las Vegas. I’ve always suspected that unhappy customers are more likely to post an on-line review than happy ones, but some of the stories there are extraordinarily bad. Obviously CenturyLink is having some growing pains and has a serious disconnect between their marketing and sales departments and their customer service. But some of the policies described there, such as charging people a large disconnect fee even though there is no contract, are surprising in a competitive environment. And yet, even with these kinds of issues, the company has added over 100,000 customers in just a few years.

I have to wonder how this industry segment is going to handle where the cable business is going. How much can they squeeze out of a 20 Mbps data pipe when customers want to watch several TVs at the same time, record shows while watching another show, and also stream video to tablets and laptops, all simultaneously? Yesterday I noted the new trend in large TVs of splitting the screen into four parts, each showing something different. Most reviews of the performance of TV over DSL are pretty good, but how will DSL handle the guy who wants to watch four HD football games at the same time while surfing the internet?
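A back-of-envelope budget shows the squeeze. The per-stream rates below are rough 2013-era figures I am assuming, not a specification.

```python
# Rough budget for a ~20 Mbps bonded-DSL pipe. The 6 Mbps HD figure
# and 3 Mbps data floor are assumed ballpark numbers, not a spec.
pipe_mbps = 20.0
hd_stream_mbps = 6.0     # one MPEG-4 HD channel
data_floor_mbps = 3.0    # reserve for internet traffic

for hd_streams in range(4):
    left = pipe_mbps - hd_streams * hd_stream_mbps - data_floor_mbps
    print(f"{hd_streams} HD streams -> {left:+5.1f} Mbps to spare")
# 3 HD streams already leaves -1.0 Mbps: four HD games won't fit.
```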


Is There any Life Left in Copper?


Copper is still a very relevant technology today; on a global scale nearly two-thirds of all broadband subscribers are still served by copper. That percentage is smaller in the US, but this country has a far more widely deployed cable TV system than most of the rest of the world.

The most widely deployed DSL technologies today are ADSL2 and VDSL. In theory these technologies can reach speeds up to about 40 Mbps. But depending upon the gauge, the age and the condition of the copper, many actual deployments are closer to 20 Mbps than the theoretical 40 Mbps.

ADSL2 and VDSL have been widely deployed by AT&T in its U-verse product, which serves over 7 million data customers and over 4.5 million cable customers. AT&T has made the product available to over 24 million homes. AT&T can support the product up to about 3,500 feet on a good single copper pair and up to 5,500 feet using two bonded copper pairs.

And ADSL2 is a pretty decent product. It can deliver IPTV and still support an okay data pipe. However, as the cable companies find ways to get more bandwidth out of their coaxial cable and as new companies deploy fiber, these DSL technologies are going to fall behind the competition again.

So what is out there that might resurrect copper and make speeds faster than ADSL2? Not too long ago I wrote a blog about G.Fast, which is Alcatel-Lucent’s attempt to find a way to get more speed out of legacy copper networks. In recent field tests ALU achieved a maximum speed of 1.1 Gbps over 70 meters and 800 Mbps over 100 meters on brand-new copper. On older copper the speed dropped to 500 Mbps over 100 meters.

However, the G.Fast distance limitations are far shorter than those of ADSL2, and G.Fast is really more of a drop technology than a last-mile technology; it would require a telco like AT&T to build a lot more fiber to get even closer to houses. You have to wonder if it makes any sense to rebuild the copper network to get up to 500 Mbps out of copper when fiber could deliver many gigabits.

There are other technologies that have been announced for copper. Late last year Genesis Technical Systems announced a scheme to get 400 Mbps out of copper using a technology they are calling DSL Rings. This technology would somehow tie 2 to 15 homes into a ring and bridge them with copper. Details of how the technology works are still a little sketchy.

In 2011 the Chinese vendor Huawei announced a new technology that will push up to 1 gigabit over 100 meters. This sounds very similar to G.Fast, and it sounds like a way to use existing copper within a home rather than rewiring.

There is one new technology that is finally getting wider use: bonded VDSL pairs that use vectoring. Vectoring is a noise cancellation technology that works much like noise-cancelling headphones, eliminating most of the noise (crosstalk) between bonded pairs of copper. Alcatel-Lucent hit the market in late 2011 with bonded-pair VDSL2 that can deliver up to 100 Mbps. However, in real deployments speeds are reported to be 50 Mbps to 60 Mbps on older copper. That is probably enough speed to give DSL another decade, although doing so requires a full replacement of older-technology DSL gear with VDSL2. One has to wonder how many times the telcos will keep upgrading their copper electronics to get a little more speed rather than taking the leap to fiber like Verizon did.
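Conceptually, vectoring treats the binder like a small linear system: if you can measure the crosstalk between pairs, you can pre-distort the transmitted signals so the crosstalk cancels at the receivers. Here is a toy two-pair illustration; the channel numbers are invented.

```python
import numpy as np

# Toy linear model of vectoring on two copper pairs. Received signals
# are y = H @ x, where the off-diagonal entries of H are crosstalk.
# Vectoring precodes with H^-1 so each pair sees only its own signal.
H = np.array([[1.00, 0.15],
              [0.12, 1.00]])           # invented crosstalk coefficients

x = np.array([1.0, -1.0])              # intended symbols for each line

naive = H @ x                          # no vectoring: interference
vectored = H @ (np.linalg.inv(H) @ x)  # precoded: crosstalk cancelled

print(naive)     # [ 0.85 -0.88]  distorted by the neighboring pair
print(vectored)  # [ 1.   -1.  ]  clean
```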

One only has to look at the growth rate of data usage at homes to ask how long copper can remain relevant. Within a few short decades we have moved from homes that could get by on dial-up to homes that find a 20 Mbps connection too slow. Looking just a few years forward we see the continued growth of video sharing and a lot of new traffic from cellular femtocells and the Internet of Things. It won’t be long until people are bemoaning the inadequacy of their 50 Mbps connections. That day is coming, and it probably is not more than a decade away.
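The decade estimate is easy to sanity-check with compound growth. The 20% annual growth rate is my assumption; pick your own rate and the arithmetic is the same.

```python
# How long until a home that needs 20 Mbps today outgrows 50 Mbps,
# assuming demand compounds at 20% a year (an assumed growth rate)?
need_mbps = 20.0
annual_growth = 1.2
years = 0
while need_mbps <= 50.0:
    need_mbps *= annual_growth
    years += 1
print(years)  # 6 -- comfortably inside the decade estimated above
```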


The National Broadband Map


Last Thursday the FCC voted to take over the data collection for the National Broadband Map. The Map was created as part of the funding for broadband supplied a few years ago by the Stimulus package. The Map was created and administered by the NTIA (National Telecommunications and Information Administration) with input from the states, and that funding is now running out.

Back when the Map was suggested I thought the concept was a good one. But as soon as I saw that the data gathered for the Map was to be self-reported by carriers, I knew that there were going to be problems. And sure enough, when the first generation Map was produced it was full of errors – big errors.

I work with a lot of rural communities, and I have reviewed the Map in many areas of the country and compared it to the actual deployment of broadband. Some communities have developed their own maps – and they did it the hard way. They sent people around to see where broadband was available. A lot of this can be done by somebody who knows how to look up at the cables. It’s easy to know where cable modems are available by the presence of coaxial cable on the poles. And rural DSL generally has repeaters that can be spotted by the eagle-eyed observer. And it’s not hard to look at your cell phone to see how many bars of data you can get. But the best test of where broadband is available is knocking on doors and asking people what they are able to buy.

As an example of what I found, let me talk about the issues in just one county in Minnesota. The Map showed that most of the County had landline broadband availability. The County is very typical of rural areas: the County Seat is the largest town, there are half a dozen much smaller towns, and everything else is rural. A large chunk of the rural area is a national forest where very few people live. Most people in the rural areas live in close proximity to the roads.

The reality in this County is that even in several of the smaller towns the DSL is so slow that it is hard to think of it as broadband. It’s more like dial-up plus. And there was no cable modem service from the cable company outside of the County Seat. And as is typical with DSL, as one goes outside of the towns the quality of the DSL quickly degrades with distance from the DSL hub. We’ve always called this the donut effect: large areas with no broadband surrounding rural towns that have DSL and/or cable modems.

The Map also showed that almost every populated area of this Minnesota County had 3G wireless data available. It’s a very hilly and rugged place and probably half of the county by area can’t even get cellular voice calls, let alone data. But even where voice is available there are many areas that can’t get cellular data. The Map was just wrong about this.

Everywhere that I have helped communities look at the Map we have seen the same thing. The Map shows broadband that isn’t there. It shows cellular data coverage that isn’t there. And it often shows providers supposedly serving the counties that nobody has ever heard of.

And this is not true for just rural counties. I have helped two suburban counties near large cities look at the Map and they found the same situation. The Map showed areas that are supposed to have broadband where their citizens still have dial-up or satellite. And cellular coverage was exaggerated on the Map.

An obvious question is: why does this matter? The National Broadband Map has only been around for a few years, and anybody who has ever looked at it knows it is full of inaccuracies. The problem is that the federal government now relies on the Map for several purposes. For instance, if you want to get federal money by loan or grant to deploy rural broadband, the assumption is that the Map is good. It is then your responsibility to show where the Map is wrong.

And the FCC uses the Map when it talks about the availability of Broadband in rural America. The Map has been overlaid with Census data to count how many households can get broadband. This produces a very distorted picture of who has broadband. There are pockets of people without broadband in even some of the most populated counties in the country and the Map simply misses them. And in rural areas the Map can be very wrong.

The FCC just took over responsibility for the Map. From my perspective they either need to do it right or get out of the mapping business. It’s not easy to get it right, but it can be done. One of the easiest steps they could take would be to give counties the authority to clean up the maps for their areas. Many of them would be glad to do that. And broadband availability is not static – areas are gaining or losing broadband all the time. If the FCC won’t take the time to get the Map right they should just let it die as another impractical idea.


Make it Faster


Whenever I look at my clients’ data products I almost always have the same advice – make it faster. I am constantly surprised to find companies that deliver small-bandwidth data products when their networks are capable of going much faster. I have come to the conclusion that you should give customers as much bandwidth as you can deliver within your technical constraints.

I know that networks are operated largely by engineers and technicians and very often I hear the engineers warn management against increasing speeds. They typically are worried that faster speeds mean that customers will use more bandwidth. They worry that will mean more costs with no additional revenue to pay for the extra bandwidth.

But the experience in the industry is that customers don’t use more data when they get more speeds, at least not right away. Customers do not change their behavior after they get faster data – they just keep doing the same things they were doing before, only faster.

Of course, over time, internet data usage is steadily increasing on every network as customers watch more and more programming on the web. But they are going to increase usage regardless of the speed you deliver to them as long as that speed is fast enough to stream video. Going faster just means they can start watching content sooner without having to worry about streaming glitches.

The engineers do have one valid point that must be taken into consideration: many networks have chokepoints. A chokepoint is any place in a network that can restrict the flow of data to customers. Chokepoints can be at neighborhood nodes, within your network backbone, at devices like routers, or on the connection from your company to the Internet backbone. If your network is getting close to hitting a chokepoint you need to fix the issue, because data usage is going to grow independently of the speeds you give your customers. When I hear worry about chokepoints it tells me that the network needs upgrades, probably sooner rather than later.
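A quick way to see whether a node is near a chokepoint is to compare its uplink capacity against peak customer demand. All of the numbers here are illustrative, including the 10% peak-utilization assumption.

```python
# Back-of-envelope chokepoint check for a neighborhood node.
# Every figure below, including 10% peak utilization, is illustrative.
node_uplink_mbps = 1000.0    # capacity feeding the node
customers = 400
sold_speed_mbps = 25.0
peak_utilization = 0.10      # fraction of sold speed in use at peak

peak_demand_mbps = customers * sold_speed_mbps * peak_utilization
print(peak_demand_mbps >= node_uplink_mbps)  # True -- time to upgrade
```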

Historically, telecom companies were very stingy with data speeds. The first generations of DSL didn’t deliver speeds that were much faster than dial-up, and even today there are many markets that still offer DSL with download speeds of 1 Mbps. Then cable modems came along and upped speeds a little, with the first generation offering speeds up to 3 Mbps. And over time the telcos and the cable companies increased data speeds a little, but not a lot. They engaged in oligopoly competition rather than in product competition. There are many notorious quotes from the presidents of large cable companies saying that their customers don’t need more speed.

But then Verizon built FiOS and changed the equation. Verizon’s lowest speed product when they launched service was 20 Mbps, and it was an honest speed, meaning that it delivered as advertised. Many of the DSL and cable modem products at that time were hyped at speeds faster than the networks could deliver. Cable modems were particularly susceptible to slowing down to a crawl at the busiest times of the evening.

Over time Verizon kept increasing their speeds and on the east coast they pushed the cable companies to do the same. Mediacom in New York City was the first cable company to announce a 50 Mbps data product, and today most urban cable companies offer a 100 Mbps product. However, the dirty secret cable companies don’t want to tell you is that they can offer that product by giving prioritization to those customers, which means that everybody else gets degraded a little bit.

And then came Google in Kansas City, who set the new bar at 1 Gbps. Service providers all over the country are now finding ways to offer 1 Gbps service, even if it’s just to a few customers.

I am always surprised when I find a company that operates a fiber network but does not offer fast speeds. I still find fiber networks all the time that have products at 5 Mbps and 10 Mbps. In all of the fiber-to-the-premises technologies, the network is set up to deliver at least 100 Mbps to every customer, and the network provider chokes the speeds down to what is sold to customers. It literally takes a flick of a switch for a fiber provider to change the speed to a home or business from 10 Mbps to 100 Mbps.
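That "flick of a switch" is typically just swapping the rate-limit profile assigned to the customer's ONT. The sketch below is hypothetical – the profile names and command format are invented, not any vendor's real API – but it captures how little work the change involves.

```python
# Hypothetical provisioning sketch: a customer's speed on FTTP is just
# a rate-limit profile on their ONT. Names and the command format are
# invented for illustration, not a real vendor interface.
PROFILES_MBPS = {"bronze": 10, "silver": 100, "gold": 1000}

def set_customer_speed(ont_id: str, profile: str) -> str:
    down_mbps = PROFILES_MBPS[profile]
    # A real system would push this to the OLT via its management
    # interface; here we just build the command we would send.
    return f"set-rate-limit ont={ont_id} downstream={down_mbps}mbps"

print(set_customer_speed("ONT-00123", "silver"))
# set-rate-limit ont=ONT-00123 downstream=100mbps
```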

And so I tell these operators to make it faster. If you own a fiber network you have one major technological advantage over any competition, which is speed. I just can’t understand why a fiber network owner would offer speeds that are in direct competition with the DSL and cable modems in their market when they are capable of leaping far above them.

But even if you are using copper or coax you need to increase speeds to customers whenever you can. Customers want more speed and you will always be keeping the pressure on your competition.
