Categories: Current News, Improving Your Business, Regulation - What is it Good For?

Another Idea for Rural Broadband

A rural area west of Route 41 and Lowell, Indiana. (Photo credit: Wikipedia)

The Fiber-to-the-Home Council (FTTHC) has asked the FCC to consider a new way to fund rural broadband. Their proposal asks the FCC to make unused portions of the Universal Service Fund available as grants to build gigabit fiber networks. The grants would be awarded under a competitive process, meaning that the networks that could be built most efficiently would go to the top of the grant list.

It’s an intriguing idea. I have often talked in this blog about the state of broadband in rural America. Consider some of the following rural broadband issues:

  • About a year and a half ago the FCC estimated that there were still about 14 million rural households with no access to any kind of terrestrial broadband. Some projects in the last year now serve some of these customers, but the number is probably not much smaller.
  • In the FCC’s last three Broadband Progress Reports the agency said that incumbent carriers were not upgrading to the FCC’s minimum definition of broadband fast enough. Those speeds are currently 4 Mbps download and 1 Mbps upload. And the FCC has promised that every few years they will revisit that definition of broadband, and supposedly will increase it over time.
  • There is often a big difference between advertised speeds and actual speeds. A 4 Mbps download is barely enough bandwidth for a household to participate in today’s web, and if the actual speeds delivered are lower than that it’s hard to call the service broadband at all.
  • The availability of rural broadband depends upon where a customer lives. Customers in a large enough rural town might have broadband available from the telco or the cable company, and sometimes from both. But cable networks rarely extend much past the borders of these small towns, and DSL rarely carries more than a mile or two from the center of town. So there are many rural counties that have some broadband in the towns but practically none outside them.
  • Most urban areas now have cable modem service that is advertised at between 10 Mbps and 20 Mbps. And urban broadband keeps improving. Rural areas are permanently falling behind and the gap is going to widen over time. This has huge implications for the long-term economic viability of rural America.

Of course, some organizations have opposed this idea, mostly organizations funded by the incumbent telcos and cable companies. This always has me scratching my head. For the most part the large telcos and cable companies have ignored rural America for one or even two decades. They have not poured new capital into these areas to bring them up to the needed speeds, and they spend as little as possible to keep these areas operating. I contrast this to the small independent telcos, who generally do an excellent job in rural America, but there are still large swaths of rural America that have been largely ignored. And even while ignoring these areas the large telcos want to protect their revenue streams.

I guess that is good business, but it is poor policy. In my mind broadband is basic infrastructure, and homes and businesses need adequate broadband in order to take part in modern society. And this is about to become much more important as we move into the Internet of Things. It’s one thing for a rural home to lack the bandwidth to watch streaming video. But when our healthcare is being monitored over the Internet, broadband becomes an essential component of every home’s life.

The rural broadband crisis is already here and the broadband gap is already unacceptable. The FTTHC’s proposal is creative and doesn’t ask for any additional government funds. They are asking the FCC to make an investment today in rural areas as a down-payment to help those areas stay viable as places to live in the future. I would assume that any awards of funds would also expect the rural communities to chip in substantial matching funds, so all that is really being asked is to help these communities help themselves. I think it is an idea worthy of FCC consideration.

Categories: Current News, Regulation - What is it Good For?

The FCC’s Data Collection Effort

The FCC’s character for children, “Broadband” (Photo credit: Wikipedia)

The FCC just changed the way that they are going to gather data from carriers about voice and data usage in the US. To some degree they seem to be throwing in the towel.

I have blogged before about the massive inadequacies of the National Broadband Map. This is an FCC-sponsored effort to show the availability of broadband on a geographic basis. This sounds like a laudable goal, but the carriers decide what information they want to supply to the mapping process, and so the map is full of what can only be described as major lies from the largest carriers. They claim to have broadband where they don’t and at speeds far greater than they actually deliver.

The FCC announced new rules for the data collection process that is done using FCC Form 477. This revised effort is going to make the FCC’s data gathering more like the process used to collect data for the National Broadband Map. They are no longer going to try to collect actual data speeds in tiers, but instead will collect only advertised speeds – the fastest advertised speed for landline providers and the slowest advertised speed for wireless providers. For the life of me I can’t imagine how this data can be of the slightest use to anybody.

I just recently worked with a client in a small town in Oregon. The incumbent providers there are the biggest telephone company and cable company in the state. In both cases, they advertise the same speeds in this small town that they advertise in Portland. But in this town, as in most of rural America, the actual speeds delivered are far slower. They think the fastest cable modem speeds in the town are 3 – 5 Mbps download and the fastest DSL is not much over 1.5 Mbps. And yet both carriers advertise products at many times those speeds.

This would just be a big annoyance if it wasn’t for the fact that the FCC and others use the data gathered to talk about what a great job the carriers are doing in this country to supply broadband. I recently saw an announcement that 98% of households now have broadband availability. And since the FCC’s definition of broadband is now a download speed of 4 Mbps and an upload speed of 1 Mbps, this makes it sound like the country’s broadband problems are being solved. But announcements of this sort are based upon lies and exaggerations by the carriers.

And since the whole point of this data gathering effort is to formulate policies to spur the carriers to do better, letting the carriers self-report whatever they want is like asking the fox to go count the eggs in the henhouse every morning. There is no practical penalty against a carrier advertising any speed they want or reporting falsely to the FCC. And it’s a lot easier, as it is with the Oregon example, for the incumbent providers to gear all of their advertising in a state around the urban markets. I have no idea if those incumbents in Oregon can actually deliver the advertised speeds in Portland, but I know for a fact that they do not do so outside of Portland.

The FCC is also changing the way that they gather information about VoIP lines. But I think the days when they could gather meaningful data about business phones in this country are over. There is such a proliferation of IP Centrex and other VoIP technologies that the carriers don’t even know what is being delivered. Consider this:

  • It’s now possible to use one number for a thousand lines in a call center or instead to give a thousand numbers to one phone.
  • There is a proliferation of resellers in the market who buy numbers and 911 from larger carriers so that they don’t have to become a CLEC. These resellers can then deliver a wide variety of business voice services over anybody’s data connection. They will not be reporting what they are doing to the FCC because most of them are not certified as carriers, but instead rely on the certification of the CLEC that gave them numbers. Nobody in the FCC reporting chain is going to know about or report these kinds of customers and lines. And it gets worse, because I now know of many cases of resellers of these resellers. Literally almost anybody can become a carrier overnight by reselling these services. It’s back to the wild west days we used to see with long distance resale. I’m expecting to go to a telecom convention soon and see the shark-skin suits again.
Categories: Current News, Regulation - What is it Good For?

At Least We are Not Europe

Europe Simulator (Photo credit: wigu)

In this country the FCC has undertaken various policy initiatives to promote broadband. However, except for some universal service funding that will bring broadband for the first time to tribal areas and very rural places, these initiatives come with no federal money. And so the real broadband policy in the country is to wait for the private sector to build the infrastructure. The FCC may make proclamations about creating gigabit cities, but it’s completely up to the private sector to make it happen.

And we all know how that is working out. We have a checkerboard of broadband coverage. At one end of the spectrum are the fiber networks – Google and a few others bringing gigabit fiber, Verizon with FiOS, and many smaller communities with fiber built by municipalities or independent telephone companies. In the middle most metropolitan areas are served by decently fast cable modem service and ADSL2 DSL. And then there are a lot of smaller cities and rural communities where the DSL and the cable modems are a generation or more old and which deliver far less bandwidth than advertised. And we have many rural areas still with no broadband.

But what we have, by and large, is still better than what has been happening in Europe. And this is because our regulatory policy for last-mile connectivity is mostly hands-off while the European markets are heavily regulated. After the European Union was formed the European regulators went for a solution that promoted low prices. They have required that all large networks be unbundled for the benefit of multiple service providers. This has turned out to be a short-term boon for consumers because it has brought down prices in every market where multiple providers are competing.

But there is a big catch, and the European policy is not going to work out well in the long run. Over the last five years the per capita spending on new telecom infrastructure in Europe has been less than half of what it is in the US, and this is directly due to the unbundling policy. Network owners have no particular incentive to build new networks or upgrade existing ones because doing so hands their competitors the same advantages they get.

In the long run, Europe is going to fall far behind everybody else in fiber deployment because nobody wants to invest in fiber to connect homes and businesses. There have been several major fiber initiatives in Europe in recent years, but these have largely been driven by large cities that are spending the money on the fiber infrastructure, much as is happening with some cities here. But the kinds of companies that ought to be investing in last-mile fiber in Europe, the cable companies and the telcos, are not doing so.

We tried something similar here for a few years. When the Telecommunications Act of 1996 was enacted, one of the major provisions was that the RBOCs (Bell companies) had to unbundle their networks, much as is being done in Europe. This was to spur competition by allowing new competitors to get a start in the business without having to invest in a new network. And this brought short-term benefits to consumers for a while. Companies were leasing RBOC unbundled loops and providing voice and data (DSL at the time) to businesses and residences all over the country.

But the FCC didn’t go the whole way like they did in Europe or else they would have also unbundled the large cable networks in this country. The unbundled telecom network business plans broke apart after cable modem service began winning the bandwidth war. And of course, there was the telecom crash that killed the larger new competitors. There are still a few companies out there pursuing this unbundled business model, but for the most part it didn’t work. And the reason it didn’t work is that it is a form of arbitrage. The business plan only worked because federal regulators made the RBOCs unbundle their networks and then state regulators set the prices for the network elements low to spur competition. But the services the competitors were able to offer were no better than what the RBOCs could offer on the same networks.

It’s always been clear to me that you can’t build a solid business on arbitrage. A smart provider can take advantage of temporarily low prices to make a quick profit when they find arbitrage, but they must be ready to ditch the business and run when the regulatory rules that created the opportunity change.

And Europe is currently engaged in one gigantic arbitrage situation. There are multiple service providers benefitting from low network costs with no burden to make capital investments. Customers there are winning today because competition has driven prices down. But in the long run nobody wins. The same rules that are making prices low today are ensuring that nobody makes any serious investment in building new fiber networks. So the competitors will fight it out on older networks until the day the arbitrage opportunity dies, and then the competitors will all vanish like the wind. We know it will happen because it happened here. The CLECs in this country had tens of millions of customers, and they disappeared from the market and stranded those customers in a very short period of time.

The only policy that is really going to benefit consumers here, or in Europe, is one that fosters the building of state-of-the-art networks. The commercial providers have not stepped up nearly enough in this country and there is still not a lot of fiber built to residences. But in Europe it’s even worse. So, as much as I read about people criticizing the broadband policies in the US, I have to remind myself – at least we are not Europe.

Categories: The Industry, What Customers Want

The DSL TV Market

CenturyLink Contingent (Photo credit: sea turtle)

I find it surprising that DSL TV providers have been the fastest growing segment of the cable TV industry. My surprise is due to the fact that these companies deliver TV over the smallest data pipe of any comparable technology. Over the last year the companies using DSL and fiber to deliver cable TV have grown in customers while the traditional cable companies have lost customers.

Cable TV is delivered over DSL on a bonded pair of telephone wires running either ADSL2 or VDSL. In theory these technologies can deliver speeds up to about 40 Mbps. But depending upon the gauge, age and condition of the copper, many actual deployments are closer to 20 Mbps than the theoretical 40 Mbps. Whatever bandwidth is left over after the TV signal is used to deliver voice and data.

The DSL providers make cable TV work by using a technology called IPTV, which only sends to the home the signals the customer is asking to see. You can always tell you are on an IPTV system by the small pause that occurs every time you change channels.
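To make that mechanic concrete, here is a minimal sketch of an IPTV channel change. The channel lineup and multicast addresses are hypothetical, and real set-top boxes run vendor-specific software; this just illustrates the leave/join round trip and the wait for a keyframe that together cause the pause:

```python
# A toy model of an IPTV channel change (illustrative only).
import time

# Hypothetical channel-to-multicast-group lineup.
CHANNEL_GROUPS = {
    "ESPN": "239.1.1.10",
    "CNN": "239.1.1.11",
}

class SetTopBox:
    def __init__(self):
        self.current_group = None

    def change_channel(self, channel):
        new_group = CHANNEL_GROUPS[channel]
        if self.current_group:
            print(f"IGMP leave {self.current_group}")  # stop the old stream
        print(f"IGMP join {new_group}")                # request the new stream
        self.current_group = new_group
        # The visible pause: the decoder can't display video until the next
        # keyframe arrives in the new stream, often half a second or more.
        time.sleep(0.5)
        print(f"Now decoding {channel}")

stb = SetTopBox()
stb.change_channel("ESPN")
stb.change_channel("CNN")
```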

The DSL cable industry is composed of AT&T U-verse, CenturyLink Prism and a whole slew of smaller telephone companies. Not every telco has taken the bonded DSL path. For example, a number of the mid-sized telcos like Frontier, Fairpoint and TDS have elected to partner with a satellite provider in order to have a TV product in the bundle. But last year TDS ventured out into the DSL TV market in Madison, Wisconsin.

AT&T is by far the most successful DSL TV provider as one would expect from their large customer base. AT&T has made the product available to over 24 million homes. At the end of the first quarter of 2013 they reported having 5 million cable customers on U-verse and 9.1 million data customers.

The biggest problem with using DSL is the distance limitation. The speeds on DSL drop significantly with distance, so customers have to be on a relatively short copper path for it to work. The DSL that AT&T is using can support the U-verse product up to about 3,500 feet on a good single copper pair and up to 5,500 feet using two bonded copper pairs. And the key word in that description is good copper, because older copper and copper with problems will degrade the speed of the product significantly.

I really don’t know who is in second place. CenturyLink announced that they had 120,000 TV customers on their Prism product at the end of the first quarter of 2013. There may be other telcos out there with more DSL cable customers. But CenturyLink is fairly new to the product line, having launched it just a few years ago. They still only offer it in a few markets but are adding new markets all the time. So if they are not in second place they soon will be.

In researching this article I came across some web sites that carry customer complaints about Prism. Look at the Yelp pages for CenturyLink in Las Vegas. I’ve always suspected that unhappy customers are more likely to post an on-line review than happy ones, but some of the stories there are extraordinarily bad. Obviously CenturyLink is having some growing pains and has a serious disconnect between their marketing and sales departments and their customer service. But some of the policies described there, such as charging people a large disconnect fee even though there is no contract, are surprising in a competitive environment. And yet, even with these kinds of issues, the company has added over 100,000 customers in just a few years.

I have to wonder how this industry segment is going to handle where the cable business is going. How much can they squeeze out of a 20 Mbps data pipe when customers want to watch several TVs at the same time, record shows while watching another, and stream video to tablets and laptops, all simultaneously? Yesterday I noted the new trend in large TVs of splitting the screen into four parts, each showing something different. Most reviews of the performance of TV over DSL are pretty good, but how will DSL handle the guy who wants to watch four HD football games at the same time while surfing the internet? The arithmetic below shows how quickly that pipe fills up.
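Here is the back-of-the-envelope version, using per-stream bitrates that are my own illustrative assumptions rather than measured figures:

```python
# Rough bandwidth budget for a ~20 Mbps bonded DSL pipe (assumed numbers).

HD_STREAM_MBPS = 6.0    # assumed bitrate of one HD broadcast stream
WEB_MBPS = 1.0          # assumed allowance for web surfing
PIPE_MBPS = 20.0        # realistic bonded-pair ADSL2/VDSL throughput

demand = 4 * HD_STREAM_MBPS + WEB_MBPS   # four HD games plus browsing
print(f"Demand: {demand:.0f} Mbps against a {PIPE_MBPS:.0f} Mbps pipe "
      f"-> short by {demand - PIPE_MBPS:.0f} Mbps")
# Demand: 25 Mbps against a 20 Mbps pipe -> short by 5 Mbps,
# before voice or any other household traffic is even counted.
```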

Categories: Improving Your Business, Technology

Do You Understand Your Chokepoints?

Almost every network has chokepoints. A chokepoint is some place in the network that restricts data flow and that degrades the performance of the network beyond the chokepoint. In today’s environment where everybody is trying to coax more speed out of their network these chokepoints are becoming more obvious. Let me look at the chokepoints throughout the network, starting at the customer premise.

Many don’t think of the premise as a chokepoint, but if you are trying to deliver a large amount of data, the wiring and other infrastructure at the location will restrict it. We hear constantly today about gigabit networks, but there are actually very few wiring schemes available that will deliver a gigabit of data for more than a very short distance. Even category 5 and 6 cabling is only rated for gigabit speeds over limited runs. There is no WiFi on the market today that can operate at a gigabit. And technologies like HPNA and MoCA are not fast enough to carry a gigabit.

But the premise wiring and customer electronics can create a chokepoint even at slower speeds. It is a very difficult challenge to bring speeds of 100 Mbps to large premises like schools and hospitals. One can deliver fast data to the premise, but once the data is put onto wires of any kind the performance decays with distance, and generally a lot faster than you would think. I look at the recently announced federal goal of bringing a gigabit to every school in the country and I wonder how they plan to move that gigabit around the school. The answer mostly is that with today’s wiring and electronics, they won’t. They will be able to deliver a decent percentage of the gigabit to classrooms, but the chokepoint of wiring is going to eat up a lot of the bandwidth.

The next chokepoint for most technologies is the neighborhood node. Cable TV HFC networks, fiber PON networks, cellular data networks and DSL networks all rely on neighborhood nodes of some kind, a node being the place where the network hands off the data signal to the last mile. And these nodes are often chokepoints due to what is called oversubscription. In the ideal network every customer could use all of the bandwidth they have been sold, all at the same time. But very few network operators want to build that network because of the cost, and so carriers oversell bandwidth to customers.

Oversubscription is the practice of selling the same bandwidth to multiple customers, since we know statistically that only a few customers in a given node will be making heavy use of that bandwidth at the same time. The vast majority of the time, the bandwidth will be available to whoever wants to use it. The sketch below shows the basic arithmetic.
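Here is a minimal sketch of that arithmetic, with illustrative numbers of my own rather than any carrier's engineering figures:

```python
# Oversubscription arithmetic at a neighborhood node (assumed numbers).

NODE_BACKHAUL_MBPS = 1000   # assumed 1 Gbps feed to the node
CUSTOMERS = 200             # assumed customers on the node
SPEED_SOLD_MBPS = 50        # assumed advertised speed per customer

sold = CUSTOMERS * SPEED_SOLD_MBPS
print(f"Sold: {sold} Mbps on a {NODE_BACKHAUL_MBPS} Mbps feed "
      f"-> {sold // NODE_BACKHAUL_MBPS}:1 oversubscription")

# The node only congests when concurrent demand exceeds the backhaul.
# If the average customer draws 3 Mbps at the evening peak (assumed):
peak = CUSTOMERS * 3
print(f"Peak demand {peak} Mbps -> "
      f"{'congested' if peak > NODE_BACKHAUL_MBPS else 'still fine'}")
```

The ratio looks alarming on paper, but the node holds up as long as actual peak demand stays under the backhaul; trouble starts when success loads more customers, or heavier usage, onto the node than it was designed for.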

We are all familiar with the chokepoints that occur in oversubscribed networks. Cable modem networks have been infamous for years for bogging down each evening when everybody uses the network at the same time. And we are also aware of how cell phone and other networks get clogged and become unavailable in times of emergency. These are all chokepoints caused by oversubscription at the node. Oversubscription is not a bad thing when done well, but many networks end up, through success, with more customers per node than they were originally designed for.

The next chokepoint in many networks is the backbone fiber electronics that deliver bandwidth from the hub to the nodes. Data traffic has grown at a very rapid pace over the last decade and it is not unusual to find backbone data feeds where today’s usage exceeds the original design parameters. Upgrading the electronics is often costly because in some networks you have to replace the electronics at every node in order to fix the ones that are full.

Another chokepoint in the network can be hub electronics. It’s possible to have routers and data switches that are unable to smoothly handle all of the data flow and routing needs at the peak times.

Finally, there can be a chokepoint in the data pipe that leaves a network and connects to the Internet. It is not unusual to find Internet pipes that hit capacity at peak usage times of the day which then slows down data usage for everybody on the network.

I have seen networks that have almost all of these chokepoints and I’ve seen other networks that have almost no chokepoints. Keeping a network ahead of the constantly growing demand for data usage is not cheap. But network operators have to realize that customers recognize when they are getting shortchanged and they don’t like it. The customer who wants to download a movie at 8:00 PM doesn’t care why your network is going slow because they believe they have paid you for the right to get that movie when they want it.

Categories: Technology

Is There any Life Left in Copper?

RG-59 coaxial cable. A: Plastic outer insulation. B: Copper-clad aluminium braid shield conductor. C: Dielectric. D: Central conductor (copper-clad steel). (Photo credit: Wikipedia)

Copper is still a very relevant technology today, and when looked at on a global scale nearly 2/3 of all broadband subscribers are still served by copper. That percentage is smaller in the US, but this country has a far more widely deployed cable TV system than most of the rest of the world.

The most widely deployed DSL technologies today are ADSL2 and VDSL. In theory these technologies can reach speeds up to about 40 Mbps. But depending upon the gauge, age and condition of the copper, many actual deployments are closer to 20 Mbps than the theoretical 40 Mbps.

ADSL2 and VDSL have been widely deployed by AT&T in its U-verse product, which serves over 7 million data customers and over 4.5 million cable customers. AT&T has made the product available to over 24 million homes. AT&T can support the product up to about 3,500 feet on a good single copper pair and up to 5,500 feet using two bonded copper pairs.

And ADSL2 is a pretty decent product. It can deliver IPTV and still support an okay data pipe. However, as the cable companies are finding ways to get more bandwidth out of their coaxial cable and as new companies are deploying fiber, these DSL technologies are going to again fall behind the competition.

So what is out there that might resurrect copper and make speeds faster than ADSL2? Not too long ago I wrote a blog about G.Fast, which is Alcatel-Lucent’s attempt to find a way to get more speeds out of legacy copper networks. In recent field tests ALU achieved a maximum speed of 1.1 Gbps over 70 meters and 800 Mbps over 100 meters for brand new copper. On older copper the speed dropped to 500 Mbps for 100 meters.

However, the G.Fast distance limitations are far shorter than those of ADSL2, and G.Fast is really more of a drop technology than a last mile technology; it would require a telco like AT&T to build a lot more fiber to get even closer to houses. You have to wonder if it makes any sense to rebuild the copper network to get up to 500 Mbps out of copper when fiber could deliver many gigabits.

There are other technologies that have been announced for copper. Late last year Genesis Technical Systems announced a scheme to get 400 Mbps out of copper using a technology they are calling DSL Rings. This technology would somehow tie 2 to 15 homes into a ring and bridge them with copper. Details of how the technology works are still a little sketchy.

In 2011 the Chinese vendor Huawei announced a new technology that will push up to 1 gigabit for 100 meters. This sounds very similar to G.Fast and sounds like a way to use existing copper within a home rather than rewiring.

There is one new technology that is finally getting wider use: bonded VDSL2 pairs that use vectoring. Vectoring is a noise cancellation technology that works much like noise-cancelling headphones, eliminating most of the crosstalk noise between bonded pairs of copper. Alcatel-Lucent hit the market with bonded pair VDSL2 in late 2011 that can deliver up to 100 Mbps, although in real deployments speeds are reported to be 50 Mbps to 60 Mbps on older copper. That is probably enough speed to give DSL another decade, although doing so requires a full replacement of older DSL electronics with VDSL2. One has to wonder how many times the telcos will keep upgrading their copper electronics to get a little more speed rather than taking the leap to fiber like Verizon did.
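For the technically curious, here is a toy illustration of the vectoring idea. This is not the algorithm vendors actually ship, just a sketch of how pre-subtracting an estimate of the crosstalk shrinks the residual noise:

```python
# Toy model of vectoring: cancel crosstalk between two copper pairs.
import numpy as np

rng = np.random.default_rng(0)
signal_a = rng.normal(size=1000)   # data signal on pair A
signal_b = rng.normal(size=1000)   # data signal on pair B
coupling = 0.20                    # assumed crosstalk coupling from B into A

# Without vectoring, pair A receives its own signal plus crosstalk:
plain = signal_a + coupling * signal_b

# With vectoring, the DSLAM estimates the coupling and injects the
# anti-noise ahead of time; estimation is never perfect:
estimated = 0.19
vectored = signal_a + (coupling - estimated) * signal_b

def residual_power(received):
    return np.mean((received - signal_a) ** 2)

print(f"crosstalk power without vectoring: {residual_power(plain):.4f}")
print(f"crosstalk power with vectoring:    {residual_power(vectored):.5f}")
# The residual interference drops by roughly (0.20/0.01)^2, i.e. ~400x.
```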

One only has to look at the growth rate of the data used in homes to ask how long copper can remain relevant. Within a couple of short decades we have moved from homes getting by on dial-up to homes finding a 20 Mbps connection too slow. Looking just a few years forward we see the continued growth of video sharing and a lot of new traffic from cellular femtocells and the Internet of Things. It may be hard to imagine today, but it won’t be long until people are bemoaning the inadequacy of their 50 Mbps connections. That day is coming, and it is probably not more than a decade away.

Categories: Improving Your Business, Technology

G.Fast

You are going to start hearing about a new technology that may infuse some life back into existing copper networks. The technology is being referred to as G.Fast, and it promises very fast speeds, up to a gigabit over copper, for very short distances.

Some are referring to G.Fast as a last mile technology, but it is really a drop technology. The distances supported by the technology are so short that this is going to require fiber to the curb, or as some are now calling it, fiber to the distribution point.

Alcatel-Lucent and Telekom Austria just announced a field trial of G.Fast. That trial achieved a maximum speed of 1.1 Gbps over 70 meters and 800 Mbps over 100 meters for brand new copper. On older copper the speed dropped to 500 Mbps for 100 meters.

Current copper technologies use only a small portion of the theoretical bandwidth available on a copper wire. For example, most VDSL2 systems deployed today use up to 17 MHz of spectrum on the copper. G.Fast provides more speed by using more of the available spectrum, somewhere between 70 MHz and 140 MHz. G.Fast will also be more efficient. Today’s DSL divides the data path into subchannels that each carry up to about 15 bits per symbol. Engineers are looking at coding and modulation techniques that will increase the bits per subchannel for G.Fast and thus increase speeds further. The sketch below shows roughly how spectrum and bits per subchannel translate into speed.
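Here is that arithmetic sketched out. The VDSL2 figures are standard DMT parameters; the G.Fast parameters are assumptions on my part, since the standard is not yet final:

```python
# Back-of-the-envelope DMT capacity: tones x bits-per-tone x symbol rate.

def dmt_capacity_mbps(spectrum_hz, tone_spacing_hz, bits_per_tone, symbols_per_sec):
    tones = spectrum_hz / tone_spacing_hz
    return tones * bits_per_tone * symbols_per_sec / 1e6

# VDSL2 17a profile: ~17.664 MHz of spectrum, 4.3125 kHz tone spacing,
# up to 15 bits per tone, 4000 DMT symbols per second.
print(f"VDSL2 raw ceiling: ~{dmt_capacity_mbps(17.664e6, 4312.5, 15, 4000):.0f} Mbps")

# G.Fast sketch (assumed): ~106 MHz of spectrum, wider 51.75 kHz tone
# spacing, 12 bits per tone, 48000 symbols per second.
print(f"G.Fast raw ceiling: ~{dmt_capacity_mbps(106e6, 51750, 12, 48000):.0f} Mbps")

# Prints roughly 246 Mbps and 1180 Mbps; real throughput lands well below
# these ceilings once noise, overhead and loop length are accounted for.
```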

G.Fast will also benefit from an existing technique called vectoring. This technology is used today with VDSL2 and eliminates crosstalk interference between copper pairs. It does this by monitoring the noise on the copper and then creating an anti-noise signal that cancels the noise, in the same way as noise-canceling headphones.

Right now Alcatel-Lucent is spending a lot of time on G.Fast because they see a big opportunity to make more money out of the old copper networks. So let’s look at the issues that a large telco like AT&T will face when considering the technology:

  • Because the distances to deploy G.Fast are so short, the carrier is going to have to build fiber past every customer, just like in a FTTH network. A large carrier like AT&T has some advantages over a fiber overbuilder in that they can overlash fiber onto existing copper on pole lines. This is cheaper and faster than putting up fiber for a new provider who has to deal with pole make-ready costs.
  • Copper drops are generally the worst copper in the network. These wires get banged around by wind, suffer repeated water damage, and are the weak point in the copper network. One of the promised savings from G.Fast is lowering the cost of installation at the customer, and some of that savings disappears if too many homes need a new drop to make it work.
  • G.Fast will save the cost of getting into the house. Once connected to the existing telephone NID on the outside of the house, the signal can go anywhere the home is already wired for telephone. But the distance issue quickly kicks in, and I would expect carriers to convert the signal to WiFi just inside the house.
  • Savings are going to depend on how inexpensive the G.Fast electronics are compared to FTTH electronics.
  • Large telcos have relied for years upon customer self-installation of DSL and they will need G.Fast to work the same way.

So the savings to somebody like AT&T come from a) cheaper fiber installation costs because of the ability to overlash, b) the ability in many cases to use existing drops and inside telephone wires, and c) the ability to have customers self-install the product to avoid having to go into the home.

There are still a lot of technical issues to consider and overcome. Some issues that come to my mind include things like overcoming existing splices in the copper, and making sure there is no interference with existing DSL.

The expected time line for the deployment of G.Fast is as follows:

  • Standards finalized by spring of 2014.
  • Chip sets developed in 2015.
  • First generation hardware available in 2016 that probably won’t support vectoring.
  • Mature second generation equipment available in 2017.

Since a carrier has to build fiber everywhere for this to work, the technology is really competing against FTTH. By the time this is readily available there may be lower-cost units for FTTH deployment and I think any carrier would prefer an all-fiber network if possible.

Categories: Current News, The Industry

The National Broadband Map

Seal of the United States Federal Communications Commission. (Photo credit: Wikipedia)

Last Thursday the FCC voted to take over the data collection for the National Broadband Map. The Map was created as part of the funding for broadband supplied a few years ago by the Stimulus package. The Map was created and administered by the NTIA (National Telecommunications and Information Administration) with input from the states, and that funding is now running out.

Back when the Map was suggested I thought the concept was a good one. But as soon as I saw that the data gathered for the Map was to be self-reported by carriers, I knew there were going to be problems. And sure enough, when the first generation Map was produced it was full of errors – big errors.

I work with a lot of rural communities and I have reviewed the Map in many areas of the country and compared it to the actual deployment of broadband. Some communities have developed their own maps – and they did it the hard way. They sent people around to see where broadband was available. A lot of this can be done by somebody who knows how to look up at the cables. It’s easy to know where cable modems are available by the presence of coaxial cable on the poles. And rural DSL generally has repeaters that can be spotted by the eagle-eyed observer. And it’s not hard to look at your cell phone to see how many bars of data you can get. But the best test of where broadband is available is knocking on doors and asking people what they are actually able to buy.

As an example of what I found, let me talk about the issues in just one county in Minnesota. The Map showed that most of the County had landline broadband available. The County is very typical of rural areas: the County Seat is the largest town, there are half a dozen much smaller towns, and everything else is rural. A large chunk of the rural area is a national forest where very few people live. In the rural areas most people live in close proximity to the roads.

The reality in this County is that even in several of the smaller towns the DSL is so slow that it is hard to think of it as broadband. It’s more like dial-up plus. And there was no cable modem service from the cable company outside of the County Seat. And as is typical with DSL, the quality quickly degrades with distance from the DSL hub as one goes outside the towns. We’ve always called this the donut effect: large areas with no broadband surrounding rural towns that have DSL and/or cable modems.

The Map also showed that almost every populated area of this Minnesota County had 3G wireless data available. It’s a very hilly and rugged place and probably half of the county by area can’t even get cellular voice calls, let alone data. But even where voice is available there are many areas that can’t get cellular data. The Map was just wrong about this.

Everywhere that I have helped communities look at the Map we have seen the same thing. The Map shows broadband that isn’t there. It shows cellular data coverage that isn’t there. And it often shows providers that are supposedly serving the counties that nobody ever heard of.

And this is not true for just rural counties. I have helped two suburban counties near large cities look at the Map and they found the same situation. The Map showed areas that are supposed to have broadband where their citizens still have dial-up or satellite. And cellular coverage was exaggerated on the Map.

An obvious question is why this matters. The National Broadband Map has been around for only a few years and anybody who has ever looked at it knows it is full of inaccuracies. The problem is that the federal government now relies on the Map for several purposes. For instance, if you want to get federal money by loan or grant to deploy rural broadband, the assumption is that the Map is good. It is then your responsibility to show where the Map is wrong.

And the FCC uses the Map when it talks about the availability of broadband in rural America. The Map has been overlaid with Census data to count how many households can get broadband. This produces a very distorted picture of who has broadband. There are pockets of people without broadband in even some of the most populated counties in the country and the Map simply misses them. And in rural areas the Map can be very wrong.

The FCC just took over responsibility for the Map. From my perspective they either need to do it right or get out of the mapping business. It’s not easy to get it right, but it can be done. One of the easiest steps they could take would be to give counties the authority to clean up the maps for their areas. Many of them would be glad to do that. And broadband availability is not static; areas gain or lose broadband all the time. If the FCC won’t take the time to get the Map right they should just let it die as another impractical idea.

Categories: Improving Your Business, Technology

Make it Faster

Cable modem Motorola SurfBoard for broadband internet (Photo credit: Wikipedia)

Whenever I look at my clients’ data products I almost always have the same advice – make it faster. I am constantly surprised to find companies that deliver small-bandwidth data products when their networks are capable of going much faster. I have come to the conclusion that you should give customers as much bandwidth as you can technically deliver, within any real technical constraints.

I know that networks are operated largely by engineers and technicians and very often I hear the engineers warn management against increasing speeds. They typically are worried that faster speeds mean that customers will use more bandwidth. They worry that will mean more costs with no additional revenue to pay for the extra bandwidth.

But the experience in the industry is that customers don’t use more data when they get more speeds, at least not right away. Customers do not change their behavior after they get faster data – they just keep doing the same things they were doing before, only faster.

Of course, over time, internet data usage is steadily increasing on every network as customers watch more and more programming on the web. But they are going to increase usage regardless of the speed you deliver to them as long as that speed is fast enough to stream video. Going faster just means they can start watching content sooner without having to worry about streaming glitches.

The engineers do have one valid point that must be taken into consideration: many networks have chokepoints. A chokepoint is any place in a network that can restrict the flow of data to customers. Chokepoints can be at neighborhood nodes, within your network backbone, at devices like routers, or on the Internet connection leaving your company. If your network is getting close to hitting a chokepoint you need to fix the issue, because data usage is going to grow independently of the speeds you give your customers. When I hear worry about chokepoints it tells me that the network needs upgrades, probably sooner rather than later.

Historically telecom companies were very stingy with data speeds. The first generations of DSL didn’t deliver speeds much faster than dial-up, and even today there are many markets that still offer DSL with download speeds of 1 Mbps. Then cable modems came along and upped speeds a little, with the first generation offering speeds up to 3 Mbps. And over time the telcos and the cable companies increased data speeds a little, but not a lot. They engaged in oligopoly competition rather than product competition. There are many notorious quotes from presidents of large cable companies saying that their customers don’t need more speed.

But then Verizon built FiOS and changed the equation. Verizon’s lowest speed product at launch was 20 Mbps, and it was an honest speed, meaning that it delivered as advertised. Many of the DSL and cable modem products at that time were hyped at speeds faster than the network could deliver. Cable modems were particularly susceptible to slowing down to a crawl at the busiest times of the evening.

Over time Verizon kept increasing their speeds and on the east coast they pushed the cable companies to do the same. Mediacom in New York City was the first cable company to announce a 50 Mbps data product, and today most urban cable companies offer a 100 Mbps product. However, the dirty secret cable companies don’t want to tell you is that they can offer that product by giving prioritization to those customers, which means that everybody else gets degraded a little bit.

And then came Google in Kansas City, which set the new bar at 1 Gbps. Service providers all over the country are now finding ways to offer 1 Gbps service, even if it’s just to a few customers.

I am always surprised when I find a company that operates a fiber network but does not offer fast speeds. I still find fiber networks all the time that have products at 5 Mbps and 10 Mbps. In all of the fiber-to-the-premise technologies the network is set up to deliver at least 100 Mbps to every customer, and the network provider chokes the speeds down to what is sold. It literally takes a flick of a switch for a fiber provider to change the speed of a home or business from 10 Mbps to 100 Mbps.
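Here is a sketch of what that flick of a switch amounts to. The function and profile names are hypothetical, since every vendor has its own provisioning system, but the point is that only a software rate profile holds the speed down:

```python
# Hypothetical sketch of changing a fiber customer's speed in software.

# Illustrative rate profiles in Mbps (download, upload) -- assumed values.
PROFILES = {
    "basic": (10, 2),
    "fast": (100, 20),
    "gigabit": (1000, 1000),
}

def set_rate_profile(ont_id: str, profile: str) -> None:
    """Pretend to push a new rate limit to a customer's ONT. In a real
    system this would be an API call or CLI session to the OLT; nothing
    physical about the fiber changes."""
    down, up = PROFILES[profile]
    print(f"ONT {ont_id}: rate limit now {down}/{up} Mbps")

# Upgrading a customer from 10 Mbps to 100 Mbps is one profile change:
set_rate_profile("ont-00123", "basic")
set_rate_profile("ont-00123", "fast")
```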

And so I tell these operators to make it faster. If you own a fiber network you have one major technological advantage over any competition, which is speed. I just can’t understand why a fiber network owner would offer speeds that are in direct competition with the DSL and cable modems in their market when they are capable of leaping far above them.

But even if you are using copper or coax you need to increase speeds to customers whenever you can. Customers want more speed and you will always be keeping the pressure on your competition.

Categories: Technology, The Industry

Is Wireless a Substitute for Wireline?

Last week in GN Docket 13-5 the FCC issued an update that asked additional questions about its planned transition of the historic TDM telephone network to an all-IP network. The docket asked for comments on several topics, such as a trial for transitioning the TDM telephone network to all-IP, a trial for moving to enhanced 911, and making sure that a switch to IP would not adversely affect the nationwide telephone databases.

But the docket also asks for comments on whether the FCC should grant telephone companies the right to substitute wireless phones for wireline phones and abandon their copper networks. The docket mentioned two companies that want to do this. Verizon said they intend to put wireless on Fire Island off New York City as they rebuild it from the devastation of Hurricane Sandy. And AT&T has told the FCC that they are going to request permission to replace “millions of current wireline customers, mostly in rural areas, with a wireless-only product”.

Let me explain what this means. There are now traditional-looking telephone sets that include a cellular receiver. To replace a wireline phone, the telephone company would cut the copper wires and put one of these cellular handsets in place of your existing phones. Nobody would be forced to get a cell phone; there would still be a telephone in the house, it would just work over the cellular network.

This makes good sense to me for Fire Island. It is mostly a summer resort and there are not many residents there in the winter. It’s a relatively small place, and with one or two cell towers the whole island could have very good coverage. And if the cell towers are upgraded to 4G there would be pretty decent Internet speeds available, certainly much faster than DSL. One also has to believe that the vast majority of visitors to the island bring along a cell phone and that there is no longer a giant demand for fixed phones.

It is AT&T’s intentions, though, that bother me a lot. AT&T wants to go into the rural areas it serves and cut the copper and instead put in these same cellular-based phones. This is an entirely different situation than Fire Island.

Anybody who has spent time in rural areas like I do knows the frustration of walking around trying to find one bar of cellular service to make or receive a call. Cell phone coverage is so good today in urban areas that one forgets that this is not true in many places. I have a client, a consortium of towns and the rural areas of Sibley and Renville Counties in Minnesota. Let me talk about my experience in working with them as an example of why this is a bad idea.

My primary contact works in the small town of Winthrop. I have AT&T cellular service and when I visit him my cellphone basically will not work. I sometimes can move around and find one bar and get a call through, but I can’t coax the phone to get a data connection so that I can check email. And if you go west from Winthrop the coverage gets even worse. AT&T’s coverage maps show that they serve this area, but they really don’t. There are places in the east end of Sibley County that have decent coverage. But there are also plenty of farms where you can get coverage outdoors, but you can’t get coverage in the house.

The traditional cellular network was not built to serve people, but rather cars. Cell phone coverage is so ubiquitous now that we already forget that cellular minutes used to be very expensive, particularly when you roamed away from your home area. The cell phone network was mostly built along roads to take advantage of that roaming revenue stream. If you happen to live near to a tower you have pretty decent coverage. But you only need to go a few miles off the main highway to find zero bars.

And I use the Renville / Sibley County client as an example for a second reason. The people there want fiber – badly. They have been working on a plan for several years to get fiber to everybody in the area. The area is a typical farming community with small hub towns surrounded by farms. The towns have older cable systems and DSL and get broadband, although much slower than is available in the Twin Cities an hour to the east. But you don’t have to go very far outside of a town to get to where there is no broadband. Many people have tried satellite and found it too expensive and too slow. There are many homes still using dial-up, and this is not nearly as good as the dial-up most of you probably remember. This is dial-up delivered to farms on old, long copper pairs, used to reach an Internet that has migrated to video and heavy graphics. Dial-up is practically useless for anything other than reading email, as long as you don’t send or receive attachments.

Over 60% of the people in the rural areas in Renville and Sibley Counties have signed pledge cards to say that they would take service if fiber can be built to them. One would expect this would translate to at least a 70% penetration if fiber is built. They refer to the project locally as fiber-to-the farm. There has been a cooperative formed to look at ways to get fiber financed. And any financing is going to require local equity, meaning the people in the County are going to have to invest millions of their own dollars in the project – and they are certain they can raise that money. That is how much they want the fiber. And this same thing is true in rural areas all over the country. Most of rural America has been left behind and does not have the same access to the Internet that the rest of us take for granted.

AT&T’s idea is only going to work if they make a big investment in new rural cell towers. The current cell phone network in rural areas is not designed to do what they are proposing, even for delivering voice. And even if the existing rural cell towers are upgraded to 3G or 4G data (which almost none have been), most people live too far from the existing towers to get any practical use from cellular data. Cellular data speeds are a function of how close one is to the tower and, just like with DSL, the speeds drop off quickly as you get away from it, as the toy model below illustrates.
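The model here is the standard log-distance path-loss formula with illustrative constants of my own choosing, not AT&T's coverage engineering:

```python
# Toy model of why cellular data fades with distance from the tower.
import math

def received_dbm(tx_dbm=46, freq_mhz=1900, dist_km=1.0, exponent=3.5):
    # Free-space loss to a 1 km reference point, log-distance beyond it.
    fspl_1km = 32.45 + 20 * math.log10(freq_mhz)
    extra = 10 * exponent * math.log10(max(dist_km, 1.0))
    return tx_dbm - fspl_1km - extra

for km in (1, 2, 5, 10):
    print(f"{km:>2} km: ~{received_dbm(dist_km=km):.0f} dBm")
# With a 3.5 path-loss exponent, every doubling of distance costs ~10.5 dB,
# and the achievable modulation rate (hence data speed) steps down with it.
```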

I hope rural America notices this action at the FCC and files comments. Because as crappy as the rural copper wires are today, when the wireline network disappears many rural households are going to find themselves without telephone service. And forget about fast rural data. The AT&T plan is really just a plan for them to abandon and stop investing in rural communities.
