The Industry

The Birth of the Digital Divide

A lot of the money being spent on broadband infrastructure today is aimed at solving the digital divide, which I define as a technology gap where good broadband is available in some places, but not everywhere. The divide can be as large as an entire county that doesn’t have broadband or as small as a pocket of homes or apartment buildings in a city that got bypassed.

I can clearly remember when the digital divide came about, and I remember discussing at the time how the obvious differences between technologies were someday going to become a major problem. Today I’m going to revisit the birth of the digital divide.

Until late in the 1990s, the only way for most people to get onto the Internet was dial-up access through phone lines. ISPs like AOL, CompuServe, and MSN flourished and drew millions of people online. At first, dial-up access was only available to people who lived in places where an ISP had established local dial-up telephone numbers. But the online phenomenon was so popular that ISPs eventually offered 800 numbers that could be reached from anywhere. There was no residential digital divide, except perhaps in places where telephone quality wasn’t good enough to accommodate dial-up. Some businesses connected to the Internet using a T1, which had a blazingly fast speed of 1.544 Mbps, nearly 28 times faster than dial-up. To people connecting at 56 kbps, a T1 sounded like nirvana.

The digital divide came into being when the faster technologies of DSL and cable modem were offered to homes. My first DSL line had a download speed of almost 1 Mbps, an amazing 18 times increase in speed over the dial-up modem. At almost the same time, some cable companies began offering cable broadband that also had a speed of around 1 Mbps. Homes in urban areas had a choice of two nearly-identical broadband products, and the early competition between telephone and cable companies was loud and fierce.

The advent of DSL created the first digital divide – the gulf between urban areas and rural areas. While telcos theoretically offered DSL in much of rural America, the roughly 2-mile reach of a DSL signal meant the speed didn’t carry far outside the towns that housed the DSL transmitters, called DSLAMs. Many telcos were willing to sell rural DSL even when speeds were barely faster than dial-up. Soon after the first DSL was offered to customers, vendors came up with ISDN-DSL, which could deliver speeds up to 128 kbps – twice the speed of dial-up – deeper into rural copper networks. But decent DSL never made it very far into most of rural America – and still doesn’t today.

The DSL and cable modem technologies improved within a few years of introduction, and those improvements created the second digital divide. I recall versions of DSL with maximum speeds of 3, 6, 12, 15, 24, and eventually 48 Mbps. The big telcos upgraded to the later DSL technology in some neighborhoods, but not others. Sadly, even today we continue to find places where the earliest versions of DSL are still offered, meaning there are places where DSL speeds never climbed above 3, 6, or 12 Mbps. This was particularly painful in towns without a cable competitor, because residents were stuck with whatever flavor of DSL the telephone company offered them. It was also noticeable in big cities, where some neighborhoods never saw any DSL upgrades. A well-known study done a number of years ago documented the DSL technologies available in Dallas, Texas. It showed that poor neighborhoods still had the slowest versions of DSL while more affluent neighborhoods had DSL speeds up to 50 Mbps.

Cable modem technology improved more quickly than DSL, and by 2005 the cable modem had won the speed game. That’s when the cable companies started charging more for cable broadband – something they could do because it was faster. This price difference largely meant that low-income households were stuck with DSL, while folks who cared about speed migrated over the years to the cable companies.

The digital divide in rural areas deepened as older DSL was not upgraded while the DSL that had originally been deployed started to reach end-of-life. Copper networks have lasted far past the expected economic useful life and get a little worse every year. In cities, any parts of the city stuck with only DSL fell far behind the neighborhoods where speeds increased significantly from both DSL and cable modems.

Unfortunately, we are not at the end of this story. A huge amount of fiber is being constructed today in urban areas, but there is no reason to think that most of the ISPs building it are going to serve every neighborhood. The big telcos that build fiber – Verizon, AT&T, Frontier, CenturyLink, and others – have always cherry-picked what they think are the best neighborhoods, best in terms of either demographics or the lowest cost of deployment.

Unless we reach a time when fiber is everywhere, the digital divide will stick around. Right now, we’re tackling the rural digital divide – I expect in 5 or 10 years we’ll have to do this all over again to tackle the urban digital divide.

What Customers Want

Why Offer Fast Data Speeds?

A commenter on an earlier blog asked a great question. They observed that most ISPs say customer usage doesn’t climb when customers are upgraded to speeds faster than 50 Mbps – so why does the industry push for faster speeds? The question was prompted by the observation that the big cable companies have unilaterally increased speeds in most markets to between 100 Mbps and 200 Mbps. There are a lot of different answers to that question.

First, I agree with that observation and I’ve heard the same thing. The majority of households today are happy with a speed of 50 Mbps, and a customer who already has enough bandwidth doesn’t immediately increase their downloading habits after an upgrade.

I’ve lately been thinking that 50 Mbps ought to become the new FCC definition of broadband, for exactly the reasons included in the question. This seems to be the speed today where most households can use the Internet in the way they want. I would bet that many households that are happy at 50 Mbps would no longer be happy with 25 Mbps broadband. It’s important to remember that just three or four years ago the same thing could have been said about 25 Mbps, and three or four years before that the same was true of 10 Mbps. One reason to offer faster speeds is to stay ahead of that growth curve. Household bandwidth and speed demand has been doubling every three years or so since 1980. While 50 Mbps is a comfortable level of home bandwidth for many today, in just a few years it won’t be.
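The growth curve described above is easy to project forward. Here is a minimal sketch, assuming the doubling-every-three-years rate cited in this post and an illustrative 50 Mbps baseline (both are round numbers, not measurements):

```python
# Project household bandwidth demand forward, assuming it doubles
# every three years (the historical rate cited in the post). The
# 50 Mbps starting point is an illustrative assumption.

def projected_demand_mbps(base_mbps, years_elapsed, doubling_years=3):
    """Compound the demand curve forward from a baseline speed."""
    return base_mbps * 2 ** (years_elapsed / doubling_years)

for years in (0, 3, 6, 9):
    print(f"+{years} years: {projected_demand_mbps(50, years):.0f} Mbps")
```

On that curve, a household comfortable at 50 Mbps today is looking for 200 Mbps six years from now – which is why an ISP that stands still falls behind.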

It’s also worth noting that some households need more than 50 Mbps because of the way they use the Internet. Households with multiple family members who all want to stream at the same time are the first to bump against the limitations of a data product. If ISPs never increase speeds above 50 Mbps, then every year more customers will bump against that ceiling and grow frustrated with it. We see good evidence that this is true in the customers who leave AT&T U-verse, capped at 50 Mbps, for faster cable modem broadband.

Another reason that cable companies have unilaterally increased speeds is to help overcome customer WiFi issues. Customers often don’t care about the speed in the room with the WiFi modem, but about what they can receive in a living room or bedroom several rooms away from the modem. A faster download speed doesn’t strengthen the WiFi signal, but it provides headroom – even after the signal weakens passing through internal walls, the throughput that survives is still adequate. The big cable companies know that increasing speeds cuts down on customer calls complaining about speed issues, and I’m pretty sure they would say that increasing speeds saves them money due to fewer customer complaints.

Another important factor is customer perception. I always tell people that if they have the opportunity, they should try a computer connected at gigabit speeds. A gigabit product ‘feels’ faster, particularly if the gigabit connection is on fiber with low latency. Many of us are old enough to remember the day we got our first 1 Mbps DSL or cable modem and got off dial-up. The increase in speed felt liberating, which makes sense because a 1 Mbps DSL line is nearly twenty times faster than dial-up and also has lower latency. A gigabit connection is twenty times faster than a 50 Mbps connection, and seeing it for the first time has that same wow factor – things appear on the screen almost instantaneously as you hit enter. The human eye is really discerning, and it can see a big difference between loading the same web site at 25 Mbps and at 1 Gbps. The actual time difference isn’t very much, but the eye tells the brain that it is. I think the cable companies have figured this out – why not give faster speeds if it doesn’t cost anything and makes customers happy?
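The raw numbers back up the point about perception. A back-of-the-envelope sketch, assuming a 3 MB web page (an assumed figure – real load times also include latency, DNS lookups, and rendering):

```python
# Raw transfer time for a typical web page at different speeds.
# Shows why the 25 Mbps vs 1 Gbps difference is under a second:
# the gigabit "wow factor" is mostly perception, not saved time.

def transfer_seconds(size_mb, speed_mbps):
    """Seconds to move size_mb of data at speed_mbps, ignoring overhead."""
    return size_mb * 8 / speed_mbps

for speed_mbps in (25, 50, 1000):
    ms = transfer_seconds(3, speed_mbps) * 1000
    print(f"{speed_mbps:>5} Mbps: {ms:.0f} ms")
```

At 25 Mbps the raw transfer takes under a second, and at a gigabit it takes a few dozen milliseconds – a difference the eye notices even though the clock barely does.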

While customers might not immediately use more broadband, I think increasing the speed invites them to do so over time. I’ve talked to a lot of people who have lived with inadequate broadband connections and they become adept at limiting their usage, just like we’ve all done for many years with cellular data usage. Rural families all know exactly what they can and can’t do on their broadband connection. For example, if they can’t stream video and do schoolwork at the same time, they change their behavior to fit what’s available to them. Even non-rural homes learn to do this to a degree. If trying to stream multiple video streams causes problems, customers quickly learn not to do it.

Households with fast and reliable broadband don’t give a second thought to adding another broadband application. It’s not a problem to add a new broadband device or to install a video camera at the front door. It’s a bit of a chicken-and-egg question – do fast broadband speeds promote greater broadband usage, or does the desire to use more applications drive the desire for faster speeds? It’s hard to know anymore, since so many homes now have broadband speeds from cable companies or fiber providers that are set faster than what they need today.

The Industry

Using Gigabit Broadband

Mozilla recently awarded $280,000 in grants from its Gigabit Communities Fund to projects that are finding beneficial uses of gigabit broadband. This is the latest set of grants, and the company has awarded more than $1.2 million to over 90 projects in the last six years. For any of you not aware of Mozilla, they offer a range of open-standard software that promotes privacy. I’ve been using their Firefox web browser and other software for years, and as an avid reader of web articles I use their Pocket app daily to track the things I’ve read online.

The grants this year went to projects in five cities: Lafayette, LA; Eugene, OR; Chattanooga, TN; Austin, TX; and Kansas City. Grants ranged from $10,000 to $30,000. At least four of those cities are familiar names. Lafayette and Chattanooga are two of the largest municipally-owned fiber networks. Austin and Kansas City have fiber provided by Google Fiber. Eugene is a newer name among fiber communities and is in the process of constructing an open access wholesale network, starting in the downtown area.

I’m not going to recite the list of projects; a synopsis of them is on the Mozilla blog. The awards this year share a common theme of promoting the use of broadband for education. They went mostly to school districts and non-profits, although for-profit companies are also eligible for the grants.

The other thing these projects have in common is that they are developing real-world applications that require robust broadband. For example, several of the projects involve using virtual reality. There is a project that brings virtual reality to several museums and another that shows how soil erosion from rising waters and sediment mismanagement has driven the Biloxi-Chitimacha-Choctaw band of Indians from the Isle de Jean Charles in Louisiana.

I clearly remember getting my first DSL connection at my house after spending a decade on dial-up. I got a self-installed DSL kit from Verizon and it was an amazing feeling when I connected it. That DSL connection provided roughly 1 Mbps, which was 20 to 30 times faster than dial-up. That speed increase freed me up to finally use the Internet to read articles, view pictures and shop without waiting forever for each web site to load. I no longer had to download software updates at bedtime and hope that the dial-up connection didn’t crap out.

I remember when Google Fiber first announced it was going to build gigabit networks for households – gigabit broadband brings that same experience. When Google Fiber announced the gigabit fiber product, most cable networks had maximum speeds of perhaps 30 Mbps, and Google was bringing more than a 30-times increase in speed.

Almost immediately we heard from the big ISPs, who denigrated the idea, saying that nobody needs gigabit bandwidth and that this was a gimmick. Remember that at that time the CEO of almost every major ISP was on the record saying that they provided more than enough broadband to households – when it was clear to users that they didn’t.

Interestingly, since the Google Fiber announcement the big cable companies have decided to upgrade their own networks to gigabit speeds and ISPs like AT&T and Verizon rarely talk about broadband without mentioning gigabit. Google Fiber reset the conversation about broadband and the rest of the industry has been forced to pay heed.

The projects being funded by Mozilla are just a few of the many ways that we are finding applications that need bigger broadband. I travel to communities all over the country and in the last year I have noticed a big shift in the way that people talk about their home broadband. In the past people would always comment that they seemed to have (or not have) enough broadband speed to stream video. But now, most conversations about broadband hit on the topic of using multiple broadband applications at the same time. That’s because this is the new norm. People want broadband connections that can connect to multiple video streams simultaneously while also supporting VoIP, online schoolwork, gaming and other bandwidth-hungry applications. I now routinely hear people talking about how their 25 Mbps connection is no longer adequate to support their household – a conversation I rarely heard as recently as a few years ago.

We are not all going to grow into needing gigabit speeds for a while. But the same was true of my first DSL connection. I had that connection for over a decade, and during that time my DSL was upgraded once, to 6 Mbps. Even that eventually felt slow, and a few years later I was the first one in my area using the new Verizon FiOS and a 100 Mbps connection on fiber. ISPs are finally facing the fact that households expect a lot of broadband speed. The responsive ISPs are meeting this demand, while others bury their heads in the sand and try to convince people that their slower broadband speeds are still all that anyone needs.

Technology The Industry

Do We Really Need Gigabit Broadband?

I recently read an article in LightReading titled “All That’s Gigabit Doesn’t Glitter.” The article asks whether the industry really needs to make the leap to gigabit speeds. It argues that the industry has other options that can satisfy broadband demand, but that telco executives get hooked on the gigabit advertising and want to make the gigabit claim. A few of the points made by the article are thought-provoking, and today I’ll dig deeper into some of those ideas.

The big question, of course, is whether telco providers need to be offering gigabit speeds, and it’s a great question. I live in a cord-cutter family, and I figure that my download needs vary between 25 Mbps and 50 Mbps at any given time (look for a blog soon that demonstrates this requirement). I can picture homes with more than our three family members needing more, since the amount of download speed needed is largely a function of the number of simultaneous downloads. And certainly there are people who work at home in data-intensive jobs who need far more than this.

There is no doubt that a gigabit is a lot more broadband than I need. If we take my maximum need of 50 Mbps, then a gigabit is 20 times more bandwidth capacity than I am likely to use. But I want to harken back to our broadband history, to the last time we saw a 20-fold increase in available bandwidth.

A lot of my readers are old enough to remember the agony of working on dial-up Internet. It could take as much as a minute at 56 kbps just to view a picture on the Internet. And we all remember the misery that came when you would start a software update at bedtime and pray that the signal didn’t get interrupted during the multi-hour download process.

But then along came 1 Mbps DSL. It felt like nirvana – 20 times faster than dial-up, nearly a T1 to our homes. And as millions quickly upgraded to the new technology, the services on the web upped their game. Applications became more bandwidth-intensive, program downloads grew larger, and web sites were suddenly filled with pictures that you didn’t have to wait to see.

And it took a number of years for that 1 Mbps connection to be used to capacity. After all, it was a 20-fold increase in bandwidth, and it took a long time before households downloaded enough simultaneous things to use it all. But over time the demand for web broadband kept growing. As cable networks upgraded to DOCSIS 3.0, the web filled with video, and eventually the 1 Mbps DSL connection felt as bad as dial-up had a decade before.

And this is perhaps the major point the article misses – you can’t look only at today’s usage when choosing the best technology. Since 1980 the download speed needed by the average household has doubled every three years. There is no reason to think that growth is stopping, so any technology that is adequate for a home today is going to feel sluggish in a decade and obsolete in two. We’ve now reached that point with older DSL and cable modems with speeds under 10 Mbps.

The other point made by the article is that there are technology steps between today’s technology and gigabit speeds. There are improved DSL technologies and G.Fast that could get another decade out of embedded copper and could be competitive today.

But it’s obvious that the bigger telcos don’t want to invest in copper. I get the impression that if AT&T found an easy path to walk away from all of its copper, it would do so in a heartbeat. None of the big companies has done a good job of maintaining copper, and most of it is in miserable shape. So these companies are not going to invest in G.Fast, although as a fiber-to-the-curb technology it would be a great first step toward modernizing their networks to all-fiber. CenturyLink, AT&T and others are considering G.Fast as a technology to boost speeds in large apartment buildings, but none of them is giving serious consideration to upgrading residential copper plant.

It’s also worth noting that not all companies with fiber bit on the gigabit hype. Verizon always had fast products on FiOS and for many years had the fastest speed in the industry at 250 Mbps. Only recently did they decide to offer a gigabit product.

And this circles back to the question of whether homes need gigabit speeds. The answer is clearly no, and almost everybody offering a gigabit product will tell you that it’s still largely a marketing gimmick. Almost any home that buys a gigabit would have almost the same experience on a fiber-based 100 Mbps product with low fiber latency.

But there is no reasonable technology in between telephone copper and fiber. No new overbuilder or telco is going to build a coaxial cable network, so there is no choice other than building fiber. While most homes might not need gigabit speeds today, give us a decade or two and they will grow into that speed, just as we grew from dial-up to DSL. The gigabit speed marketing is really not much different than the marketing of DSL when it first came out. My conclusion after thinking about this is that we don’t need gigabit speeds yet, but we do need gigabit-capable networks – and that is not hype.

The Industry

25 Years Since the First Web Site

Today I was reading and thinking about all of the different ways that various governments around the world are trying to somehow control and regulate content on the web. And while doing so I saw an article that said that we just passed the 25th anniversary of the first web site. That honor belongs to the CERN research facility in Switzerland. Tim Berners-Lee, a scientist there, posted the first web site on December 20, 1990. This first web page explained how the web worked and provided a few links on how to use the new World Wide Web.

Before web sites there was already a very active online community. These were the days of dial-up ISPs and bulletin boards. It’s easy today to complain about the price of broadband, but most of us have forgotten when you had to pay a monthly fee plus rates of between $1 and $6 per hour to gain access to online dial-up services.

This was all before AOL got big. At the time the predominant online services were Prodigy and CompuServe – the first services to offer a wide range of features, from email and chatrooms to gaming and news.

But the real fun online was to be had on the hundreds of different private servers throughout the country. Anybody with a computer and an interest in something could start their own service and their own community. I remember joining an online baseball fantasy league in the early 80s and playing trivia on a private server.

What is most amazing is how far we have come in just 25 years. At that time the fastest dial-up modems ran at 14.4 kbps, and the number of people active online was a few million at most. Since then we have gone through several major ages of the Internet. First was the AOL age, when AOL aggregated so much content that it killed most of its competitors and became the predominant way to get online.

Then came web pages and all of a sudden everybody was adding ‘content’ to the web. We all ‘surfed’ the web looking at the huge variety of pages that people would post. Both businesses and people created their own web pages. I can recall a period where I was amused by using a service that would show random web sites.

Web sites also unleashed e-commerce, and suddenly shopping online became a big thing. eBay was huge for a while until it was eclipsed by Amazon. This was followed by the age of social media and social interaction. Facebook obviously won the social media war, but in the early days there were many smaller, more specialized and fun social media sites. And online dating became a big phenomenon.

Today we have entered the age of web video and the hours spent watching video on the web eclipses everything else being done.

It’s easy to think of the web as some sort of fixed thing, but the fact is that there have been major changes every few years since 1990 and there was always something new being done or something new being trendy. Kids today would die if they had to endure the web experience of the mid-90s.

During each of these various phases of the web there have been major issues that everybody was concerned about. For every new innovation that came along something negative was also introduced. We’ve suffered through piracy, spam, hackers, trolls, and viruses at various stages of the web’s development.

This flood of memories brings me back to my original thought about regulators trying to control what happens on the web. Mostly I find the idea amusing, because whatever it is they think they are regulating will change faster than any laws they can formulate – their efforts are always going to be a few years behind web innovation. The Internet genie is out of the bottle and I can’t imagine any way to put it back inside.

The Industry

Dial-up is Still Around

Since most people in the country can get some form of broadband, many assume that dial-up is dead. We all remember those days of trying to connect to a modem and listening for the beeps and boops. But I looked, and there is still a significant dial-up business in this country.

At the end of 2014 AOL still claimed to have 2.3 million paying dial-up customers. That is obviously way down from the company’s peak of around 26 million customers, but it’s still a very impressive number. AOL said those customers account for $155 million in revenue, which still exceeds the company’s next biggest revenue source, advertising, at $144 million.

AOL is not the only one still in the business. Some other big names from the past, like EarthLink and NetZero, are still around. EarthLink advertises that it has the most dial-in numbers in major markets – 50 in Miami and 45 in San Diego, for example. Then there are dial-up companies you have probably never heard of, such as Basic ISP and Turbo USA. Finally, many telephone companies like AT&T still offer dial-up, and a surprising number of my smaller telco clients also still operate small pockets of dial-up customers.

It’s hard to get industry figures since most of these companies don’t publish their customer counts, but if AOL still has 2.3 million customers then nationwide there must be more than 4 million households still using dial-up. The FCC says that about 2% of households are still on dial-up, but AOL alone accounts for roughly 2%.

Dial-up has gotten better than most of us remember thanks to compression techniques, where the ISP compresses whatever is being sent to the dial-up customer. But it’s still agonizingly slow compared to other broadband – the realized speed of dial-up is still capped at 56 kbps on good copper, and much of the copper that is left is not very good. With compression, dial-up can appear to be about twice that base speed.
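The effect of compression is easy to demonstrate. Here is a small sketch using Python’s zlib – the sample page and the resulting ratio are illustrative only; real web text typically compresses two- to three-fold, while already-compressed images barely shrink at all:

```python
# Why compression makes dial-up feel faster: if the ISP compresses
# data before sending it over the 56 kbps line, effective throughput
# rises by the compression ratio. Sample text is illustrative only.
import zlib

LINE_KBPS = 56  # nominal dial-up line rate

def effective_kbps(payload, line_kbps=LINE_KBPS):
    """Apparent speed when the payload is sent zlib-compressed."""
    compressed = zlib.compress(payload, level=9)
    return line_kbps * len(payload) / len(compressed)

# Plain HTML compresses well; a JPEG would see almost no gain.
page = ("<p>Broadband news and commentary for rural America.</p>\n" * 400).encode()
print(f"effective speed for this page: ~{effective_kbps(page):.0f} kbps")
```

The gain depends entirely on the content: text sails through faster, while images and video see almost no benefit, which is why dial-up still feels hopeless on today’s media-heavy web.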

The low speeds keep dial-up customers relegated to very basic Internet functions such as email. Browsing the web can be incredibly slow, since many websites now include advertising and video and take a long time to load. Shopping on the web is now very image-oriented, which can also be too slow at dial-up speeds. And obviously dial-up households can’t get streaming video of any kind, since that requires anywhere from a steady 1 Mbps at the lowest quality up to 6 – 8 Mbps for HD video, and even more for the new 4K video.

So who is still using dial-up? There appear to be three distinct communities. First are people everywhere who barely use the Internet and want the cheapest connection possible; they don’t do much more than check email and handle basic tasks. Second are immigrant communities, where one would suppose the low price is also important.

Finally, there are rural people who have no other alternative except maybe satellite. For those who have never used it, satellite broadband is not a great product. It’s very expensive, with base plans between $60 and $80 per month. It is faster than dial-up, but it has latency issues that make it hard to use for real-time purposes such as web voice or streaming video. It also comes with low, strict ceilings on monthly data usage: WildBlue has a monthly cap of 17 GB in total downloads, HughesNet 20 GB, Exede 25 GB, and Dish 30 GB.

One would think that if AT&T really is able to cut down millions of rural copper lines like it wants to, a lot of dial-up customers will disappear. All of the rural households that use dial-up today as their most affordable option will end up with either satellite or cellphone data plans.

Writing this blog made me pause to marvel at how fast our technologies change and grow. The heyday of dial-up was only twenty years ago, and we have come so far since then. We think of dial-up as something ancient, and yet twenty years is nothing in terms of mankind’s history. But in that very short time we have gone from having over half of the country on dial-up to seeing some cities connected at gigabit speeds. I remember when I was on dial-up and envied a few friends who were on a shared T1 at their office. I would have called somebody crazy if they had said then that within twenty years people would be able to get a gigabit at their house.

The Industry

Those Damned Statistics

One of my biggest pet peeves in life is the misuse of statistics. I am a math guy and sometimes tackle math problems just for the fun of it. I understand statistics pretty well, and my firm performs surveys. I think I disappoint a lot of my clients when I try to stop them from interpreting survey results to prove something the responses really don’t prove. Surveys are a really useful tool, but too often I see survey results used to support conclusions they can’t support.

A week ago the NTIA (National Telecommunications and Information Administration) released their latest poll looking at broadband usage in the US. The survey asked a lot of good questions and some of the results are very useful. For example, they show that overall broadband penetration in the US is up to 72% of households. But even that statistic is suspect, as I will discuss below.

The problem with this survey is that it didn’t ask the right questions, and this largely invalidates the results. The emphasis of this particular survey was on how people use cellphones for data access, so it asked about the various activities people now use their phones for, such as browsing the web or email. And as one would expect, more people are using their cellphones for data, largely due to the widespread adoption of smartphones over the last few years.

There is nothing wrong with any of the individual results. For example, the report notes that 42% of phone users browse the web on their phone, compared to 33% in 2011, and I have no doubt that this is true. It’s not the individual statistics that are a problem, but the way the statistics were used to reach conclusions. Reading this report, one gets the impression that cellphone data usage is just another form of broadband and that browsing the web on your phone is more or less the same as browsing over a wired broadband connection.

The worst example of this is in the main summary, where the NTIA concluded that “broadband, whether fixed or mobile, is now available to almost 99% of the U.S. population”. This implies that broadband is everywhere, and with that statement the NTIA is basically patting itself on the back for a job well done. But it’s a load of bosh, and I expect better from government reports.

As I said, the main problem with this report is that it didn’t ask the right questions, so the responses can’t be trusted. Consider data usage on cellphones. In the first paragraph the report concludes that data usage on cellphones has increased exponentially and is now deeply ingrained in the American way of life. The problem I have with this conclusion is that it implies cellphone data usage is the same as landline data usage – and it is not. The vast majority of cellphone data is consumed over WiFi networks at work, at home, or at public hot spots. Yes, people are using their cellphones to browse the web and read email, but most of that usage is carried over a landline connection, with the smartphone just the screen of choice.

Cellular data usage is not growing exponentially, or perhaps only barely so. Sandvine measures data usage at all of the major Internet POPs, and their numbers show cellular data growing at about 20% per year, which works out to doubling roughly every four years, while landline data usage is doubling every three years. I trust the Sandvine data because they look at all of the usage that comes through the Internet and not just at a small sample. The cell carriers have trained us well to go find WiFi. Sandvine shows that, on average, a landline connection today uses almost 100 times more data than a cellphone connection. This alone proves that cellphones are no substitute for a landline.
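The doubling-time arithmetic behind these growth rates is easy to check with the standard compound-growth formula. A quick sketch (the 20%-per-year and three-year figures are the Sandvine numbers above; the code is just the math, not anything from the report):

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for usage to double at a compound annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# Cellular data growing at ~20% per year doubles in just under four years
print(round(doubling_time(0.20), 1))  # 3.8

# Landline data doubling every three years implies this annual growth rate
landline_rate = 2 ** (1 / 3) - 1
print(round(landline_rate * 100))  # 26 (percent per year)
```

So landline usage growing at roughly 26% per year is outpacing cellular at 20%, and the gap compounds every year.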

I have the same problem with the report when it quantifies the percentage of households on landline broadband. The report assumes that anybody with a cable modem or DSL has broadband, and we know that for large parts of the country having a connection is not the same thing as having broadband. The report doesn’t count dial-up as broadband, but when it says that 72% of households have landline broadband, what it really means is that 72% of homes have a connection that is faster than dial-up.

I just got a call yesterday from a man on the eastern shore of Maryland. He lives a few miles outside of a town and has a 1 Mbps DSL connection. The people a little further out than him have even slower DSL or can only get dial-up or satellite. I get these kinds of calls all the time from people wanting to know what they can do to get better broadband in their community.

I would challenge the NTIA to go to rural America and talk to people rather than stretching the results of a survey to mean more than they do. I would like them to tell the farmer who is trying to run a large business with only cellphone data that he has broadband. I would like them to tell the man on the eastern shore of Maryland that he and his neighbors have broadband. And I would like them to tell all of the people who are about to lose their copper lines that cellular data is the same as broadband. Because in this report that is what they have told all of us.

The Industry

Broadband Map of the US

Consider the following map of US broadband, compiled by Gizmodo using data on broadband usage gathered by Ookla. This is a rather different map than the official US Broadband Map generated by the FCC. The official map uses data that is self-reported by the carriers, while this map was created by sampling and testing actual Internet connections. Ookla runs speed tests on millions of connections from all across the country at all times of day.

This map is at a fairly high level, shown per congressional district, but more detailed maps are available at the state and county level.

This map shows that there is a wide disparity of broadband speeds around the country. One finding that surprised me is that the average Internet connection is now at 18.2 Mbps download. The map shows areas faster than average in blue and slower than average in red. The 18.2 Mbps number is faster than I expected and goes to show that carriers around the country have been increasing speeds. It is certainly faster than the speeds reported by other sources.

When you look deeper than this map at the broadband statistics you see a lot of what you would expect to see. Urban areas generally have faster broadband than rural areas. And the Verizon FiOS areas have much faster broadband than other parts of the country.

And this map shows some areas with fast broadband that might surprise people. For example, North and South Dakota have faster than average broadband. This is because the states are largely served by independent telephone companies that have built fiber into small towns and rural areas. And central Washington has some of the fastest broadband in the country thanks to several municipal networks that have built fiber-to-the-home.

One thing the map doesn’t show, at this high level, is that there are pockets of fast Internet scattered in many places. There are FTTH networks built in many small towns but these towns are not large enough to skew the data for the larger congressional districts shown on this map.

One state with high broadband is Florida, where I live; I have speeds available up to 104 Mbps from Comcast. The map for Florida shows what the cable companies are capable of, and it’s a shame they have not upgraded their networks to be this fast in more places.

Ookla reports that the fastest town in the US is Ephrata, Washington, with an average download speed of 85.5 Mbps. Second is Kansas City at 49.9 Mbps. One would assume that, with gigabit service, Kansas City will become the fastest place as more people are added to Google Fiber.

One thing the map shows is that an awful lot of the country is below the average. Ookla reports that the slowest places are Chinle and Fort Defiance in Arizona, which both have an average speed of less than 1.5 Mbps. These towns are on the Navajo reservation, and many Native American communities are woefully underserved. The map shows large swaths of poorly served areas: West Virginia and Kentucky in Appalachia, north Texas and Oklahoma, Wyoming and Montana, and Maine.

I know that at my house I have 50 Mbps cable modem service, and to me it feels just right. It allows us to watch multiple streaming videos while also working and using computers for online gaming. I just moved from a place where my speeds would bounce between 10 Mbps and 20 Mbps, and I can see a big difference. In my line of work I talk to people all the time in rural areas who are still stuck with only dial-up or satellite as their broadband options. I know I could not do my job from such areas. These areas probably don’t even show up on this map, because people with connections that slow probably aren’t running speed tests very often. They know they are slow.

The good news to me from this map is that the average speed in the US is up to 18 Mbps. But the bad news is that there are so many large areas left without good broadband. We still have a lot of work to do.


