Categories: Current News

Aereo to Get Their Day in Court

The various Aereo cases have finally worked their way up to the Supreme Court, which will hear the case on April 22. There have been a number of District Court findings on Aereo, some for and some against the company, prompting the Supreme Court to resolve the conflicting rulings.

Aereo is now a little over two years old and in that time has stirred up a lot of controversy. Aereo offers a package of network programming and web programming delivered directly to customers' devices such as a tablet, computer or cell phone. They have been very successful in the market so far and recently announced that they have sold all of the capacity they have in several markets, like New York City and Baltimore.

But the reason they are so controversial is that they have assembled their package without paying retransmission fees to the networks. Retransmission fees are the fees paid to the major networks – ABC, CBS, FOX and NBC – for the right to carry their programming. These fees are gigantic. In most markets the cost to carry a local affiliate station is now in the range of $2 per customer per month. And that is a remarkable figure, since as recently as five years ago most cable companies paid nothing to carry these networks.

These fees are a significant part of the reason why cable rates are climbing so quickly. At $2 per month for each of the four network channels, these fees add $96 per year to a cable subscriber's bill. And nationwide these fees drive over $7 billion per year in revenues.
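Just to show the arithmetic behind that $96 figure, here is a quick back-of-the-envelope sketch, using the ballpark $2 rate cited above rather than any exact contract price:

```python
# Back-of-the-envelope retransmission cost per cable subscriber.
# The $2 fee is the ballpark market rate cited above, not an exact contract price.
fee_per_channel = 2.00   # dollars per customer per month
networks = 4             # ABC, CBS, FOX and NBC
months = 12

annual_cost = fee_per_channel * networks * months
print(f"Annual retransmission cost per subscriber: ${annual_cost:.0f}")  # $96
```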

Aereo found a way around retransmission fees. The networks all use public spectrum, and in exchange for the use of that spectrum they broadcast their shows through the air for anybody with a set of rabbit ears to receive for free. Aereo uses a technology that dedicates a separate set of rabbit ears (actually a tiny one-inch antenna) to each customer. They receive the programming at a centralized hub and then send it to the customer's devices.

Customers love this. The one thing you can't get when you try to wean yourself off cable is a live version of the network shows. People by the millions have dropped cable or downsized their cable packages. At home such customers can get programming on their TVs for free with an antenna, but there is no easy way for them to get it onto tablets and smartphones, which have become a very popular way of watching TV.

I like what Aereo is doing. Certainly as a customer I think this is a good thing. If I can receive this programming for free at my house then I feel I ought to have the right to pay somebody to help me get that free programming onto my iPad. That is really all that Aereo is doing. I certainly am offended that the network channels want me to pay $96 per year for programming that I can get with a $50 set of rabbit ears.

Some network execs have said that if Aereo wins at the Supreme Court they will pull their channels off the free airwaves and only sell them through cable TV. I have a hard time believing that any network would have enough hubris to do this. I am picturing a big political backlash when the networks that carry local sports and the local NFL teams disappear from the airwaves.

The networks are making huge gobs of money today, and while the money from retransmission fees is making them fatter, they still derive most of their revenues from advertising. One would have to think that their advertising model would change drastically if they were to suddenly become just another of the many cable channels. Personally, I hope one of them tries this and then loses their ass and their customer base, because we would all benefit from using the spectrum they would vacate.

Categories: Current News, Technology

Security for the Internet of Things

We are quickly headed towards the Internet of Things, where billions of devices will be connected to the web. The biggest challenge in making this a reality is figuring out how to make the IoT secure. The world today is full of hackers. There are those who hack for financial gain. There are cyberwars where government-sponsored hackers launch major attacks. And there are general hackers who do it just for the fun of creating mischief.

Today web security is a cat-and-mouse game between hackers and security experts. Our PCs need almost daily updates to fight newly discovered viruses designed to get around the virus-checking programs.

The biggest challenge we face is that most of the devices being connected will not have the computing power of laptops and tablets. Instead we will have thermostats and smoke detectors and security cameras and medical monitors all connected to our home networks. These devices have very rudimentary computing power, meaning that our current methods of security can't be used to protect them.

But protect them we must, because harming these devices can cause real-world damage. Imagine if during the latest arctic vortex some hacker had turned off millions of thermostats and furnaces. This could have caused widespread problems, large dollar damages and even deaths. I don't even want to think about what might happen if somebody hacks into people's medical devices. Perhaps murder by hacking? As we tie more and more of our daily life into devices that are connected to the web, we need to find solutions for protecting them.

And hackers are already starting to take notice of the weaknesses in these devices. In Brazil over 4.5 million DSL routers were hacked by people looking for credit card and banking information. There is a piece of malware called DNS Changer that attacks home routers in the US. And there are already worms attacking things like security cameras and other embedded devices.

Security experts are working on the problem and there are several thoughts on the best way to keep our devices safe.

Safer Firmware. Most devices run on software called firmware. The security idea is to put this software into a part of the chip that cannot be addressed externally. Basically, code the chip and throw away the key.

Cloud Security. Another idea is to limit each device to communicating with only one source, such as a specific cloud (see the sketch after this list). This feels like a big-company fix, and it's a bit scary, because if somebody can break into the cloud they have access to all of the machines that talk to it.

Government Fines. Today there is nearly zero security even considered by companies building IoT devices. They use old versions of open source Linux and put zero effort into making their devices safe. The thought is to impose big fines on manufacturers of IoT devices that get hacked as an incentive for them to do better.
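To make the "one source only" idea concrete, here is a minimal sketch of the kind of egress check a device's firmware might enforce. This is an illustration only; the hostname is a hypothetical placeholder, not any real product's cloud.

```python
# Minimal sketch: firmware-level egress check that only permits the home cloud.
# The hostname is a hypothetical placeholder, not a real service.
import socket

ALLOWED_HOST = "devices.example-cloud.com"

def open_connection(dest: str, port: int) -> socket.socket:
    # Resolve the one approved endpoint and compare it to the requested one.
    allowed_ips = set(socket.gethostbyname_ex(ALLOWED_HOST)[2])
    dest_ip = socket.gethostbyname(dest)
    if dest_ip not in allowed_ips:
        # Anything that is not the home cloud is refused outright.
        raise PermissionError(f"egress to {dest} blocked")
    return socket.create_connection((dest_ip, port), timeout=10)
```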

We have to fix this, or else there are going to be some really huge examples of device hacking that scare the public away from the IoT. As we tie more and more of our lives into our networks, we all need to know that we are safe from those with malicious intent.

Categories: Technology

Primer on Computer Viruses

Anybody with a computer knows that there is a host of computer viruses that can cause them problems. Most of us just load some sort of virus-checking software onto our computers and then hope not to get infected. But few of us have ever read much about the different kinds of viruses. So I offer a short description of the most common kinds of viruses attacking us each day. This isn't an exhaustive list, and it doesn't include other malware like worms or trojan horses.
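Before the list, it helps to know how virus-checking software finds these things at all. At its simplest, a scanner looks for known byte patterns, called signatures, inside files. Here is a toy sketch of that idea; the signatures are made up for illustration, and real scanners are far more sophisticated:

```python
# Toy signature scanner: flags files containing known byte patterns.
# These signatures are made-up placeholders, not real virus signatures.
SIGNATURES = {
    "example-virus-a": b"\xde\xad\xbe\xef",
    "example-macro-virus": b"EVIL_MACRO_V1",
}

def scan_file(path: str) -> list[str]:
    with open(path, "rb") as f:
        data = f.read()
    # Report every signature found anywhere in the file
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

# e.g. scan_file("download.exe") -> ["example-virus-a"] if that pattern is present
```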

File Infector Virus. These are probably the most common viruses. They attach themselves to key root files on a computer, the kinds of files needed to make the computer boot and become operational. When the infected file is opened the virus is activated and can do almost anything imaginable inside the computer. These viruses may overwrite all or parts of the host file so that the virus is executed whenever the file is executed.

Browser Hijacker. This is the virus that mostly gets onto machines when the user opens an infected email or file. These viruses then typically redirect people to specific websites, presumably to increase web hits.

Macro Virus. These viruses exploit the macro functions of programs like Microsoft Word and Excel. Those programs have very powerful macro tools that allow for sophisticated manipulation of data and files. The version of a macro virus that almost everybody has noticed is the one that infects Microsoft Outlook and then sends a spam email to everybody in the address book.

Boot Sector Virus. These were some of the first viruses developed. They take root in the boot sector, the key files needed to start a computer. I remember a time when these got onto machines through loading an infected floppy disk. There is one story (possibly an urban legend) that claims a boot sector virus infected a major government agency after an employee loaded a flash drive they had picked up in the parking lot.

Web Scripting Virus. These are viruses that activate when you visit an infected website. More often than not the website has been hacked without the knowledge of its owner.

Polymorphic Virus. Polymorphic viruses morph over time and change each time they are activated. Normally this function is used to relocate the virus onto a different part of the computer to avoid detection. These viruses are good at evasion and are the hardest ones to eradicate.

Resident Virus. This class of virus embeds itself in a computer’s memory and is no longer associated with any specific file. This is the most common type of virus used when somebody is trying to spy on your keystrokes or hijack your computer in some way.

Direct Action Virus. These viruses only activate when a specific file is executed. I remember a very early version of this back in the MS-DOS days that formatted the entire hard drive and wiped out everything on it. These viruses are somewhat rare now since they are hard to disseminate.

Multipartite Virus. These are the mack daddy of viruses: they combine multiple forms of virus from the list above, generally so that the virus can survive on a machine. They have multiple components and functions, can be delivered to a computer in multiple ways, and can spread in multiple ways. They can also take different actions on different computers based upon what they find.

Categories: Current News, The Industry

Is Peering the End of Network Neutrality?

Netflix and Comcast have announced a deal whereby Netflix will pay to peer with the Comcast network. Numerous articles popped up yesterday talking about how this is the end of network neutrality, but I am not so sure about that. To understand why, let me talk a bit about how peering works today. Peering is when two networks decide to make a direct connection between themselves rather than connecting in the more traditional way through the open Internet.

There are two kinds of peering connections typically made. One is local peering, where two networks that are geographically close decide to exchange data traffic. This typically benefits both parties; let's look at an example of why. Assume the two parties are medium-sized carriers, a telephone company and a cable company competing in the same community. There is always a considerable amount of Internet traffic that stays within a community. People browse the websites of stores in their own community. People do online banking with local banks. People work at home and want to get data into and out of their employers' local networks.

Normally each of these carriers would deliver traffic between their two networks, say between a customer on one network and a bank on the other, by sending it to the open Internet. Each company will have a connection to the Internet through some wholesale provider, terminating eventually at one of the major Internet POPs like Chicago or Dallas. And so when a customer wants to connect with his bank, the data travels out through the first network to the major POP, where it is handed off to the data stream going back to the second network.

Such a connection is said to make at least several hops, a hop being each time the message is handled by a router somewhere in the network to figure out where it is going. The more hops, the slower the connection. Local peering solves this problem because the traffic is exchanged locally and goes straight from one carrier to the other without first being sent to some distant POP. This is a simplistic description, because peering arrangements are usually more complicated than this; they are more likely to be between the underlying transport carriers that handle the traffic for the telephone company and the cable company. But peering will make the connection more direct than it would be under normal network circumstances.
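You can count hops for yourself with a traceroute, which lists every router a packet crosses on the way to a destination. A quick sketch, assuming the standard traceroute utility is installed (as on most Linux and macOS systems):

```python
# Count router hops to a host using the system traceroute utility.
# Assumes traceroute is installed (standard on most Linux/macOS systems).
import subprocess

def count_hops(host: str) -> int:
    result = subprocess.run(
        ["traceroute", "-n", host],
        capture_output=True, text=True, check=True,
    )
    lines = result.stdout.strip().splitlines()
    return len(lines) - 1  # the first line is traceroute's header

# A locally peered destination should show far fewer hops
# than one reached through a distant Internet POP.
print(count_hops("example.com"))
```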

The other kind of peering is done to save money. I have many clients who peer with Google, because Google and all of its various subsidiaries account for a significant percentage of the traffic on any Internet connection. My clients have done the math and see that it is cheaper to make a direct connection with Google than to pay their underlying carrier to haul the traffic to Google. Anybody who peers with Google this way must pay out of their own pocket to reach a Google POP, probably including paying for the equipment at the POP needed to make the connection. But this kind of peering often results in significant savings. Most people's traffic with Google is very much one-directional; there is usually a lot more traffic coming from Google than going to it.
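The math those clients do looks roughly like the sketch below. Every number in it is a hypothetical placeholder, since real transit and transport prices vary widely by market:

```python
# Rough peering breakeven math. Every number below is a hypothetical
# placeholder; real transit and transport prices vary widely by market.
google_traffic_mbps = 2_000        # average traffic exchanged with Google
transit_price = 1.50               # $/Mbps/month paid to the wholesale carrier
transport_to_pop = 1_800           # $/month for a circuit to the Google POP
port_and_equipment = 400           # $/month amortized equipment at the POP

transit_cost = google_traffic_mbps * transit_price
peering_cost = transport_to_pop + port_and_equipment

print(f"Via transit:  ${transit_cost:,.0f}/month")   # $3,000/month
print(f"Via peering:  ${peering_cost:,.0f}/month")   # $2,200/month
print("Peering saves money" if peering_cost < transit_cost else "Stick with transit")
```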

We don't have the details of the Comcast/Netflix deal, so we can't be certain what the arrangement is. But it's clear that up until now the two sides had not agreed to a direct peering arrangement. One has to assume that the traffic from Netflix is nearly all in one direction: video downloaded to customers who sit on the Comcast network. Without a direct peering arrangement, the traffic must get to Comcast through intermediate carriers and often would be routed in ways that slow it down, as is any traffic on the open Internet.

I would assume that there is not one big Comcast network, but instead pockets of Comcast all over the country. For Netflix to fully peer with Comcast, they are going to have to make connections to these various pockets, all at Netflix's cost. And if this is normal peering, Netflix would also be expected to pay for the connections into the Comcast network, including owning or somehow paying for the large amount of equipment needed to terminate their traffic.

Again, the two sides aren't talking about the details. But I would expect it to cost Netflix something to get their traffic directly to all parts of the Comcast network; that is how normal peering works. The line of network neutrality will have been crossed if Netflix has to pay a lot more for this connection than what others pay. But since this deal has been under negotiation for a year, one has to assume that both parties had the old network neutrality rules in mind as it was negotiated. I can certainly envision an arrangement that is more like normal peering than a big violation of the principles of network neutrality. If it were the latter, I would expect Netflix to be putting up a big stink. Network neutrality benefits companies like Netflix tremendously, and if they aren't complaining then there is a good chance that this is peering as normal and not a giant money grab by Comcast.

Categories: Technology, The Industry

The Evolution of Cellular

There are several big changes on the horizon that are going to really impact cellular networks. One change is transformational, one solves some local network issues, and the third, probably the least important one, will get all of the press.

The transformational change is that technology is being developed that will allow the industry to centralize the brains and computing functions of the network. Today there are nearly 200,000 cell towers in the US, and each one requires a full set of switching electronics. Much of the smarts of the cellular network sits at these cell sites. That makes the cellular network somewhat unique, in that most other types of networks have been able to centralize their brains and computing power into hubs rather than leave everything at the edge.

There are several groups now working on ways to start migrating the brains of cell sites back to regional data centers. Some people have called this moving the cellular network into the cloud, but that is not really a great description of it. Rather, this is a migration of computing and processing power from the edge back to a core, as has happened with all other kinds of networks. The cable industry called this migration 'headend consolidation' when it created huge headends that can serve millions of customers.

This will be a transformational change, because today it costs a literal fortune for a cellular carrier to implement a technology upgrade, since they have so many cell sites. And this matters because upgrades are hitting the industry at a fast and furious pace. With a centralized cellular network, a cell company could upgrade the core software and electronics at only a few hubs, since the cell sites would become little more than transmitter sites with very few brains.

The second big change is that in 2014 we are seeing the cell industry adding tens of thousands of small cell sites. For a few years there have been network extenders called femtocells, but now the vendors in the industry have developed mini cell sites that are not a whole lot more than a cell site on a card. These small sites don’t have the same power as a full cell site, but they can be placed in areas where there is currently network congestion.

These small cell sites can be deployed in stadiums, downtown districts, convention centers and commuting corridors to provide extra call and data capacity where it is most needed. For example, I have a friend who I talk to regularly during his morning commute and I always lose him when he is crossing the 14th Street Bridge into DC. These mini cell sites ought to be able to fix the holes and dead spaces in the existing cellular network.

Finally, there is the change that will get all of the hype. Rumor has it that one or more of the cellular companies are going to start talking about 5G cellular networks this year. As I have discussed in the past, there are not even any networks today that are close to being able to call themselves 4G networks. The 4G standard begins with the ability for a cell site to deliver 1-gigabit data speeds, and there aren't any sites today that can do 1/20th of that speed.

Sprint and T-Mobile coined the term 4G to promote some incremental enhancements to their HSPA+ and LTE networks. And so 4G was used as a marketing phrase to distinguish their technology from the competition. Then the whole industry followed suit, and we now have 3G and 4G phones that use the same networks and have essentially the same capabilities.

There are dozens of little improvements to cellular technology being developed in vendor labs, and every time a new little tweak makes speeds a little faster or somehow enhances the customer experience, the cell companies have been itching to say that they now have 5G. And it will happen. One of them will pull the trigger as marketing hype and the rest will follow. Ironically, by the time we finally get real 4G technology we will probably be selling phones labeled as 10G.

So while the marketers make a lot of hype out of little changes in the network, the really huge change is the possibility of centralizing the networks into hubs. Once that is done, a company could upgrade a few hubs and introduce a new technology improvement overnight. But that doesn't sound sexy and is hard to market, so it will just quietly get implemented in the background.

Categories: The Industry

The Rich Get Richer . . .

Just a few days ago I wrote about the new digital divide: the fact that larger and more prosperous places have, or are getting, faster broadband while smaller and poorer places are being left behind.

And on the heels of that blog, Google just announced that it has invited 34 new cities to talks about expanding its gigabit network. And of course, these are all big and/or prosperous, growing places, including Phoenix, Scottsdale and Tempe in Arizona; Atlanta and surrounding suburbs in Georgia; San Antonio in Texas; Raleigh-Durham, Charlotte and surrounding suburbs in North Carolina; Nashville in Tennessee; San Jose and other growing areas in northern California; Salt Lake City; and Portland.

This is great news for those communities. There is certainly no assurance that any of them is going to get fiber, and Google will be looking for the places willing to give the biggest handouts. But one would think that a decent number of the cities on that list will be able to give Google what it wants in order to get a fiber network.

But not on this list, as you would expect, are the smaller towns and counties or the inner cities in the east that were ignored by Verizon FiOS. For the most part the Google list represents communities that are relatively economically healthy. The cities on the list are the ones that are growing, while much of the rest of the country, like the northeast and the smaller towns, is shrinking.

In this same week the FCC said that they are going to look at eliminating the state barriers that stop municipalities from building fiber networks. There are over twenty states that have either a total ban or severe restrictions on government entities getting into the fiber business.

Let's face facts. If you are not one of those places that is thriving, like the places on Google's new list, then chances are that nobody is even thinking about building fiber in your neighborhood. You might live close to an independent telephone company or cooperative that is thinking about it. But most of rural America is not on anybody's radar.

I always tell rural communities to consider two steps. First, look around just to make sure that there is no company nearby that can be enticed to bring you fiber, because sometimes, with the right incentives, there is somebody. But generally there is nobody willing to make such an investment, so the second part of the advice is this: if you want fiber, you are going to have to step up and build it yourself.

You may need to gather surrounding communities together to get a pool of households large enough to justify a fiber business plan. But your community needs to take the initiative to get fiber, or you are going to be left far behind.

Some of the communities Google is targeting were edging towards the wrong side of the new digital divide. I just read this morning that a large portion of Salt Lake City, as an example, still has 3 Mbps DSL for broadband. But they are large enough and thriving enough to have gotten Google’s attention, and good for them. But if you are a rural county seat or a farming community you are not going to get on Google’s or anybody else’s list.

Categories: Improving Your Business, What Customers Want

The Informed Customer

The US consumer is far more informed today than at any time in the past, and this goes double for businesses. When somebody goes to sell something as complex as the triple-play mix of products, one has to assume that most customers will have already done some research on the web to find out what they can about the company and its products.

In the past, one of the first steps in the consultative process of selling telecom services was to inform the customer about all of the options. And so salespeople were equipped with slide shows and handouts that would explain what their products could do. But from what I hear, it is getting to be a rare event when a business customer doesn't already have a pretty good idea of what they want to buy. They will have done their web research, read reviews, and talked to peers before they will accept a sales visit from you. They are already fairly well-informed before the first sales knock on their door.

This has changed the sales process, because often it's now just a matter of talking price and logistics. Where it used to take multiple visits to sell a business customer, many of them can now be sold in one or two visits. This leads me to talk a bit about how telecom companies portray their products on the web. I have browsed through hundreds of telco and cable websites looking at how they present products and prices, and more often than not I am surprised by what I find.

A lot of companies spend time on their website talking about who they are, but very little time talking about what they sell. In fact, a significant number of telecom companies don't even list their products on the web. Even among those that do, very few list their pricing. This is an interesting trend, because back when the web was new, companies routinely had their product lists on the web. But over the years the information about products and pricing has shrunk rather than grown.

And I think this is a mistake. It is not responsive to the way that customers want to shop and buy today. Individuals and companies are used to completing sales transactions on the web without talking to anybody. They are now used to doing their own research and deciding on their own which products are right for them.

I hear a number of different reasons why companies don’t have full disclosure on their products and prices. Here are some of the most common ones:

I don't want my competitors to know what I am doing. Really? Can you possibly think that any competitor of yours does not already have a big pile of your bills gathered from your customers? Do you really want to make it harder for customers to do business with you because you are afraid of your competitors? Customers will welcome your openness, candor and ease of use if you make it easier for them to shop with you.

I don't want to make it easy for customers to disconnect. This is one of the dumbest excuses I have ever heard. Do you really think a customer is going to keep a service just because they have to call you to drop it? Just the opposite is true; the customer who goes to the web to easily drop a service they don't use today is just as likely to come back and add a different feature six months from now. Your customers will love that you have made it easy to do business with you.

I sell in packages and I don't want to quote prices on the web. Then say so. Describe your process for how you sell to a customer so that they know exactly what to expect from you. A customer is far more likely to skip calling you if you have no product information on the web than if you tell them what to expect.

Categories: Current News, The Industry

The New Digital Divide

There was a time, not very many years ago, when the digital divide meant the difference between pockets of people who had dial-up and places that had something faster. But that is no longer a good definition, and I think the digital divide is growing very quickly and is a huge issue again. The new digital divide is between the cities and suburbs that have relatively fast broadband and the rural areas and urban pockets that have been left a few generations of technology behind. When I say rural areas below, we can't forget that there are many parts of inner cities in the same condition, places that have become broadband deserts.

Today, most of rural America is several generations of technology behind the cities and there is no real expectation that this gap will ever close. A large portion of rural America is served by DOCSIS 2.0 cable modems and first-generation DSL. These technologies deliver anything from 1 Mbps up to maybe 5 Mbps to the average home and business in these communities. The incumbent carriers claim these areas are served by broadband, and they are always careful to claim that these communities have advertised speeds at or above the paltry 4 Mbps used by the FCC to define broadband.

But every community in this situation has now fallen on the wrong side of the new digital divide. The large telcos and cable companies are making big investments in the metropolitan areas. There are numerous affluent parts of the country that have broadband between 50 Mbps and 100 Mbps download for people willing to pay a premium price. And in these markets even the slower cable modem products are already between 20 and 30 Mbps.

And I am not talking only about places where Verizon has built FiOS. The larger cable companies have upgraded to DOCSIS 3.0 in many large markets and now offer fast speeds. AT&T has launched U-verse using bonded-pair DSL in many of these same markets, with speeds of around 40 Mbps. And we are on the verge of AT&T and other copper providers having G.fast, which is going to increase speeds on copper to as much as several hundred Mbps. Even the cellular carriers have stepped up their game in the cities, and the latest version of 3.5G delivers speeds of 40 Mbps to 50 Mbps in short bursts.

But these new technology upgrades are not being brought to rural America and are unlikely ever to be. The incumbent cable companies and telcos installed the current technology there over a decade ago and have not upgraded it since. Meanwhile there have been several rounds of upgrades in the areas with good broadband.

The incumbents are not willing to make the needed upgrade investments in small markets. They aren't going to get the same kind of returns they can make for the same investment in a big suburb. They have largely ignored the small markets for years, and the wires there are in bad shape compared to bigger markets. So I think we are now on the verge of a permanent new digital divide, defined by areas that keep getting new technology upgrades and areas that are stuck in the past. And the gulf between these two areas is only going to grow.

There are real-life repercussions of this gap. Homes on the wrong side of the digital divide can't use broadband the way people in a city can. But much more importantly, businesses can't get the same bandwidth that their competitors in the city have. In the long run this is going to squelch innovation in rural areas. Areas on the wrong side of the digital divide are going to have a really hard time creating the kinds of jobs that will let their kids stay in the area. The biggest fear in rural communities is that they will become economically irrelevant: that they won't be able to create or keep jobs, that their kids will move away, and that over a few decades the communities will die.

Categories: The Industry

Cellular is Not the Rural Broadband Solution

I'm often asked why we can't let cellular 4G bandwidth take care of the bandwidth needs of rural America. Looking at the TV ads from Verizon and AT&T, you would assume that the cellular data network is robust and being built everywhere. But there are a lot of practical reasons why cellular data is not the answer for rural broadband:

Rural areas are not being upgraded. The carriers don’t make the same kinds of investments in rural markets that they do in urban markets. To see a similar situation in a related industry, consider how the large cable companies are upgrading cable modems in the metropolitan areas years before they upgrade rural areas. It seems that urban cellular technology is being upgraded every few years while rural cell sites might get upgraded once a decade.

Rural networks are not built where people live. Even where the cellular networks have been upgraded, rural cellular towers have historically been built to take care of car traffic, referred to in the industry as roaming traffic. Think about where you always see cellular towers: either on top of tall hills or along a highway, not close to many homes and businesses. This matters because, as with all wireless traffic, data speeds drop drastically with distance from the tower. Where a 3G customer in a city might get a 30 Mbps download speed because they are likely less than a mile from a transmitter, a customer who is 4 miles from a tower might get 5 Mbps. And in a rural area, 4 miles is not very far.

The carriers have severe data plans and caps. Even when customers happen to live close to a rural transmitter and can get good data speeds, the data plans of the large carriers are capped at very skimpy levels. One HD movie uses around 1.5 gigabytes, meaning that a cap of 2 to 4 gigabytes is a poor substitute for landline broadband. There are still a few unlimited data plans around, but they are hard to get and dwindling in availability. And it's been widely reported that once a customer reaches a certain level of usage on an unlimited plan, their speeds are throttled to a crawl for the rest of the month.

Voice gets a big priority on the network. Cellular networks were built to deliver voice calls to cell phones, and voice calls still get priority on the network. A cell tower has a finite amount of bandwidth, and so once a few customers are downloading something big at the same time, performance for the rest of the cell site gets noticeably worse. 3G networks are intended to deliver short bursts of fast data, such as when a cell phone user downloads an app. But there is not enough bandwidth at a cell tower to support hundreds of 'normal' data customers watching streaming video and using bandwidth the way we use it in our homes and businesses.

The plans are really expensive. Cellular data plans are not cheap. For example, Verizon will sell you a data plan for an iPad at $30 per month with a 4-gigabyte usage cap; additional gigabytes cost $10 to $15 each (see the sketch after this list). The same plan for an iPhone is $70 per month, since the plan requires voice and text messaging. Cellular data is the most expensive bandwidth in a country that already has some of the most expensive bandwidth in the world.

There are no real 4G deployments yet. While the carriers are all touting 4G wireless, what they are delivering is 3G wireless. By definition, the 4G specification allows for gigabit data download speeds. What we have now can best be described in engineering terms as 3.5G, and real 4G is still sometime in the future. There are reports of current cellular networks in cities getting bursts of speed up to 50 Mbps, which is very good but not close to being 4G. And most realized speeds are considerably slower than that.
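To see what those capped plans would cost if you actually tried to replace a landline connection with one, here is a rough sketch using the Verizon iPad prices quoted above; the household usage figure is a hypothetical example:

```python
# What would a landline-sized data habit cost on a capped cellular plan?
# Plan prices are those quoted above; the household usage is hypothetical.
base_price = 30.0        # $/month for the iPad plan
included_gb = 4          # cap included in the base price
overage_per_gb = 10.0    # low end of the $10-$15 overage range

household_usage_gb = 40  # hypothetical household: ~25 HD movies plus browsing

overage_gb = max(0, household_usage_gb - included_gb)
monthly_cost = base_price + overage_gb * overage_per_gb
print(f"Monthly cost: ${monthly_cost:,.0f}")  # $390/month at the cheap overage rate
```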

Categories: The Industry

Killing Municipal Broadband in Kansas

There is a bill in committee in the Kansas Senate that would basically prohibit any municipality from building a broadband network that would bring retail broadband, voice or cable TV to any customer. Kansas SB 304 is attached. If enacted, this would add Kansas to the list of many other states that prohibit any form of municipal competition.

I have to declare some bias on this topic, because I work for a number of municipalities that have built or are thinking of building fiber networks. But I also work for a lot of commercial firms that build broadband networks, and my real bias is against having large parts of our country without adequate broadband. It is my opinion that every part of the country ought to have broadband, and I think whoever is willing to step up and make that investment ought to be allowed to do so.

I can tell you from my experience in working with municipalities that decide to get into the broadband business that they feel like they have no other choice. Many rural parts of America are on the wrong side of the digital divide and it’s getting worse all of the time. The large cities are finally getting good broadband and in most metropolitan areas customers can buy broadband speeds today of 50 – 100 Mbps download.

There are still a lot of people on farms who can only get dial-up or satellite Internet, neither of which is broadband at all. But that is not what defines the digital divide any more. The real digital divide can be found in the thousands of towns and counties where broadband speeds are 3 to 10 Mbps. Those speeds, which were probably okay five years ago, are no longer adequate. Any city with 5 Mbps download is already on the losing end of the digital divide. With such Internet speeds, communities are unable to attract or keep businesses or people.

Small cities are scared to death of becoming places where nobody wants to live. Every community hopes for a future where their kids can find jobs somewhere nearby and stay a part of the community. Places on the wrong side of the digital divide can already see all of their kids moving off to find jobs elsewhere, and it's getting worse all the time.

A household with only 5 Mbps download is blocked from using the Internet the way people in a metropolitan area do. They can't really do two things at once on such a connection, which means one member of the family can't take an online college course while another is browsing the Internet or watching a streaming TV show.
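The arithmetic behind that is simple. A rough sketch, using typical per-application bandwidth estimates rather than measurements:

```python
# Why 5 Mbps can't support a family doing two things at once.
# The per-application figures are typical estimates, not measurements.
connection_mbps = 5.0
demands_mbps = {
    "HD streaming video": 4.0,
    "online course (video lecture)": 2.5,
    "general web browsing": 1.0,
}

total = sum(demands_mbps.values())
print(f"Concurrent demand: {total:.1f} Mbps vs. a {connection_mbps:.0f} Mbps pipe")
# 7.5 Mbps of demand on a 5 Mbps connection: somebody's video stalls.
```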

And businesses with a 5 Mbps connection are hamstrung. You certainly can't do much if you share such a small pipe among a lot of computers. While this kind of speed might let a tiny retail business squeak by, companies with multiple employees can't function with inadequate broadband.

I can tell you that small cities mostly look at offering broadband out of well-founded fear. They always try to get the incumbent provider to offer better broadband before they even think about building themselves. But the ugly reality is that rural markets served by the large national incumbents get the worst service and have the oldest and worst networks in the country. While the large cable companies and telcos have stepped up their game in metropolitan areas, they have ignored investing in rural areas for decades.

So laws like the Kansas one are nothing more than the large telcos and cable companies kicking sand in the face of small-town America. They have already shown these towns that they are not willing to invest there, but they still want to milk them for revenues, and they don't want anybody else to help these areas either.
