Finally, Cable TV in the Cloud

There is finally a cable TV solution for small cable providers that does not require them to own and operate their own headend. In fact, this new solution doesn’t even require them to be a regulated cable company.

The solution is provided by Skitter TV, which was started by Skitter Inc., based in Atlanta, Georgia and headed by Robert Saunders, one of the pioneers of IPTV. The company has developed proprietary cable hardware and software that costs far less than a traditional cable headend.

Skitter TV has been operating for a few years as a sort of cooperative, owned by Skitter Inc. and a number of independent telephone companies. The company’s model for the last few years has been to approach a small carrier that offers cable TV and replace the incumbent product with Skitter TV. Most small cable operators are losing money on their cable product. Skitter TV becomes the cable provider of record and then shares profits with the local provider, guaranteeing the provider a small profit on cable.

But Skitter TV just upped its game and has partnered with Iowa Network Services (INS) to bring Skitter TV to more carriers at a lower cost. INS is a consortium of independent telephone companies in Iowa that owns a substantial middle-mile fiber network and provides a number of services to its members.

The latest move takes advantage of the INS fiber network and includes plans to interconnect with other telco-owned fiber networks throughout the country. This will allow companies with access to those networks to get their cable signal from the INS headend. The same economic model still holds: Skitter offers a revenue share with local providers, who get to shut down their money-losing cable business.

There are a few key issues for a small provider to consider in looking at this opportunity. The primary one is the cost of the transport needed to connect to Skitter and INS. Companies that can reach one of the other statewide networks can likely get this transport at a reasonable cost, but providers outside of those networks need to factor transport costs into the analysis.

I’ve looked closely at Skitter TV and it’s a very interesting product offering. They don’t carry as many standard network channels as the large urban cable systems, which helps hold down the cost of providing the service. But Skitter has made up for the smaller line-up by adding a large number of non-traditional channels, and they have also created channels for many of the popular online services. Overall the Skitter line-up is probably an improvement in rural markets and might even be an interesting alternative in urban markets.

One interesting option that Skitter brings is the possibility of offering a cord-cutter package that includes local network channels plus a wide array of non-traditional programming. The Skitter cord-cutter package looks to be one of the more robust non-traditional offerings on the market.

Customers can connect to Skitter TV using a Roku box, which is cheaper than traditional settop boxes. But Skitter also can support most traditional IPTV settop boxes that providers already have deployed.

Any small cable provider who is losing money on cable TV ought to take a look at this alternative. Even if transport costs look to be a barrier, Skitter TV is often willing to bring their own headend into a market if the numbers look attractive to them.

I think that Skitter TV will do well in the telco and IPTV cable markets because it’s become nearly impossible for a small provider to be profitable on the cable product. It’s a lot more sensible for a provider to partner with Skitter and get a guaranteed small positive margin from cable customers than to continue to bleed cash on the business line. Other than having to provide settop boxes, the Skitter partnership gets companies out of the headend, hardware, and middleware business, taking a lot of pressure off capital budgets.

Farmers and Big Data

Probably the biggest change coming soon to crop farming is precision agriculture. This applies GPS and sensors to monitor field conditions like water, soil, nutrients, and weeds in order to optimize the application of water, pesticides, and fertilizer and maximize crop yields in different parts of the farm. Anybody who has ever farmed knows that fields are not uniform in nature and that the factors that produce the best crops differ even within one field.

Precision agriculture is needed if we are to feed the growing world population, which is expected to reach almost 10 billion by 2050. As a planet we will need to get the best possible yield out of each field and farm. This might all have to happen against a backdrop of climate change, which is playing havoc with local weather conditions.

A number of farmers have started gathering the raw data needed to understand their own farms and conditions. Farmers know the best and worst sections of their fields, but they do not understand the subtle differences across all of their acreage. In the past farmers haven’t known the specific yield differences between the various micro-areas within their farms, but they are now able to gather the facts needed to know their land better. It’s a classic big data application: sifting through and making sense of large numbers of monitor readings in order to recommend specific treatments for different parts of a field, as sketched below.
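To make that concrete, here is a minimal sketch of the kind of per-zone aggregation involved. The zone names, sensor readings, and treatment thresholds are hypothetical placeholders rather than real agronomic values, and a production system would use far more data and far better models.

```python
from statistics import mean

# Simulated sensor readings collected from different zones of one field
readings = {
    "north_slope": {"soil_moisture_pct": [18, 20, 17], "nitrogen_ppm": [12, 11, 13]},
    "creek_bottom": {"soil_moisture_pct": [34, 36, 33], "nitrogen_ppm": [22, 24, 23]},
}

MOISTURE_TARGET = 25   # hypothetical target, % volumetric water content
NITROGEN_TARGET = 20   # hypothetical target, parts per million

for zone, sensors in readings.items():
    moisture = mean(sensors["soil_moisture_pct"])
    nitrogen = mean(sensors["nitrogen_ppm"])
    actions = []
    if moisture < MOISTURE_TARGET:
        actions.append("irrigate")
    if nitrogen < NITROGEN_TARGET:
        actions.append("apply fertilizer")
    print(f"{zone}: moisture={moisture:.1f}%, nitrogen={nitrogen:.1f} ppm -> "
          f"{', '.join(actions) if actions else 'no treatment needed'}")
```

The point is simply that each zone gets its own recommendation instead of the whole field being treated uniformly.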

In order to maximize precision agriculture new automated farm machinery will be needed to selectively treat different parts of the fields. The large farm equipment manufacturers expect that farming will be the first major application for drones of all types. They are developing both wheeled vehicles and aerial drone systems that can water or treat sections of the fields as needed.

This is a major challenge because farming has historically been a somewhat low technology business. While farms have employed expensive equipment, the thinking part of the business was always the responsibility of each farmer, and the farmers with the best knowledge and experience would typically out-produce their neighbors. But monitoring can level the playing field and dramatically increase yields for everybody.

There are several hurdles in implementing precision agriculture. First is access to the capital needed to buy the monitors and the equipment used to selectively treat fields. This need for capital is clearly going to favor large farms over small ones and will be yet one more factor leading to the consolidation of small farms into larger enterprises.

But the second need is broadband. Gathering all of the needed data, analyzing it, and turning it into workable solutions presupposes the ability to get data from the fields and send it to a supercomputer somewhere for analysis. And that process needs broadband. A farmer who is still stuck with dial-up or satellite access is not going to have the bandwidth needed to gather and crunch the large amount of data required to find the best solutions.

This doesn’t necessitate fiber to the fields, because a lot of the data gathering can be done wirelessly. But it does require that farms be fed with high-speed Internet access and good wireless coverage, something that does require rural fiber. I published a blog a few weeks ago that outlined the availability of broadband on farms, and it is not currently a pretty picture. Far too many farms are still stuck with dial-up, satellite, or very slow rural DSL.

Some farmers are lucky to live in areas where communications co-ops and rural telcos are bringing them good broadband, but most are in areas where there is neither broadband nor anybody currently planning on expanding broadband. At some point the need for farming broadband will percolate up as a national priority. Meanwhile, in every rural place I visit, the farmers are at the forefront of those asking for better broadband.


Our Lagging 4G Networks

I have to scratch my head when I read about people who rave about the 4G data speeds they get. First, I travel all over the country and I have yet to see a 4G data speed above 20 Mbps. And yet I’ve seen claims in various online forums for speeds as high as 60 Mbps. I’ve been in a number of major cities in the last six months and have not once seen speeds that I would consider fast.

Second, a report just came out from OpenSignal, a company that provides an app that maps cellular coverage. They recently collected data from 325,000 users around the world and used it to compare the 4G networks in 140 different countries. Their data showed that the US has the 14th slowest 4G of all these countries, at a paltry average speed of 10 Mbps.

Hungary, Denmark, South Korea, Romania, Singapore, and New Zealand have the fastest 4G, all with average speeds of above 25 Mbps, with New Zealand seeing an average speed of 36 Mbps download.

I often run speed tests, but the real way I test 4G speeds is by trying to open web pages that I use at home. I know it’s generally far more efficient to use an app rather than the mobile web, but I open web pages just to see how fast the connection really is. It’s well known that speed test results can be rigged by a carrier that knows you are using a speed test site. What I generally find is that web pages that leap onto my screen at home seem to take forever to load on my cellphone, and sometimes they never load.
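For anybody who wants to do the same thing a little more systematically, here is a rough sketch of timing page fetches in Python. The URLs are placeholders, and it only measures the base HTML document rather than a fully rendered page (so for small pages it mostly reflects latency), but it gives a crude relative check that a carrier cannot easily game.

```python
import time
import urllib.request

# Placeholder pages; substitute whatever sites you actually use at home
PAGES = ["https://www.example.com", "https://en.wikipedia.org/wiki/Main_Page"]

for url in PAGES:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        body = resp.read()                               # fetch the full page body
    elapsed = time.perf_counter() - start
    mbps = (len(body) * 8) / (elapsed * 1_000_000)       # rough effective throughput
    print(f"{url}: {len(body)/1024:.0f} KB in {elapsed:.2f}s (~{mbps:.1f} Mbps)")
```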

Why does this matter? I think it matters because there are tons of industry pundits who opine that our broadband future is wireless and that we don’t need to be investing in fiber. They say that wireless is going to get so fast that nobody will feel the need for a landline-based Internet connection. For a whole long list of reasons I think that argument is total bosh. Consider the following:

  • Cellular data speeds drop quickly with distance from the cell tower. Today’s cell towers were built and situated to handle voice traffic, not data coverage, and a tower can complete a voice call at a much greater distance than it can make a strong data connection.
  • We could always build more towers to bring transmitters closer to people. But for those new towers to work they are going to have to be fiber fed, something that very few companies are willing to invest in.
  • Cell phone signals don’t penetrate structures very well. I recently visited my dentist. In the parking lot I was easily able to read news articles on Flipboard. I then walked into the waiting room, which has big windows to the outside world, but the ability to read articles slowed down a lot. Then when I was taken back to an interior room that was only one room further from the outside, I couldn’t even get the app to open. This is not an unusual experience and I see it often.
  • Cell sites can only handle a limited number of customers, and performance degrades when demand exceeds the optimum load. And the more bandwidth that is delivered to each customer, the more easily a cell site reaches capacity.
  • The various swaths of spectrum used for cellular data each have their own unique limitations. In many cases the spectrum is carved into fairly small channels (which was done before anybody conceived of using the spectrum for data), and it’s very hard to cobble together a large wireless data path. It generally means linking several frequencies into a single customer data path, which is both complicated and somewhat taxing on a cellphone.
  • Data caps, data caps, data caps. Let’s face it, as long as the cellphone companies want to charge $10 per downloaded gigabyte they cannot be a serious contender for anybody’s real-life data usage. I estimate that my household downloads at least 500 gigabytes per month at home, and I don’t think we are unusual. As the back-of-the-envelope math below shows, at cellphone data rates that would cost me an astounding $5,000 per month, and even if they cut their rates by 90% it would still cost an eye-popping $500 per month. As long as cellphone data rates are 100 times higher than landline rates, cellular data is something you use to casually browse the news, not a real Internet connection.
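Here is that arithmetic spelled out; the 500 GB household figure is my own estimate and the $10-per-gigabyte price is a rough stand-in for typical cellular overage rates.

```python
MONTHLY_USAGE_GB = 500      # estimated household downloads per month
CELL_PRICE_PER_GB = 10.00   # roughly $10 per gigabyte of cellular data

full_price = MONTHLY_USAGE_GB * CELL_PRICE_PER_GB
discounted = full_price * 0.10   # even after a hypothetical 90% price cut

print(f"At cellular rates: ${full_price:,.0f}/month")    # $5,000/month
print(f"After a 90% cut:   ${discounted:,.0f}/month")    # $500/month
```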

What’s Up with 4K Video?

It seems like I can’t read tech news lately without seeing an article mentioning something new going on with 4K video. So I thought I would talk a bit about what 4K is and how it differs from other current types of video.

4K is the marketing term to cover what is officially named Ultra High Definition (UHD) video. UHD video is distinguished from current high definition video by having a higher picture resolution (more pixels) as well as more realistic colors and higher frame rates (meaning more pictures per second).

Let’s start with some definitions. 4K video is defined by the Consumer Electronics Association as a video stream with at least 3,840 x 2,160 pixels. This contrasts with existing high definition (HD) video at 1,920 x 1,080 pixels and standard definition (SD) video at 720 x 480 pixels. These are not precise standards; for example, some SD video is broadcast at 540 lines. There is also an existing standard for some video cameras that record at 4,096 x 2,160 pixels, which is also considered 4K.
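The jump in raw pixel counts is easy to work out from those definitions; a quick sketch:

```python
# Pixel counts for the formats defined above, compared against HD
formats = {
    "SD (720 x 480)":       (720, 480),
    "HD (1920 x 1080)":     (1920, 1080),
    "4K/UHD (3840 x 2160)": (3840, 2160),
}

hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / hd_pixels:.1f}x HD)")
# 4K works out to roughly 8.3 million pixels, four times the pixel count of HD
```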

The 4K standard was developed in an attempt to deliver digital media to movie theaters, which saves a lot of money compared to shipping around reels of film. Standard HD does not project well onto big screens and 4K overcomes a lot of those shortfalls. But high-action movies require more definition than 4K provides and will need the upcoming 8K video standard before they can be digitally transmitted for use on the largest screens.

Interestingly, there is not a huge increase in quality from shifting home viewing from HD to 4K. There is a huge improvement in quality between SD and HD, but the incremental improvements between HD and 4K are much harder to discern. The improvements are due more to the number of different colors being displayed, because the human eye cannot really see the pixel differences on relatively small computer monitors or home TV screens. It’s easy to get fooled about the quality of 4K by some of the spectacular demo videos of the technology on the web. But these demos are far different from what run-of-the-mill 4K will look like, and if you think back there were equally impressive demos of HD video years ago.

The major difference between HD and 4K for the broadband industry is the size of the data stream needed to transmit all of the pixel data. Current 4K transmissions online require a data path between 18 Mbps and 22 Mbps. That is just below the FCC’s 25 Mbps download definition of broadband, and according to the FCC’s numbers only around 20% of homes currently have enough broadband speed to watch 4K video. Google recently announced coding schemes that might reduce the size of a 4K transmission by 40% to 50%, but even with that reduction 4K video is going to put a lot of strain on ISPs and broadband networks, particularly if homes want to watch more than one 4K stream at a time.
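A rough sketch of that math, using the 18–22 Mbps range above and the FCC’s 25 Mbps download threshold as the connection size; the 40% and 50% reductions are simply the figures Google cited, not measured results.

```python
# How many simultaneous 4K streams fit on a 25 Mbps connection,
# before and after a hypothetical 40-50% codec improvement
STREAM_MBPS = (18, 22)      # current 4K streaming range
CONNECTION_MBPS = 25        # FCC download definition of broadband

for reduction in (0.0, 0.4, 0.5):
    low = STREAM_MBPS[0] * (1 - reduction)
    high = STREAM_MBPS[1] * (1 - reduction)
    streams = int(CONNECTION_MBPS // high)   # conservative: use the high end
    print(f"{int(reduction*100)}% reduction: {low:.0f}-{high:.0f} Mbps per stream, "
          f"~{streams} simultaneous stream(s) on a 25 Mbps connection")
```

Even with the better coding, a second simultaneous stream is barely possible on a connection that just meets the FCC definition.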

I recently read that 15% of the TVs sold in 2015 were capable of 4K, and that percentage is growing rapidly. Lagging behind, however, are 4K-capable settop boxes; anybody who wants to get 4K from their cable provider will need a new box. Most of the large cable providers now offer these boxes, but often at the cost of another monthly fee.

Interestingly, there is a lot of 4K video content on the web, much of it filmed by amateurs and available on sites like YouTube or Vimeo. But there is a quickly growing array of for-pay content. For instance, most of the Netflix original content is available in 4K. Amazon Prime also has Breaking Bad and other original content in 4K. It’s been reported that the next soccer World Cup will be filmed in 4K. There are a number of movies now being shot in 4K, as well as a library of existing IMAX films that fit well into this format. Samsung has even lined up a few movies and series in 4K that are only available to people with Samsung 4K TVs.

One thing is for sure: 4K looks like it is here to stay. More and more content is being recorded in the format, and one has to imagine that over the next few years 4K is going to become as common as HD video is today. And along with the growth of 4K will come demand for more bandwidth.

Comcast Trying Data Caps Again

Yet again Comcast is trying to introduce data caps. They have introduced what they are calling ‘data usage trials’ in Miami and the Florida Keys. For some reason most of their past trials have also been in the southeast. The new plan gives customers a monthly data cap of 300 gigabytes of downloaded data. After you hit that cap, every additional 50 gigabytes costs $10. For $30 extra you can get unlimited downloads.

When Comcast tried caps a few years ago they used a monthly cap of 250 gigabytes. Since the average household has been doubling the amount of data it uses about every three years, the new cap is actually stingier than the old 250 GB cap; households have nearly doubled their usage since the last time Comcast tried this. That means the 300 GB cap is going to affect a lot more people than the old cap did.
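A quick projection shows why. The three-year gap between the trials is my assumption ("a few years ago"); the doubling rate is the historical trend mentioned above.

```python
# Project today's usage for a household that just hit the old 250 GB cap
YEARS_SINCE_OLD_TRIAL = 3      # assumption for "a few years ago"
DOUBLING_PERIOD_YEARS = 3      # usage doubles roughly every three years

old_usage_gb = 250
growth = 2 ** (YEARS_SINCE_OLD_TRIAL / DOUBLING_PERIOD_YEARS)
today_usage_gb = old_usage_gb * growth

print(f"Same household today: ~{today_usage_gb:.0f} GB/month")   # ~500 GB
print(f"Overage vs. the 300 GB cap: ~{today_usage_gb - 300:.0f} GB, "
      f"or ${((today_usage_gb - 300) / 50) * 10:.0f} in $10-per-50-GB charges")
```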

What is probably most annoying is that this time Comcast is refusing to call these data caps. Instead they are calling this a ‘data usage trial’ and are trying hard to compare themselves to the plans sold by the cellphone companies. Of course, everybody in the world understands those cellular plans to be data caps.

It’s not hard to understand why Comcast wants to do this. While broadband subscriptions continue to grow, with the overall US market at 83% broadband penetration there is not a lot of future upside in broadband sales. Further, I know that Comcast is eyeing the financial performance of the cellphone companies with envy, since they can see the significant revenues that AT&T and Verizon generate with their data caps.

But Comcast also must understand that customers are absolutely going to hate these caps. Households are watching online video more and more and it is that usage that is driving the vast majority of downloads. There are other households that have big usage due to gaming, and some households that still engage in file-sharing, even though that is often illegal and riskier than it used to be.

The last time Comcast did this they saw a massive customer revolt, and I certainly expect that to happen again. Take my case. I estimate that we probably use at least 500 GB per month, so for me this basically means a $30 increase in my data rate. They have already pushed me to the edge of tolerance by forcing me to buy a basic TV package that I don’t use in order to get a 50 Mbps cable modem. This cap would push me over $100 per month just to get a broadband connection. At that point I start taking a very serious look at CenturyLink, the other provider in my neighborhood.

The biggest problem with any data cap is that, no matter where the cap is set, over time more and more customers are going to climb over it. We are just now starting to see the first proliferation of 4K video, and at download requirements of 18–22 Mbps a 4K stream will blow past the data cap in no time.
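The arithmetic is stark; here is a rough calculation using the 18–22 Mbps streaming range.

```python
# How quickly 4K streaming eats a 300 GB cap
CAP_GB = 300

for mbps in (18, 22):
    gb_per_hour = mbps / 8 * 3600 / 1000     # Mbps -> GB per hour of viewing
    hours = CAP_GB / gb_per_hour
    print(f"At {mbps} Mbps: ~{gb_per_hour:.1f} GB/hour, cap reached in ~{hours:.0f} hours")
```

That works out to roughly 30 to 37 hours of 4K viewing per month, or barely more than an hour per day for a single TV.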

What is most ridiculous about data caps either for cellular or landline data is that the US already has the most expensive Internet access of all of the developed countries. ISPs are already reaming us with ridiculously expensive broadband access and are now scheming for ways to make us pay more. The margins on US broadband are astronomical, in the 90% plus profit margin range. So data caps at a company like Comcast are purely greed driven, nothing else. There are zero network or performance issues that could justify penalizing customers who actually use the data they are paying for.

I am not entirely against data caps. For example, I have one client that has a 1 terabyte cap on their basic data product and 2 terabytes on their fastest product. They don’t use these caps to jack up customer prices, but instead use them as an opportunity to discuss usage with customers. For instance, they might convince somebody who is constantly over the 1 terabyte cap to upgrade to a product with a higher cap. But mostly they use these caps as a way to force themselves to monitor customers. Their monitoring found a few customers who went over the cap because they were operating some kind of commercial retail server out of their homes. Their terms of service prohibit operating a business service over a residential product, and they upgraded such customers to a business product, which has no data cap.

If you want to get really annoyed, look at this Comcast blog which explains the new ‘data usage trials.’ It is frankly one of the worst cases of corporate doublespeak that I have read in a long time. You have to feel a bit sorry for the corporate communications people who had to write this drivel, but the ones to hate are their corporate bosses who are determined to make us all pay more for using data.


US and Europe at Odds over Privacy

A few weeks ago I wrote about the various battles currently raging that are going to determine the nature of the future Internet. None of these battles is larger than the one between spying and surveillance on one side, and the citizens and countries that want to protect people from being spied upon on the other.

Recently we’ve seen this battle manifest in several ways. First, countries like Russia and Thailand are headed down a path to create their own firewalled Internet. Like the Chinese Great Firewall, these networks aim to retain control of all data originating within the country.

But even where the solution is not this dramatic we see the same battle. For instance, Facebook is currently embroiled in this fight in Europe. Facebook might have been singled out in this fight because they already have a bad reputation with European regulators. That reputation is probably deserved since Facebook makes most of their money from their use of customer data.

But this fight is different. The Advocate-General of the European Court of Justice (the EU’s equivalent of the Supreme Court) just issued an opinion against Facebook that could affect every US Internet company doing business in Europe. The opinion has to do with the ‘safe harbor’ arrangement that has been used as the basis for transferring European customer data back to US servers. The safe harbor rules come from trade rules negotiated between the US and the European Union in 2000. These rules explicitly allow what Facebook (and almost everybody else) is doing with customer data.

The Advocate-General concluded that the EU was wrong to negotiate the safe harbor rules. He says that they contradict some of the fundamental laws of the EU, including the Charter of Fundamental Rights, the EU’s rough equivalent of our Constitution. He says the safe harbor rules violate the basic right of citizens to privacy. He explicitly ties this to NSA spying: by letting Facebook and others take European data out of the EU, the rules make that data available to the NSA.

This opinion is still not cast in concrete, since the Court of Justice still has to accept or reject the Advocate-General’s recommendations. However, the court accepts those recommendations most of the time. If the opinion is upheld it is going to create a huge dilemma for the US. Either the NSA will have to back off from looking at data held by US companies, or US companies won’t be able to bring that data out of Europe.

For companies like Facebook this could be fatal. There are some commercial web services that could be hosted in Europe to serve Europeans. But social media like Facebook operate by sharing their data with everybody. It would be extremely odd if an American on Facebook couldn’t friend somebody from Europe, or couldn’t post pictures of a vacation while still traveling in Europe. And this could put a real hitch in the European business of American companies like Google and Amazon.

Such a final ruling would send US and EU negotiators back to the table, but in new negotiations the old safe harbor rules would no longer be an option. This could bring about a fundamental change in the worldwide web. And it comes at a time when Facebook, of all companies, is talking about bringing the rest of the human race onto the web. But perhaps, as a consequence of surveillance by the NSA and others, each country or region will end up with a local web, and the worldwide web will be a thing of the past.

Lessons from my Father

Today I’m taking a little break from telecom. Earlier this week we buried my father. This has put me into a contemplative frame of mind, as I am sure happens to everybody who goes through this.

I have been thinking a lot about the things my father taught me. My father was never a great communicator; instead he taught me mostly by example. Probably the biggest lessons he taught me were the value of hard work and of showing up and being there. I don’t think my father ever missed a day of work until he was into his 50s and had a leg injury. He got up and went to work, come rain or shine or illness.

And I learned that lesson well from him. I work from home, a setting that might make it easy for many people to find excuses not to work. But I get up and start work early every day and stay busy until quitting time. Over the years it is probably this discipline that has been one of the major factors in my success. If you put in the work and the effort good things happen.

He taught me other lessons in life. My father was big on pithy sayings. I think the one that I remember the most was, “it doesn’t cost anything to be polite.” It turns out the rest of the family doesn’t remember this one, so perhaps I was the youngster in the most need of this advice. But throughout life I have been polite. I say “yes, sir” and “yes, ma’am” almost universally, and I think I often surprise young people when I say this to them.

One of his lessons that I try to practice daily is to be pleasant to everybody you meet. I always said that my father could have been the mayor of his town had he wanted because he always greeted everybody he met with a big grin, a big hug, and a good joke to get them laughing. It’s hard to think that there was anybody who didn’t like my father. I have met very few people in this life who had such a genuine affinity for people. I certainly will never be as natural about this as he was, but I genuinely like talking and working with people, which is a major part of the day for a consultant.

It’s interesting how all of us carry forward things from our parents. I guess it’s human nature to emulate those who had the biggest influences on you when you were young. But I carry traits from both of my parents into my daily life and these traits have served me well.

In case you are wondering, my dad died after a decade-long fight with Alzheimer’s. In the last few years a lot of what he had been was gone or diminished, but I guess some of what he was lives on in me and my siblings. And in my dad’s case, he hopefully lives on in all of the people that he hugged every time he saw them.


Universal Internet Access

While many of us are spending a lot of time trying to find a broadband solution for the unserved and underserved homes in the US, companies like Facebook, Google, and Microsoft are looking at ways of bringing some sort of broadband to everybody in the world.

Mark Zuckerberg of Facebook spoke to the United Nations this past week about the need to bring Internet access to the five billion people on the planet who do not have it. He says that Internet access is the most immediate way to help lift people out of abject poverty.

And one has to think he is right. Even very basic Internet access, which is what he and those other companies are trying to supply, will bring those billions into contact with the rest of the world. It’s hard to imagine how much untapped human talent resides in those many billions and access to the Internet can let the brightest of them contribute to the betterment of their communities and of mankind.

But on a more basic level, Internet access brings basic needs to poor communities. It opens up ecommerce and ebanking and other fundamental ways for people to become engaged in ways of making a living beyond a scratch existence. It opens up communities to educational opportunities, often for the first time. There are numerous stories already of rural communities around the world that have been transformed by access to the Internet.

One has to remember that the kind of access Zuckerberg is talking about is not the same as what we have in the developed countries. Here we are racing towards gigabit networks on fiber, while in these new places the connections are likely to be slow connections almost entirely via cheap smartphones. But you have to start somewhere.

Of course, there is also a bit of entrepreneurial competition going on here since each of these large corporations wants to be the face of the Internet for all of these new billions of potential customers. And so we see each of them taking different tactics and using different technologies to bring broadband to remote places.

Ultimately, the early broadband solutions brought to these new places will have to be replaced with real infrastructure. As any population embraces Internet access it will quickly exhaust a limited broadband connection delivered from a balloon, airplane, or satellite. Over time there will be a clamor for governments around the world to build backbone fiber networks to bring real broadband into each country and region. I’ve talked to consultants who work with African nations, and it is the lack of this basic fiber infrastructure that is one of the biggest limitations on getting adequate broadband to remote parts of the world.

And so hopefully this early work to bring some connectivity to remote places will be followed by programs to bring more permanent broadband infrastructure to the places that need it. It’s possible that broadband will soon rank right after food, water, and shelter as a necessity for a community, and I expect the people of the world to push their governments into making broadband a priority. I don’t even know how well we’ll do getting fiber to every region of our own country, and the poorer parts of the world face a monumental task over the coming decades to satisfy the desire for connectivity. But when people want something badly enough they generally find a way to get it, and so I think we are only a few years away from a time when most of the people on the planet will be clamoring for good Internet access.


The Gigabit Dilemma

Cox recently filed a lawsuit against the City of Tempe, Arizona for giving Google more preferable terms as a cable TV provider than what Cox has in its franchise with the city. Tempe took the unusual step of creating a new license category of ‘video service provider’ in establishing the contract with Google. That is different from Cox, which is considered a cable TV provider as defined by FCC rules.

The TV offerings from the two providers are basically the same. But according to the Cox complaint, Google has been given easier compliance with various consumer protection and billing rules. Cox alleges that Google might not have to comply with requirements like giving customers notice of rate changes, meeting installation time frames, and even providing emergency alerts. I don’t have the Google franchise agreement, so I don’t know the specific facts, but if Cox is right in these allegations then they are likely going to win the lawsuit. Under FCC rules it is hard for a city to discriminate among cable providers.

But the issue has grown beyond cable TV. A lot of fiber overbuilders are asking for the right to cherry pick neighborhoods and to not build everywhere within the franchise area – something that incumbent cable companies are required to do. I don’t know if this is an issue in this case, but I am aware of other cities where fiber overbuilders only want to build in the neighborhoods where enough customers elect to have them, similar to the way that Google builds to fiberhoods.

The idea of not building everywhere is a radical change in the way that cities treat cable companies, but is very much the traditional way to treat ISPs. Since broadband has been defined for many years by the FCC as an information service, data-only ISPs have been free to come to any city and build broadband to any subset of customers, largely without even talking to a city. But cable TV has always been heavily regulated and cable companies have never had that same kind of freedom.

But the world has changed and it’s nearly impossible anymore to tell the difference between a cable provider and an ISP. Companies like Google face several dilemmas these days. If they only sell data they don’t get a high enough customer penetration rate – too many people still want to pay just one provider for a bundle. But if they offer cable TV then they get into the kind of mess they are facing right now in Tempe. To confuse matters further, the FCC recently reclassified ISPs as common carriers, which might change the rules for ISPs. It’s a very uncertain time to be a broadband provider.

Cities have their own dilemmas. It seems that every city wants gigabit fiber. But a city that allows Google or anybody else in without a requirement to build everywhere within a reasonable amount of time is setting itself up for a huge future digital divide. It will have some parts of town with gigabit fiber and the rest of the town with something that is probably a lot slower. Over time that is going to create myriad problems within the city. There will be services available in the gigabit neighborhoods that are not available where there is no fiber. And one would expect that over time property values will tank in the non-fiber neighborhoods. Cities might look up fifteen years from now and wonder how they created new areas of blight.

I have no idea if Google plans to eventually build everywhere in Tempe. But I do know that there are fiber providers who definitely do not want to build everywhere, or more likely cannot afford to build everywhere in a given city. And not all of these fiber providers are going to offer cable TV, and so they might not even have the franchise discussion with the city and instead can just start building fiber.

Ever since the introduction of DSL and cable modems we’ve had digital divides. These divides have either been between rich and poor neighborhoods within a city, or between the city and the suburban and rural areas surrounding it. But the digital divide between gigabit and non-gigabit neighborhoods is going to be the widest and most significant digital divide we have ever had. I am not sure that cities are thinking about this. I fear that many politicians think broadband is broadband and there is a huge current cachet to having gigabit fiber in one’s city.

In the past these same politicians would have asked a lot of questions of a new cable provider. If you don’t think that’s true you just have to look back at some of the huge battles that Verizon had to fight a decade ago to get their FiOS TV into some cities. But for some reason, which I don’t fully understand, this same scrutiny is not always being applied to fiber overbuilders today.

It’s got to be hard for a city to know what to do. If gigabit fiber is the new standard then a city ought to fight hard to get it. But at the same time they need to be careful that they are not causing a bigger problem a decade from now between the neighborhoods with fiber and those without.

Service Unavailable

I just finished reading Service Unavailable, a new book by Frederick L. Pilot. It’s a quick and easy read for anybody in the broadband industry and covers the rural broadband crisis we have in the US.

The first two-thirds of the book are a great history of how we got to where we are today. Pilot explains the decisions made by the FCC and by the large ISPs in the country that have brought us to today’s broadband network. In far too many places that network consists of old copper wires built to deliver voice; these have deteriorated with age and are wholly inadequate to deliver any meaningful broadband. The large ISPs have poured all of their money into urban networks and Pilot describes how the networks in the rest of the country have been allowed to slowly rot away from lack of maintenance.

It’s a shame that his book went to press right before the FCC took an action that would have been an exclamation point in Pilot’s story of broadband policies gone amok. The FCC just gave away billions of dollars to the largest telcos to upgrade rural DSL to 10 Mbps download speeds – a speed that is already inadequate today and that will be a total joke by the time the last of the upgrades are done over a six-year period. The FCC seems not to have grasped the exponential growth in consumer broadband demand, which has doubled about every three years since the early 90s.
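To see how far behind that target falls, here is a simple projection of the doubling trend. Using the FCC’s own 25 Mbps download definition as a stand-in for today’s household need is my assumption, not a figure from the book.

```python
# Household demand vs. the 10 Mbps DSL target over the six-year upgrade window
DOUBLING_PERIOD_YEARS = 3
today_need_mbps = 25            # assumed starting point: the FCC broadband definition

for year in (0, 3, 6):
    need = today_need_mbps * 2 ** (year / DOUBLING_PERIOD_YEARS)
    print(f"Year {year}: ~{need:.0f} Mbps needed vs. a fixed 10 Mbps connection")
# By the end of the six-year build-out the gap is roughly tenfold.
```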

Pilot goes on to recommend a national broadband program that would direct many billions of dollars to build fiber everywhere, much in the same way that the federal government built the interstate highways. He says this is the only way that places outside of the urban areas will ever get adequate broadband.

I certainly share Pilot’s frustration and have had the same thought many times. We could probably build fiber everywhere for the price of building one or two aircraft carriers, and one has to wonder where our national priorities are. As Pilot points out, broadband everywhere would unleash a huge untapped source of creativity and income producing ability in the parts of the country that don’t have good broadband today. And as he points out, many of the places that barely squeak by with what is considered broadband today are going to find themselves on the wrong side of the digital divide within a few years.

But then I stop and consider how federal projects are run, and I’m not so sure this is the right answer. I look back at how the stimulus grant programs were run. Those grants shoveled large sums of money out the door to build a whole lot of fiber networks that barely brought broadband to anybody. And they did it very inefficiently by forcing fiber projects to pay urban wage rates in rural counties where the prevailing wages were much lower. Those projects also required environmental and historical structure studies, something I had never seen required of any fiber project before.

And then there is the question of who would run these networks. I sure wouldn’t want the feds as my ISP, and I wonder how they would decide who would receive these huge expenditures of federal money. Pilot proposes that such networks be operated as open access networks, a model that has not yet seen any success in the US. It’s a model that works great in Europe, where all of the largest ISPs jump onto any available network in order to get customers. But in this country the incumbents have largely agreed not to compete against each other in any meaningful way.

But beyond the operational issues, which surely could be figured out if we really wanted to do this, one has to ask whether this idea can ever get traction with our current government. We have such a huge backlog of infrastructure projects for maintaining roads, bridges, and waterways that one has to wonder how broadband networks would get the needed priority. I have never understood the reluctance of Congress to tackle infrastructure, because such expenditures mostly translate into wages, which means full employment, lots of taxes paid, and a humming economy. But we’ve just gone through over a decade of gridlock, and I have a hard time seeing anything but more of the same as we seem to grow more divided and partisan every year.

Still, Pilot is asking exactly the right questions. Unfortunately, I am not sure there can ever be one big fix-it-all solution to the broadband crisis, much as I agree with Pilot that one is badly needed. We are quickly headed towards a day of urban America with gigabit speeds and the rest of the country with 10 Mbps DSL, a wider broadband gap than we have ever had. And all we have is an FCC that is shoveling money out the door for band-aid fixes to DSL networks on old copper.

So I’m not sure that the solution suggested by Pilot can be practically implemented in today’s political world, but it is one possible solution in a world where very few others are proposing a way to fix the problem. I would think that the industry could figure out a workable solution if there were any real inclination to do so. But instead, I fear we are left with a world of large corporations running our broadband infrastructure who are more interested in quarterly earnings than in reinvesting in the future. If I could, I would wish for a more utopian society where we could put Pilot and other thinkers into a room together until they worked out a solution to our looming broadband crisis.