What Are Smart Cities?

I’ve been seeing smart cities mentioned a lot over the last few years, so I recently spent some time reading about them to see what all the fuss is about. I found some of what I expected, but I also found a few surprises.

What I expected to find is that the smart city concept means applying computer systems to automate and improve the major systems that operate a city. And that is what I found. The earliest smart city concept was using computers to improve traffic flow, and that is something that keeps getting better. With sensors in the roads and computerized lights, traffic systems can react to actual conditions and work to clear traffic jams. And by all accounts this is going to work a lot better in the near future.
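
The adaptive signals described above boil down to a simple decision rule. Here is a toy sketch, with made-up approach names and none of the real traffic engineering, of how an actuated signal might decide whether to hold or switch its green phase based on sensor-detected queues:

```python
def next_phase(current_green, queues, min_green_elapsed=True, max_green_elapsed=False):
    """Decide whether an actuated signal holds or switches its green phase.

    queues maps an approach name ("main" or "cross") to the number of
    vehicles the in-road sensors currently detect waiting there.
    """
    other = "cross" if current_green == "main" else "main"
    if not min_green_elapsed:
        return current_green          # always honor the minimum green time
    if max_green_elapsed and queues[other] > 0:
        return other                  # never starve the waiting approach
    if queues[other] > 0 and queues[current_green] == 0:
        return other                  # green side is empty, someone is waiting
    return current_green

# Nobody left on the main street, one car waiting on the cross street:
phase = next_phase("main", {"main": 0, "cross": 1})  # switches to "cross"
```

Real controllers juggle many more inputs (pedestrian calls, coordination with neighboring intersections), but the core idea is the same: don’t hold a green for an empty road.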

But smart city means a lot more. It means constructing interconnected webs of smart buildings that use green technology to save energy, or even to generate some of the energy they need. It means surveillance systems to help deter and solve crimes. It means making government more responsive to citizen needs in areas like recycling, trash removal, snow removal, and general interfaces with city systems for permits, taxes, and other needs. And it’s soon going to mean integrating the Internet of Things into a city in pursuit of all the ways government can do a better job.

I also found that this is a worldwide phenomenon and there is some global competition between the US, Europe, China, and India to produce the most efficient smart cities. The conventional wisdom is that smart cities will become the foci of global trade and that smart cities will be the big winners in the battle for global economic dominance.

But I also found a few things I didn’t know. It turns out that the whole smart city concept was dreamed up by companies like IBM, Cisco, and Software AG. The whole phenomenon was not so much a case of cities clamoring for solutions, but rather of these large companies selling a vision of where cities ought to be going. And the cynic in me sees red flags and wonders how much of this phenomenon is an attempt to sell large, overpriced hardware and software systems to cities. After all, governments have always been some of the best clients for large corporations because they will often overpay and have fewer performance demands than commercial customers.

I agree that many of the goals for smart cities sound like great ideas. Anybody who has ever sat at a red light for a long time while no traffic was moving on the cross street has wished that a smart computer could change the light as needed. The savings to a community from more efficient traffic are immense in terms of saved time, more efficient businesses, and less pollution. And most cities could certainly be more efficient when dealing with citizens. It would be nice to be able to put a large piece of trash on the curb and have it whisked away quickly, or to process a needed permit or license online without having to stand in line at a government office.

But at some point a lot of what the smart city vendors are pushing starts to sound like a Big Brother solution. For example, they are pushing surveillance cameras everywhere, tied into software systems smart enough to make sense of the mountains of captured images. But I suspect that most people who live in a city don’t want their city government watching and remembering everything they do in public any more than we want the NSA spying on our Internet usage at the federal level.

So perhaps cities can be made too smart. I can’t imagine anybody minding if cities get more efficient at the things they are supposed to provide for citizens. People want their local government to fix the potholes, deliver drinkable water, provide practical mass transit, keep the traffic moving, and make them feel safe when they walk down the street. When cities go too far past those basic needs they have either crossed the line into being too intrusive in our lives, or they are competing with things that commercial companies ought to be doing. So I guess we want our cities to be smart, but not too smart.

Rise of the Robots

I recently finished reading Rise of the Robots: Technology and the Threat of a Jobless Future by Martin Ford. In the book he paints a rather bleak picture of a not-too-distant future where robots begin taking over a lot of the jobs that support us. He predicts that within a few decades over 50% of the jobs that exist today will be gone, done better and cheaper by automation.

Unlike past predictions, he’s not just talking about factory jobs, although those are well on the way to being replaced, too. I read that Apple is working hard to automate as much of the production of the iPhone as possible. But Ford says that the first jobs to disappear will be white collar jobs and that jobs that require physical skills and people-to-people interactions like plumbers, electricians, and auto repair might be the most immune from automation.

It’s a bit glib to use the word robots, because white collar jobs are going to be taken over by the combination of supercomputers and big data processing. And it’s also a bit of a misnomer to say that jobs will be replaced. Some jobs will be 100% supplanted, but for many jobs computers will be able to take over some significant portion of the work currently done. And in doing so companies will need fewer people.

The way to understand how this might happen is to consider the recent announcement that a deep-learning computer taught itself to play master-level chess in 72 hours. Computers have been beating the best chess players in the world for a few decades now, but this is very different. The old programs were good at chess through brute force, looking ahead at millions of possible positions before making a move. But the new deep-learning computer learned to play chess much the same way that humans do.

And if the computer can learn to play chess it can learn how to do a whole lot of jobs that people do. I have a friend, Danny, who operates a large CPA firm and he has already started his business down this path. He has written programs that have fully automated the process of reconciling bank statements and using that data to produce a set of books and a draft tax return, all with almost no human intervention. This has allowed him to save on labor and probably puts him five years ahead of his competition. But the industry will catch up to him and that is going to eliminate the bookkeepers and accounting clerks who have been doing that function everywhere.
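
A minimal sketch of the kind of matching such a reconciliation program automates; all of the data, field names, and the matching rule here are hypothetical, not taken from Danny’s actual system:

```python
from datetime import date

def reconcile(bank_txns, ledger_entries, date_tolerance_days=3):
    """Match bank transactions to ledger entries by amount and nearby date.

    Returns (matched pairs, unmatched bank txns, unmatched ledger entries);
    the unmatched lists are what a human would still have to look at.
    """
    unmatched_ledger = list(ledger_entries)
    matched, unmatched_bank = [], []
    for txn in bank_txns:
        hit = next(
            (e for e in unmatched_ledger
             if e["amount"] == txn["amount"]
             and abs((e["date"] - txn["date"]).days) <= date_tolerance_days),
            None,
        )
        if hit:
            unmatched_ledger.remove(hit)
            matched.append((txn, hit))
        else:
            unmatched_bank.append(txn)
    return matched, unmatched_bank, unmatched_ledger

bank = [
    {"date": date(2015, 9, 1), "amount": -120.00},
    {"date": date(2015, 9, 3), "amount": 2500.00},
]
ledger = [
    {"date": date(2015, 9, 2), "amount": -120.00, "account": "Office Supplies"},
    {"date": date(2015, 9, 3), "amount": 2500.00, "account": "Client Revenue"},
    {"date": date(2015, 9, 5), "amount": -60.00, "account": "Utilities"},
]
matched, unmatched_bank, unmatched_ledger = reconcile(bank, ledger)
```

Even this crude amount-plus-date rule clears most routine transactions automatically, which is exactly why the humans who used to do it line by line become unnecessary.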

It’s not hard to picture this same process eliminating the jobs of almost anybody who manipulates data. If you work in front of a computer screen you probably can, and will, be replaced. Some of the major news outlets are already using computers to generate sports stories from box scores. These programs can take a box score and turn it into a short written recap that is hard to distinguish from one written by a human. But the new deep-learning approach is going to go far beyond this very simple task. I just read about an AI program that is helping scientists by searching through millions of published papers to find research related to what they are working on.
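
A toy version of that box-score-to-blurb generation might look like the following; the templates, team names, and thresholds are invented for illustration and are far simpler than what the news outlets actually use:

```python
def box_score_blurb(game):
    """Turn a structured box score into a one-paragraph game recap
    using simple fill-in templates."""
    home, away = game["home"], game["away"]
    winner, loser = (home, away) if home["score"] > away["score"] else (away, home)
    margin = winner["score"] - loser["score"]
    # Pick a verb based on how close the game was.
    verb = "edged" if margin <= 3 else "beat" if margin <= 10 else "routed"
    # Find the winning team's top scorer to feature in the recap.
    star = max(winner["players"], key=lambda p: p["points"])
    return (f"{winner['name']} {verb} {loser['name']} "
            f"{winner['score']}-{loser['score']} on {game['date']}. "
            f"{star['name']} led the way with {star['points']} points.")

game = {
    "date": "Oct 12",
    "home": {"name": "Provo", "score": 98,
             "players": [{"name": "J. Smith", "points": 31},
                         {"name": "T. Lee", "points": 14}]},
    "away": {"name": "Ogden", "score": 95,
             "players": [{"name": "R. Jones", "points": 22}]},
}
blurb = box_score_blurb(game)
# -> "Provo edged Ogden 98-95 on Oct 12. J. Smith led the way with 31 points."
```

With enough templates and a little randomization, output like this is genuinely hard to tell from a wire-service brief, and that is the easy end of the spectrum.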

Ford is an industry insider and it’s hard to find fault with his conclusions. There are many others making the same prediction, but Ford makes the case more clearly than most I’ve read. Where I disagree with Ford is in his suggested solution. It’s obvious that if half of the people don’t have work we will have to find a social solution. He suggests what he calls a minimum annual income, given by the government to those who can’t find work. It’s hard to picture, with today’s politics, the country stepping up to pay people who don’t work and will probably never work. If not, we face a very bleak future of haves and have-nots.

What I find most dismaying is that as a society we are paying zero attention to this issue while it is right in front of us. We are going to start seeing jobs disappear in the coming decade, with the decade after that seeing the full brunt of technology replacing people. Take the example of pharmacists, which is typical of the kind of knowledge job that can easily be automated. It’s fairly unlikely that somebody just starting pharmacy school today is going to find lifetime employment in that field. There are already hospitals trialing robot pharmacists and reporting that they operate without errors. While not all pharmacists will go, one would think that the big drugstore chains will replace most of them over the next decade or so. This is just one example, and the same thing is going to happen to another thousand job descriptions. As good jobs start disappearing, this is going to create possibly the biggest shift society has ever faced over such a short period.

I don’t get paid anything to write this blog so I don’t worry about becoming unemployed as a writer. But the predictions are that a lot of technical writing will be done by computer soon. I don’t know if a computer is ever going to be as opinionated as me, but that is probably something that can be built in as well. So perhaps one day a computer will take over this blog. I’ll be sure to check in from time to time, though, just to see how I’m doing.

Is Net Neutrality Hurting Telecom Investment?

Leading up to the net neutrality decision a lot of the big carriers claimed that putting broadband under Title II regulation would kill their desire to make new investments in broadband. AT&T went so far as to tell the FCC that they would stop all capital investment if net neutrality was ordered. FCC Chairman Tom Wheeler quickly called their bluff and AT&T backed down.

But net neutrality has been the law for a while now and the large carriers are still singing the same tune. There are regular postings by USTelecom, the trade group for the large carriers, claiming that Title II is hurting investment in the industry.

There is uncertainty in the industry due to the fact that much of the FCC’s ruling is being challenged in the courts. At a recent hearing of the House Communications Subcommittee, Frank Louthan of Raymond James told the committee that the big carriers are making investments in line with what they think will be the final rules after the court fights. I’m sure he’s right because that’s what large companies always do when they face uncertainty – they pick what they think will happen and operate under that assumption. But in this case the uncertainty is ironic since it is being caused directly by the lawsuit brought by the carriers that don’t like the uncertainty.

And it’s hard to see that net neutrality is hurting the biggest carriers. Certainly nobody was affected more by the rule changes than the big cable companies, which had essentially no regulation of their broadband before the net neutrality rules. Ars Technica has dug into the most recent financials for these companies and reports that the large cable companies have increased capital spending. Comcast’s capital spending this year is up 11% over last year and Time Warner’s is up 10%. And every large cable company has said they are going to be pouring money into DOCSIS 3.1 upgrades over the next year, so their capital spending is not going to go down in the foreseeable future.

I always wonder what exactly the carriers don’t like about net neutrality. Net neutrality stopped the carriers from making deals that enriched themselves but that restricted customer choices. But it seems like the public was very much on the FCC’s side on this issue and it’s hard for the carriers to find any sympathy for their cause. Probably more problematic for the carriers is that the FCC, in a surprising part of the net neutrality ruling, put in place rules on the network side of the carriers. They were all getting ready to start charging large content providers like Netflix for bringing content to their networks. The FCC decided that the millions of customers paying for monthly broadband were already paying that cost. What the carriers seem most annoyed about is that customers actually want to use the broadband to which they subscribe. The carriers have plans to put an end to that and everybody is watching Comcast’s new attempt at data caps.

One topic that came up in the House hearings was the fact that much of the net neutrality order was done by forbearance, meaning that the FCC chose which existing rules for telephone service would or would not apply to regulating data. The carriers fear that a future activist FCC could change its mind at any time on the rules that have been forborne and either change the regulations or hold the threat of change over the carriers’ heads while negotiating other issues. On that topic I agree with the carriers: what the FCC has given it also has the right to take away. The only real fix would be a new telecom act from Congress, and with our divided political parties that doesn’t seem very likely.

I remember this same level of teeth gnashing and hair rending by AT&T and Verizon after the Telecommunications Act of 1996. They challenged that ruling and spent the next several years complaining about it loudly, until the complaints fell on deaf ears. In this case one has to think that whatever the courts order is going to stand as the law for a while. The complaining about the Act eventually died down, and both companies have gone on to be far more profitable than they were before it. Perhaps they ought to revisit their own recent history and just get on with what they do best: making money.

Trusting Our Digital Universe

I was thinking about the Volkswagen cheating scandal, where a computer chip changed the emissions of their cars during testing. It got me thinking about how customers trust, or don’t trust, businesses. Volkswagen not only lied to regulators about their cars having low emissions, but went on to make that claim the centerpiece of their advertising campaign.

What made the Volkswagen scandal worse for the rest of us is that they cheated using software. Pretty much everything we do in the telecom industry these days involves software. The Volkswagen scandal, along with many others, might eventually make the public untrusting of everything that includes software.

There are already examples of telecom companies that have violated their customers’ trust. For example, Comcast has turned everybody’s WiFi router into a dual-purpose device that can serve people outside your house. Comcast very quietly told the public about this once, but if I wasn’t in the industry I probably wouldn’t have noticed the change, and I’m sure the average household has no idea that Comcast is using their routers that way. Security experts everywhere warn about how dangerous it is to let the public into your WiFi router.

I paused when considering buying a smart TV after it was revealed that Samsung TVs had the ability to watch whatever happens in front of them and to hear everything within earshot. Our PCs have had that same weakness ever since they started building cameras into every monitor.

And a lot of people now mistrust their ISPs who have been funneling all of their data to the NSA. Of course, your ISP already knows everything you do online anyway and there is no telling what some of them might be doing with the information.

We are about to enter an age where people are going to be filling their homes up with many more smart devices. We’ll obviously buy them because they will make our lives easier or more fun, but every one of these devices that is hooked to a network could end up being used to spy on us. You have to know that at least some of the makers of IoT devices will try to spy on us since there is a lot of money in selling data about us all.

I’m not quite sure how we as a society deal with this issue, because we have entered uncharted legal waters. Almost all of our product liability laws concentrate on the mechanical nature of the things we buy. In the case of Volkswagen, the mechanical parts of their cars worked just fine; the fault was in software that had been deliberately manipulated to lie about the performance of the cars. It’s hard to think that anybody except the smartest technical people will have any way to know if our devices are doing things we don’t want them to do. Once they get hooked up to a network, their software can spy on us in devious ways that are as hard to detect as the Volkswagen software.

Telecom companies have a particularly important obligation to the public. As the ISPs most directly serving people, we must earn and keep their trust. This is why I am particularly dismayed to see big carriers like AT&T and Verizon so willingly handing over customer data to the NSA. If the law makes a telecom company do something, then it obviously must comply. But these companies chased the big bucks from the NSA as if it were just another customer and sold out everybody else who sends them a monthly check. And sadly, since AT&T controls a lot of the Internet hubs, the data from all of the little ISPs was handed over as well, without the consent or knowledge of the smaller companies or their customers.

I fully expect that someday we’ll have a terrible scandal or tragedy involving the ability of our new IoT devices to spy on us. And when that happens there might well be a backlash, with people ripping the devices out and abandoning them. The whole industry needs to realize that a few bad events can spoil the market for everybody, so it’s my hope that companies that abuse the public trust get exposed by those who do not. Unfortunately, we don’t have a lot of history of that happening.

Barriers to Broadband Deployment on Federal Lands

CenturyLink’s VP of Regulatory Affairs, Jeb Benedict, testified recently before the House Energy and Commerce subcommittee that there are a number of barriers to rural broadband deployment when fiber needs to pass through federal or tribal land. He said that CenturyLink would support legislation that would do the following:

  • Require that federal agencies give priority to rights-of-way applications and establish time frames in which they must respond to applications to build broadband.
  • Make it easier to put fiber into areas that were previously disturbed like the shoulder of a roadway.
  • Minimize or even eliminate permitting fees and leases for rights-of-way for fiber projects.
  • Require federal agencies to work together when necessary for fiber projects.

CenturyLink is right about all of these items, and I’ve seen projects get bogged down over these issues many times. For example, the process and paperwork required to build fiber through federal park land can be grueling and time consuming. There are different processes for different kinds of federal land, so the requirements differ depending upon whether something is a national park, a national forest, or general federal land. And there are often numerous barriers to getting fiber through tribal lands as well.

What I’ve always found mystifying is that each new application to build on park land is treated as if it’s the first time telecom has ever been built there. It’s no easier to put fiber where copper has been run before; you still have to start from scratch. What is particularly frustrating is that, as Mr. Benedict pointed out, a lot of hoops have to be jumped through to build in rights-of-way along roads where the dirt was already dug up when the road was constructed. There are often environmental and archaeological studies required to bury conduit in rights-of-way that were clearly fully excavated when the road was built.

National parks are the hardest places to build. I have a client who found grant money to bring wireless service to the Channel Islands off the Southern California coast, much of which are a national park. The area had cellular coverage in the past, but the carriers were removing their cell towers, which meant the islands would be cut off from communications. The park wanted basic services like the ability for park visitors to call 911, and wanted data for the park rangers and the few others who still live on the islands.

But the barriers to building there were so stringent that the project could never be made to work. The park wouldn’t allow the construction of any new buildings or enclosures of any kind. They would not allow any dirt on the islands to be disturbed, meaning no digging of any kind. And there were incredibly expensive environmental studies required, which I recall cost $150,000. Even though the people who worked at the park wanted new wireless service, and even though there would have been great public benefit, the National Park Service rules basically made it impossible to install telecom gear.

And I have similar stories from all over the country. Trying to get fiber through national forests is almost as hard as national parks. Applications to build can be delayed seemingly forever. There are usually environmental studies required even to build in existing rights-of-way on existing roads, and there are numerous rules about how and when construction can be done. I’ve seen companies route fiber many miles out of the best path just to avoid the hassle of building through federal land.

The problem is that these federal lands are often surrounded by rural communities that badly need broadband. But it’s hard to build fiber, cellular towers, or any other infrastructure when the parkland stands as a barrier between those communities and the networks that could reach them.

It’s not just parklands that are a problem. Just trying to build under an interstate highway overpass or across a federal bridge can also be a very slow process. And those are found everywhere. As CenturyLink points out, there is no requirement that the agencies involved look at such requests in a timely manner. Sometimes such requests get processed quickly and sometimes they languish for a very long time.

If the federal government really wants to promote more rural fiber then they need to eliminate the barriers that they have created for their own lands, highways, and bridges.

The Battle for Eyeballs

There is an interesting aspect of the web that happens behind the scenes and doesn’t get a lot of press: the tracking and maximizing of page views on social media sites like Facebook and Twitter. Large content providers like the Huffington Post, BuzzFeed, and the New York Times closely monitor how many shares they get on the various sites. Shares matter because the more eyeballs they get on their pages, the more they make from advertising. It’s easy to forget that advertising drives the web, but to these companies advertising is the major, and in some cases the only, source of revenue.

Following is a list from NewsWhip showing the 10 largest content providers, based on Facebook shares, for August 2015. Some of these are familiar names, but some post content under various names that a Facebook reader would more likely recognize.

Content providers are currently in a bit of a panic because the largest social media sites are working very hard to keep eyeballs on their own pages. When somebody clicks on a web article on Facebook they are sent away from Facebook, and they often don’t return. Social media sites know that keeping eyeballs on their own site increases their own ad revenues.

Twitter recently launched Moments, a space for content that stays inside the Twitter platform. Twitter directly creates content for Moments and has also invited partners to write and create content inside the Twitter platform. Facebook has been doing similar things through its Trending Topics pages that lead you to content within Facebook. They are also looking at a more aggressive platform they are calling Notify. LinkedIn probably started the trend and has enlisted heavy hitters from various industries to write content directly inside their site.

It’s a tough time to be a content creator. They are already seeing a downward trend in revenue due to ad blockers. It will be that much harder to make money if content providers also have to compete directly with the social media sites for content. After all, the social media sites know a lot more about what each of us is interested in, and companies like Facebook can use that knowledge to entice us toward content they think will interest us.

The content creators have a real concern. For example, the Huffington Post has lost about 2 million Facebook shares per month over the course of this year. The issue matters to web users, because it is the content creators that make the web worth visiting. I personally use Twitter as a way to find articles about various tech industries and I am not that much interested in personal tweets by the people I follow. I am sure that many other people use these platforms the same way – as a way to follow topics they are interested in. But whenever large sums of money are involved somebody is always going to be scheming to capture market share, and the tug of war for advertising eyeballs is in full force.

The Disappearing Web

Someday you are going to click on the link to today’s blog and it will no longer be on the web. Let’s hope that it’s because I am retired and have stopped paying my annual fees to keep this blog on WordPress. But it also might be for another reason – that WordPress is sold, goes bankrupt, or just decides to get out of the web business.

We like to think of the web as a giant repository that is recording and storing everything that we are doing in this century – but nothing could be further from the truth. The vast majority of content on the web is going to disappear, and a lot sooner than you might imagine. There is very little of today’s content that will still be around even fifty years from now, and most of it will disappear long before that.

And this is because somebody has to spend money to put content on the web and keep it there. In the case of this blog, I would have to keep paying WordPress. A lot of web content sits on private servers and is not dependent on a larger company like WordPress to keep going. But somebody has to pay for the bandwidth to connect those servers, and to replace and migrate the content somewhere else when the servers inevitably wear out and die.

I don’t know much about the company behind WordPress, but what is the likelihood that they will still be in business fifty years from now even if I somehow paid them to maintain this blog forever? I would think that over the next fifty years that most of today’s big web companies will be gone. It’s hard to think that even the largest content repositories like Facebook will last that long. In the fast moving world of the web, fifty years is forever and companies will be supplanted by something new as tastes and trends change.

And even should the platform that has your content survive for fifty years, what are the chances that the coding underlying your content will still be supported fifty years from now? In the short history of the web we have already obsoleted much of the earliest content due to its format.

Web content already disappears a lot faster than people might believe. I’ve seen several sources that suggest that the average life of a web page is 100 days. And links die regularly. Around 8% of links die every year for one of the many reasons I’ve mentioned.
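
Those two numbers compound quickly. Assuming a constant 8% annual death rate applied independently each year, which is a simplification of how link rot actually behaves, the surviving share of links falls off like this:

```python
def surviving_share(annual_death_rate, years):
    """Fraction of links still alive after `years`, assuming a constant
    annual death rate applied independently each year."""
    return (1 - annual_death_rate) ** years

# At the cited 8% per year, fewer than half of today's links would
# still work after ten years, and under 2% after fifty.
ten = surviving_share(0.08, 10)    # ~0.43
fifty = surviving_share(0.08, 50)  # ~0.015
```

So even before any platform goes out of business, simple attrition at today’s rates erases most of the web within a generation.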

What is sad about all of this is that a lot of the content on the web doesn’t exist anywhere else. There are many blogs and news websites that are the main chroniclers of our times that don’t exist in any other format. It’s certainly possible that future historians will look back on this time as a big black hole of historical data.

Even should content be stored somehow off the web, there really is no off-line electronic storage medium today that lasts very long. There are a few storage technologies that might last longer, but there is very little web content that people value enough to turn into a long-term off-line format. And even if you bother to archive content, being able to read anything electronic years from now is likely to be a puzzle. No matter the technology used to store your content, that technology will be obsoleted by something better. It’s already getting hard to find somebody capable of reading content from as recently as twenty years ago.

A few years ago I read the correspondence between John and Abigail Adams. That correspondence provided a great peek into what it was like to be alive then. As a whole we are even more prolific today than people a few hundred years ago. People blog, email, and tweet in great volumes. But I find it a bit sad that nobody in the future is likely to be able to read this blog – because, gosh darnit, this is good stuff worthy of the ages!



The Cherry Picking Dilemma

I ran across an article written by somebody in Provo, Utah claiming that the penetration rate for paying data customers on the Google network has fallen to around 20%, from a 30% penetration rate when the network was operated by the city. I have no idea if the 20% figure is accurate, but it would not be surprising, since Google also offers 5 Mbps for free in the city as part of the deal for buying the network. I’m sure that a lot of households and students are taking the free option.

But the article did prompt me to think about cherry picking – the phenomenon where telecom carriers tend to mostly pursue customers who spend the most money. This topic is of particular interest when talking about Provo because the network that Google now operates was once an open access network. And I think the pre- and post-Google situations are worth comparing.

Back when the city ran the network it operated on an open access basis, as required by Utah law. This means that the city was prohibited from being an ISP, but could sell access to other last-mile service providers on the fiber network. Provo sold lit fiber loops for roughly $30 per month. ISPs using the network were then free to sell any services a customer wanted to buy.

An open access network leads to a form of cherry picking in that no ISP is going to buy a $30 fiber loop and then offer an inexpensive standalone data product. There is just not enough profit in selling a standalone $40 or $45 data product. Instead, an ISP on an open access network will either price standalone data high or bundle it with lots of other services. Contrast this with Qwest, which would have competed against iProvo by selling low-priced DSL. I am sure that Qwest had data products in the $30 per month range. They would have been much slower than the iProvo fiber, but attractive to the budget-minded customer.
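
The arithmetic behind that cherry picking is simple. With a $30 loop charge, a cheap standalone data product leaves almost nothing after operating costs; the $12 figure below for bandwidth, support, and billing is an assumed number purely for illustration:

```python
def monthly_margin(retail_price, loop_cost=30.00, operating_cost=12.00):
    """Per-customer monthly margin for an ISP on an open access network.

    loop_cost matches the $30 Provo loop rate cited above; operating_cost
    (bandwidth, support, billing) is an assumed figure for illustration.
    """
    return retail_price - loop_cost - operating_cost

cheap_data = monthly_margin(40.00)    # negative: loses money every month
triple_play = monthly_margin(120.00)  # positive: a bundle can carry the loop
```

Whatever the real operating numbers are, the loop charge sets a floor under retail prices, which is why open access ISPs gravitate toward high-priced or bundled products.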

And then consider Google, which is definitely a cherry picker. They sell a gigabit of data for $70 per month. There are very few markets where a significant percentage of households will find that affordable, regardless of how attractive they find the speed. I don’t know what Google’s target penetration rate is, but they can’t be shooting for the same overall penetration rate as a cable company, which has a full range of products from slow to fast, from cheap to expensive.

I work with hundreds of ISPs, and the one thing I have consistently seen in every market across the country is that when customers have a choice between low- and high-priced data products, the vast majority will take the lowest-priced product that gives them a speed they can live with. Cable companies don’t expect more than a few percent of households to buy their fastest and most expensive data product.

And so, even if the author of the article is right, I'm not sure this is a negative thing for Google. If the city was selling broadband to 30% of households in an open access environment, one has to imagine this represented a broad range of products at different prices and speeds. There would have been no really cheap products due to the $30 monthly loop rate, but there was probably still a range of packages between $50 and $200 with various combinations of data, video, and voice.

If Google has been able to get 20% of the people in Provo to pony up $70 per month for broadband, they might be very happy with the results. They bought the network for $1, but obviously had to make some capital investments to make the network capable of gigabit speeds everywhere. I see nothing automatically distressing about a 20% penetration rate for a very high margin product.

There are a lot of other new ISPs hitting various markets around the country today. A few of them, like Google, are peddling gigabit as their only product, but most competitive ISPs still sell a mix of products. Whenever I talk to these companies I always caution them that, given a choice, very few people are going to buy the gigabit if there is an affordable 100 Mbps alternative. I've seen a number of business plans that predict a high penetration rate for the fastest data product, but I've seen human nature rear its head in almost every market I've ever worked in. Given a choice, people will save money when they can, and all of the marketing in the world won't get them to spend more than they are comfortable with.

Congress Considering Mandate for Conduit

There is a bill making its way through Congress that ought to be of interest to the carrier community. It's called the Broadband Conduit Deployment Act of 2015. It's a bipartisan bill sponsored by Reps. Anna Eshoo (D-CA) and Greg Walden (R-OR).

In a nutshell, the bill requires that all federally funded highway construction projects include the installation of empty fiber conduit where it is determined that the area will have a need for broadband in the fifteen years after construction. I have no idea who makes this determination.

There are a number of cities and counties around the country that have had this policy in place, and it works, albeit slowly. People don't realize it, but most local roads get rebuilt to some degree every thirty years, so every year about 3% to 4% of the roads in an area ought to be getting rebuilt. That number varies with the weather conditions in different parts of the country and with how heavily a road is used; roads that carry a lot of overweight loads wear out a lot faster. But federal interstate highways are built to a higher standard and in many parts of the country are expected to last up to forty years, and there are now some stretches of interstate highway that are fifty years old.
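The rebuild percentages follow directly from the replacement cycles. A quick back-of-envelope check, using the cycle lengths cited above:

```python
# Annual share of roads rebuilt, assuming the road stock is replaced
# on a fixed N-year cycle (cycle lengths are the ones cited in the text).

def annual_rebuild_share(cycle_years: int) -> float:
    """Fraction of the road stock rebuilt per year on an N-year cycle."""
    return 1 / cycle_years

print(f"30-year local roads: {annual_rebuild_share(30):.1%} per year")
print(f"40-year interstates: {annual_rebuild_share(40):.1%} per year")
```

A thirty-year cycle works out to about 3.3% of roads per year, consistent with the 3% to 4% range above, while the longer interstate cycle means conduit accumulates along highways even more slowly.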

One has to wonder how quickly such a policy might produce benefits. Certainly any conduit put into urban stretches of highway would probably be grabbed up quickly. But in a lot of places it might be a decade or more until new conduit provides any real benefit. Once you get out of urban areas, conduit is mostly used for long-haul fiber, and having a patchwork of conduits here and there isn't going to get many carriers excited.

But over time such a system will provide benefits as more and more stretches of highway get empty conduit. The same thing has happened in the cities that have this policy: they hoped for a quick broadband benefit when they introduced this kind of ordinance, but it often takes many years until there is enough conduit available to get any fiber provider excited. The place where almost any empty conduit is of immediate interest is where it runs through neighborhoods, because saving any construction cost on the last mile matters to a fiber builder.

The law is silent on how this conduit would be made available. I’ve worked with getting rights to government-owned fiber before and it has always been difficult. The government owner of a conduit doesn’t have the same sense of urgency as a carrier who is trying to build a fiber route. If you have to wait too long to get access to conduit you’re probably better off finding a different solution.

But it's a step in the right direction, and over time this will produce benefits in some places. I also don't know exactly what kinds of roads qualify as receiving federal highway funding assistance. Obviously all interstate highways meet that test, but I've sat through many city council meetings where I've heard that state highway projects sometimes get some federal funding assistance. If so, then this greatly expands the scope and potential of the law.

Similar bills have been bouncing around in Congress since 2006 and never passed for one reason or another. The White House is in favor of this bill as one more piece of the puzzle in promoting more broadband. The White House tried to implement an abbreviated version of this idea a few years ago through executive order, but apparently the implementation has been very spotty.

Like many good ideas that work their way up to Congress, this bill is probably twenty years too late. If this had been implemented at the time of the Telecommunications Act of 1996 then we would already have conduit all over the country that would provide cheaper transport. But I guess you have to start somewhere, so I hope this bill becomes law.

Lifeline and Rural America

Earlier this year Chairman Tom Wheeler of the FCC proposed to change the Lifeline program to support broadband in addition to voice. In that proposal he suggested that a household should get at least 10 Mbps download and 1 Mbps upload in order to qualify for a Lifeline subsidy.

Here is where it gets weird. Frontier has filed comments saying that the 10/1 Mbps threshold is too high and that such a high standard will stop a lot of rural households from getting Lifeline assistance. They are right, of course, but their solution is to lower the Lifeline threshold to whatever speeds are actually available in a given rural market.

Meanwhile, Frontier has recently taken a huge amount of money from the Connect America Fund for the purpose of raising rural DSL to the 10/1 Mbps level. But they have six years to reach those speeds, and most of us in the industry think that even after all of the upgrades, a lot of the rural households in the upgraded areas still won't get 10/1 speeds. It's going to be very hard for Frontier to do that with DSL in a rural setting where people are on scattered farms or back long lanes. I find it unlikely that Frontier, or any of the big telcos, is going to put enough fiber into rural areas to actually achieve that goal.

But far more importantly, 10/1 DSL is not broadband. It's not broadband by the FCC's current definition, which says broadband must be at least 25/3 Mbps, and it's not broadband for real-life applications.

I'll use my own household as the first example. There are two adults and one teenager. We work at home, and as cord cutters we get all of our video online. We have a 50 Mbps cable modem, and as cable modems tend to do, it sometimes slows down. When our speed hits 25 Mbps we're all asking what is wrong with the Internet. So our household needs something greater than 25 Mbps to function normally; if we get less than that, we have to cut back on something.

I have a friend with two teenage boys who are both gamers. He has a 100 Mbps Verizon FiOS connection on fiber, and when there are multiple games running everything else in the house comes to a screeching halt. For his household even 100 Mbps is not enough speed to meet his normal expected usage.
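You can see why households bump into these limits just by adding up concurrent uses. Here is a hypothetical bandwidth budget; the per-application numbers are rough, commonly cited figures I've chosen for illustration, not measurements from either household above:

```python
# Back-of-envelope bandwidth budget for a busy household.
# Per-application speeds are rough, commonly cited estimates
# (hypothetical, not from the article).

usage_mbps = {
    "4K video stream": 25,
    "HD video stream": 5,
    "video call": 3,
    "online gaming": 4,
    "web / email / background updates": 5,
}

total = sum(usage_mbps.values())
print(f"Concurrent demand: {total} Mbps")
# A 10/1 Mbps DSL line can't carry even one 4K stream, and even this
# modest mix already exceeds the FCC's 25 Mbps broadband definition.
```

The exact numbers matter less than the pattern: a handful of simultaneous users blows past 10/1 immediately and past 25/3 soon after, which is why both example households strain much faster connections.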

And yet here we are, having discussions at the federal level about setting up two major programs that use 10/1 Mbps as the standard goal for Internet speed. As a nation we are pouring billions of dollars into a project to improve rural DSL to a speed that is already inadequate, and by the time the upgrades are finally finished in six years it will be massively below standard. It won't take many years for the average household to need 100 Mbps, and we are instead taking six years to bring a huge swath of rural America up to 10/1 DSL.

I know that the FCC is trying to help. But it's sad to see them crowing about having 'fixed' the rural broadband problem when they are instead condemning millions of households to nearly worthless broadband for the next couple of decades. Imagine if they had instead allowed those billions of dollars to become matching funds for communities willing to invest in real broadband. Communities wanting to do this are out there, and many of them were hoping for some federal help to bring broadband to their areas. Building rural fiber is expensive, and even a little federal help would be enough to let many rural areas find the rest of the funding needed to build their own solutions.

And the problems are going to get worse, not better. Verizon didn’t even bother to take the federal subsidies to improve DSL because they don’t want to invest anything in rural copper. AT&T has told the FCC repeatedly that they want to tear down copper to millions of households and put rural households on cellular data. And while Frontier is going to try to make their rural copper plant better, how much can they realistically accomplish with 50–70 year-old copper that was neglected for decades before they bought it?

I just shake my head when I see that Frontier and the FCC are going to be wrangling about households getting Lifeline subsidies for speeds slower than 10/1 Mbps. The FCC has already decided that they are going to throw billions at rural copper and call it job done. It’s about time that we instead start having a conversation about bringing real broadband to rural America.