Lifting the Ban on Municipal Competition

FCC Chairman Tom Wheeler said a few weeks ago that the FCC was strongly considering lifting state barriers to municipal competition. His reasoning is that there is such a large need for broadband infrastructure that governments should not stand in the way of anybody who is willing to make the last mile investment. Twenty-two states have barriers significant enough to either ban municipal broadband outright or make it almost impossible for municipalities to offer it.

State laws contain several kinds of barriers to entry. First there are states like Texas that simply have an outright ban on municipal competition. But there are also a few states that handled this a different way, like North Carolina, which instead created a list of barriers to entry that are impossible for any City to meet. North Carolina’s law is an effective ban: it never explicitly prohibits municipal competition, but it is written to sound like there might be a path to compete.

Utah has an unusual restriction in that Cities there are allowed to build fiber networks, but they can only operate them on a wholesale basis, meaning that some commercial provider must come in to provide the services. The same restriction is in place in Washington, where the Public Utility Districts (rural electric companies) must also operate on a wholesale basis. As it turns out, it is very difficult, and maybe impossible over the long run, to make money with a wholesale network. This was the issue faced by Provo, which finally gave up and sold its network to Google. And the problems faced by Utopia in Utah are well known.

Finally, there are states like Louisiana that create extra hurdles for a municipal provider. These restrictions are generally couched in language that creates a ‘level playing field’. That sounds good on paper, but the municipal provider ends up being regulated and having to comply with more rules than the incumbent. An example of this is the Fair Competition Act in Louisiana, which places a lot of requirements on Lafayette, the only municipal provider there so far. But since that law was written, AT&T has been effectively deregulated in the state, leaving Lafayette with significant and expensive regulation.

What is interesting is that these bans are sponsored by the large incumbents like AT&T and Comcast. But for the most part the Cities that have decided to build fiber are small, rural and in places that the incumbents wrote off years ago. No large NFL City has ever given really serious consideration to building fiber, and only two Tier 3 Cities, Chattanooga and Lafayette, have done so. Most of the places that want to get into the fiber business are towns that realize that no commercial company is going to make an investment in their community.

The cities that want fiber see that their kids leave town because there are no decent jobs in the local economy. With poor broadband the businesses that are there have trouble competing today, and many of them will eventually relocate to where there is fiber. Most towns that decide to consider fiber feel they have been pushed into that decision. They generally have asked the incumbent providers to make the investments, but those big companies are not investing in small-town America. In fact, just the opposite is happening: AT&T has told the FCC that it would like to disconnect millions of rural landlines.

I don’t know a town that built fiber that didn’t do it reluctantly. But I think everybody is finally coming to understand that fiber has become basic infrastructure. It’s as essential to the well-being of a community as streets and sewers. Places without broadband are going to fade away over time and become irrelevant in economic terms. And towns get this. They want fiber as a way to make sure that their community is still here and still relevant twenty and thirty years from now.

It just seems incredibly selfish and greedy for the incumbents to work so hard to ban small towns from building fiber, when they themselves will never make any investment in those communities. I guess it is just in the nature of large corporations to fight against any competition that can be legislated away. They want to milk the last dollars out of their aging copper plant before they cut it down one day and leave these communities stranded.

The Return of Active Ethernet

Recently I have been seeing new fiber construction for residential service favoring Active Optical Networks (AON) over Passive Optical Networks (PON). This reverses a decade-long trend where PON had clearly won in the US market. The reemergence of AON has been sparked by Google and others who have pushed the industry to be able to offer gigabit connections to homes.

When fiber was first built to residential neighborhoods both technologies had some success in the US market. But once Verizon chose PON to build its FiOS network, other builders took advantage of the price reductions driven by Verizon’s large deployment and very few new active networks were built. Around the world the choice of technology varies. Europe has also largely chosen PON technology due to the success there of Alcatel-Lucent. But South Korea went with active Ethernet due to the large number of apartment buildings in its networks, where AON is a better solution.

AON technology has never disappeared in the US and it’s the technology of choice when building fiber networks to serve business districts and business parks. Metro Ethernet is one form of AON technology. In fact, most fiber networks have a mix of technologies, and many networks use PON to serve residences and small businesses but use active connections to serve large data customers like schools and large businesses.

There are pros and cons to both technologies when building in residential neighborhoods. But the two most distinctive characteristics are the way that bandwidth is delivered and the configuration of the physical fiber network. Active Ethernet requires a dedicated fiber for each customer. PON can put up to 32 customers onto a single fiber, although it’s generally some smaller number in actual field deployment. This means that the physical fiber bundles in the network have to be significantly larger in an AON deployment. This won’t make much of a difference when deploying in small towns or rural areas. But in a densely-packed urban area the extra fiber pairs required by AON can cause some concerns, particularly for fiber management where fiber connections are aggregated at headends and hubs.
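
To make the fiber-count difference concrete, here is a minimal Python sketch (the neighborhood sizes and the 32-way split are illustrative assumptions, not figures from any particular network) that compares how many feeder fibers each architecture needs to reach a given number of homes:

import math

def feeder_fibers_aon(homes):
    # Active Ethernet: every home gets its own dedicated fiber back to the hub.
    return homes

def feeder_fibers_pon(homes, split_ratio=32):
    # PON: homes share a feeder fiber through a passive splitter.
    # A 32-way split is the GPON maximum; field deployments often use fewer.
    return math.ceil(homes / split_ratio)

for homes in (300, 5000, 50000):   # hypothetical small town, large town, urban area
    print(f"{homes:6} homes: AON needs {feeder_fibers_aon(homes):6} feeder fibers, "
          f"PON needs {feeder_fibers_pon(homes):5}")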

Bandwidth is handled very differently with the two technologies. With Active Ethernet each customer gets whatever bandwidth the network provider supplies. Active networks routinely deliver 1 gigabit to 10 gigabits. Larger connections are also possible but can get significantly costlier. A PON network shares bandwidth between customers. The most widely deployed PON technology in the country is GPON, which delivers a gigabit path dedicated to cable TV and a separate Ethernet data path of 2.4 gigabits to share among the customers on a given PON. That is sufficient bandwidth to give everybody a 100 Mbps connection, but it’s not enough bandwidth if you want to guarantee a gigabit connection. There is a newer PON today that can deliver 10 gigabits to each PON, but many network operators still see that as inferior to a direct gigabit connection when shooting for gigabit delivery.
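
As a rough sanity check on those numbers, the short sketch below (assuming the 2.4 gigabits of shared GPON data capacity mentioned above and a few illustrative split sizes) computes the guaranteed per-customer share and compares it to a dedicated gigabit Active Ethernet port:

GPON_SHARED_GBPS = 2.4   # downstream data capacity shared across one PON

def guaranteed_mbps(split, shared_gbps=GPON_SHARED_GBPS):
    # Worst-case bandwidth per customer if everyone on the PON transmits at once.
    return shared_gbps * 1000 / split

for split in (32, 16, 8):   # illustrative split ratios
    print(f"1:{split} split -> {guaranteed_mbps(split):5.0f} Mbps guaranteed per home")

print("Active Ethernet -> 1000 Mbps dedicated per home (1 Gbps port)")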

The other trade-off is cost and there are cases where each technology can cost less. The customer electronics generally cost more with PON. A PON network of any size requires placing huts in neighborhoods used to terminate the feeder fibers and electronics. But an active network has a greater cost for fiber since there are more fiber pairs in each bundle. But in many cases these two costs somewhat cancel out and we have seen small town deployments that price out almost identically under the two different technologies.

There was a time five years ago when anybody building a residential fiber network could not even consider active Ethernet. There were very few vendors active in the space that supported US markets. PON had won the US marketplace, and when building a network one simply decided between the various PON vendors. But today it’s all up for grabs again and anybody building a new fiber network needs to give strong consideration to both technologies.

Net Neutrality Enters the Twilight Zone

In the telecom world we are not very used to our issues getting a lot of notice from the public. But it’s obvious that net neutrality has become a political issue as much as it is an industry issue. Compared to the normal way we do business as an industry, the debate has entered the twilight zone. This all got started when new FCC Chairman Tom Wheeler said that he was proposing new rules that would allow for the creation of an Internet ‘fast lane’. By that he meant that the FCC is going to allow the large ISPs to charge large content providers for premium access to their networks.

Of course, Chairman Wheeler is not himself neutral in this decision having spent years as the head lobbyist for the cable industry and opposing net neutrality. It’s somewhat ironic that he made this new announcement at the annual cable show with his cable company peers. The headlines that day made it sound like the FCC was going to take a legitimate shot at maintaining net neutrality, but within days it became understood that the fast lane idea was just the opposite and that he was handing the cable companies exactly what they wanted.

What I don’t think Wheeler expected was that the public would jump all over his idea. And so, before the proposal was even released, Internet companies like Google and NetFlix weighed in against it, as did a huge number of consumer groups and individual citizens.

And so, quite unexpectedly, the Chairman announced yesterday that he is changing the proposed rule, one that hasn’t even been released yet. He said that the revised rules would allow ISPs to charge companies like NetFlix and Amazon for faster access to customers, but that non-paying companies would not be put into the slow lane. This makes no sense and is political double-speak. From a network engineering perspective you either give priority to bits or you don’t. If some companies get priority routing, then all other traffic gets degraded. That is the only way it can work on a network, and no amount of regulatory talk can change the way that bits operate.

The idea gets even more bizarre if you think it through. What happens if 20 companies pay Comcast for priority access? Does the one who pays the most get slightly more priority than number two, and so on? The fact is that networks can’t do that. Bits are either prioritized or they are not, and so if a lot of companies pay for priority access we end up back where we are today for those companies, while the rest of the Internet would get degraded service.
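
A small thought experiment in code makes the point. The sketch below (purely hypothetical traffic volumes, not measurements) models a congested link with a strict-priority scheduler: when only a few senders have paid for priority their traffic sails through and everyone else backs up, and when most senders have paid, the paid class is right back to sharing a congested pipe while the remainder is starved:

def simulate(link_capacity, ticks, paid_arrivals, other_arrivals):
    # Strict-priority scheduler: paid traffic is always served first each tick.
    paid_q = other_q = 0
    paid_sent = other_sent = 0
    for _ in range(ticks):
        paid_q += paid_arrivals
        other_q += other_arrivals
        capacity = link_capacity
        sent = min(paid_q, capacity)    # paid traffic served first
        paid_q -= sent
        paid_sent += sent
        capacity -= sent
        sent = min(other_q, capacity)   # leftovers go to everyone else
        other_q -= sent
        other_sent += sent
    return paid_sent, other_sent, other_q

# Hypothetical congested link: 100 units/tick of capacity, 120 units/tick of demand.
for paid_share in (0.2, 0.8):   # 20% vs 80% of the demand has paid for priority
    paid, other, backlog = simulate(100, 1000,
                                    round(120 * paid_share),
                                    round(120 * (1 - paid_share)))
    print(f"paid share {paid_share:.0%}: delivered {paid} paid units, "
          f"{other} other units, {backlog} other units still queued")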

One thing that pushes this into the Twilight Zone is that Rasmussen did a push poll on the topic and concluded that only 21% of Americans are in favor of net neutrality. Push polls are generally only used for hot-button political topics where somebody wants to prove the opposite of what’s true. In this case, the main question of the poll was, “Should the FCC regulate the Internet like it does radio and television?” None of the questions asked had anything to do with net neutrality and instead were designed to elicit a specific negative response. Obviously there are dozens of better ways to have asked the public about net neutrality, including actually asking about it.

I have not conducted a poll, but I traveled all last week and in conversation I asked a number of people what they thought about the idea that the ISPs could give some companies priority access, which implies that others would get something less. Nobody thought that was a good idea and the general consensus was to leave things working the way they are. I believe there will be a huge amount of public discontent should the ISPs be allowed to break the Internet.

I don’t think Chairman Wheeler has any comprehension of how important the Internet is to most people. He is flirting with making a huge blunder if he allows the Internet to get screwed up. He is making himself the public face of how the Internet functions, and if he breaks it people will blame him personally. He has the chance to become the next infamous political appointee to get compared to Michael Brown, who was running FEMA during Hurricane Katrina. But perhaps he won’t mind being vilified, since he is handing the cable companies a billion-dollar opportunity to charge more to Internet companies.

The Ethics of Big Data

Back in 2010 Eric Schmidt, the CEO of Google at the time, said something that really frames the issues associated with big data. He said, “One day we had a conversation where we figured we could use [Google’s data] to predict the stock market. And then we decided it was illegal. So we stopped doing that.”

Google doesn’t say this, but when you look at all of their data-gathering efforts it is clear that their ultimate goal is to create a database of all human knowledge. They are far away from that today, but they have already amassed the largest database in the history of the human race. Google is not the only one tackling this task. For example, Facebook has mapped out the social connections between more than a billion people. Wikipedia has undertaken something much smaller but has accumulated over 4.5 million factual articles in English. The nerd’s favorite web site WolframAlpha has accumulated an amazing number of facts about the world and can display them in easy-to-understand presentations. Both Google Maps and the OpenStreetMap database are trying to create a database of our physical world.

Google has the scariest data about each of us because they know what we are thinking about and looking for through the Google search engine. And when they pair this up with other web data that identifies each of us, they know what we are doing individually, but also what we are doing collectively. It’s said that Google now has the ability to predict many things about you due to the profile they have built on you. As an example, they know who’s an insomniac and what behavior insomniacs engage in, and so they probably understand insomniacs better, at a macro level, than anybody else in the world.

This is not to say that Google is analyzing their data in that specific way, but they could be. And certainly they are making their data available for sale to other large companies who do want to know that kind of thing about us. Perhaps there is not yet a company who wants to market to insomniacs (but there probably is), but there are certainly companies who want to pinpoint their marketing to the most likely people to respond.

If you don’t think that big data companies are watching you, spend fifteen minutes looking at new cars on the Internet and then watch how many new car ads pop up in your web experience over the next week. At the marketing level big data is already manifesting itself. Marketing is only the beginning, but it’s the application that is making Google so wealthy today.

One can only begin to imagine the possibilities for Google and others to profit from the data they are going to be gathering from the upcoming Internet of Things. That data will include a lot of detail about our personal lives, from mundane things like when we turn lights on and off to very personal things gathered from medical monitors. It’s all relevant, tells them a little more about us, and lets them categorize us. Because in the end they want to profile each of us in great detail so that they can sell our data to those who are most interested in people just like us.

The question about whether this is good or bad is going to depend upon how they use this data. If they will sell this data to anybody willing to pay the price, then it’s bad, because not everybody is going to do good things with the data. There is already talk of companies using big data to prey upon the most vulnerable among us. It’s a well-known fact that the poorest among us pay the most for mundane things like cashing checks or getting a car loan, and with big data companies can pinpoint advertising to the most vulnerable of us. It’s certainly also possible for big data to be sold to companies that will use it overtly to do us harm. For instance, it’s not hard to envision a group of private investigators using personal data about us in all sorts of unsavory ways.

And probably most scary to me is if the government or the press has access to this data. I’ve heard the old axiom that nobody’s life can survive total scrutiny and that we all have things we would like to keep private. If the government and the press have access to big data, everybody can be made to seem guilty of something. This is the premise of ‘1984’ and many other science fiction books. We are getting very close to the day when that is no longer fiction.

The Death of the Browser

I wrote yesterday a bit about the evolution of the devices that we use, but today I want to talk about a much more substantive change that is happening. There is a generational shift in the way people use the Internet and people under 30 years of age are using the Internet in a fundamentally different way than older people. This is starting to manifest itself in the services that are available over the Internet and we are reaching a point where it is going to affect what is available to everybody.

I’ve written before about how differently young people today use video. They rarely watch video in a linear fashion from the beginning to the end of a show. They would rather look at a highlight of a movie on YouTube than actually watch the movie. They generally are multitasking when watching anything and they don’t give video their undivided attention. One of the most popular ways for them to watch video is the six-second clips on Vine.

The same fundamental differences are also there in the way that younger people use the Internet. People of this younger generation have been raised on smartphones. And from that experience they predominantly prefer smartphones over PCs and tablets. You can see it in everything they do. They hate email and rarely use it. They instead text or chat with others directly. They don’t like sites like Facebook because the communications are too linear for them, and instead use sites like Reddit, Imgur, 4chan and 9gag, which are more akin to the way they communicate.

One of the biggest differences is that young people don’t use browsers and don’t even much like PCs or tablets. When people over 30, like me, think about the Internet we are really thinking about the browser experience. That means using programs like Internet Explorer, Google Chrome or Firefox. That is how we navigate the web, find information and communicate. We all learned the Internet with AOL or a similar browser and we still use the Internet in pretty much that same way. We generally drive our web experience with a browser and an email reader. We use Google to search for websites we are interested in. The way we use the web is very linear and based upon reading web sites, playing games, and reading email.

But younger people prefer the smartphone over the PC or tablet. To them computers are what they are forced to use for schoolwork, but the smartphone is where they do everything else. They assemble a pile of individual apps, each one to do a specific task. They are quick to swap out any one of these apps when something better comes along. This is backed up by surveys. Pew Research has shown that 74% of teenagers use the Internet from their cellphones and 55% only go online from a cellphone.

This trend is having a big influence on what is being developed for the web. It’s projected this year that web hits from cellphones will surpass hits from PCs and tablets, and so the Internet is flipping from PC-based to smartphone-based. We’ve already seen this in the marketplace where there are now major applications like WhatsApp that don’t even have a desktop equivalent. We also see PC staples like Facebook now being released as a series of apps rather than as a unified platform.

As more and more development is done for apps rather than for PCs, users of PCs are going to start falling out of the mainstream. And I get this, to some degree. I get my news from Flipboard, an application on my smartphone, because it beats anything I have found on the PC. Most older users have a few apps they like, but as a group they mostly still use browsers. But as more and more new things are developed only for smartphones, older users will be lured more and more towards apps. In not too many years new development for the browser is going to die, and a few years after that the browser itself will probably die.

I am not much enticed by the things that kids like today and I would rather be shot than spend an afternoon watching Vine. But when it comes down to being productive and actually getting work done, I turn to my PC or laptop. I am a PC man through and through, and if it comes down to it, you’ll have to pry my PC from my cold dead hands!

The Next Devices We Will Use

I’ve always thought that it’s hard to predict the future of how technology will affect the average person because it’s very hard to predict the kinds of devices that most people will use in the future. There certainly were not many people thirty years ago predicting how ubiquitous smartphones would be today. Nothing defines our technology culture more than the devices that most of us use each day. We are now in an era where we adopt and then abandon new devices in short order. For example, look at the relatively short life-span of the iPod. I remember a day not too many years ago when everybody on a plane had one and now I rarely see them when traveling. I’ve run across several new devices that have the potential to become widely used.

Hearables

Hearables are a new class of wearable wireless devices that will sit in your ear. Think of them as earbuds without the wires. But they won’t just be for talking on the phone or listening to music. They could become your connection to the Internet as you move through your day, always connected live without the distraction of having to look at a smartphone screen.

Hearables seem a lot more practical than smart watches. With a hearable you will always be connected no matter what you are doing. It’s hard to imagine any practical use for a smart watch while driving a car. The simplest application of hearables will be a new interface to your smart phone. But the real potential is to tie them in with your own personal assistant so that they are with you all the time.

To see a first-generation device that will be hitting the market soon, look at the Dash by Bragi, a German firm. It’s hard to know how hearables might change our lives, but they will. For instance, you might be walking past a restaurant and check the specials of the day. You will always be able to check facts by talking to your personal assistant, and so our future might consist of a lot of people walking around mumbling to themselves. Certainly the personal assistants we use are going to have to improve, but everything I read says that they will be doing so soon. There are some experts predicting that hearables will be big business by 2018.

Finally, A Real Tricorder

Even more exciting to a science geek like me is to see the first device that could legitimately be called a tricorder. The device is a tiny molecular sensor called Scio, built by Consumer Physics. This device will be hitting the market later this year and is the first personal device that will be able to detect and identify any substance except metals. The uses for it are almost unlimited. A dieter can use the device to count the calories in something before they eat it. Somebody who is allergic to shrimp could check every restaurant meal for shrimp residue before they eat it.

Consumer Physics will be building a substance database in the cloud, and so after a user of the device has sampled something into the database, every user of Scio will be able to identify that substance. The device could be used to test for chemical residues in food. It can search for pollution in soil. Or it could be used by a little old lady in the grocery store to find the most perfectly ripe pear as you patiently wait behind her. You should be able to check if the meat or milk you are about to eat is still good, or if the pill you are about to take is what it is supposed to be.

I’ve wanted a tricorder since watching the original Star Trek (I think I just dated myself) and this device can put one into everybody’s hands. You do not want to be with me the first month after I get one of these because I will be testing everything. There will definitely be a test to see if it can tell the difference between pale ale and a hoppy IPA. After all, you don’t want to consume the wrong beer!

Can Open Access Work?

Today I am meeting with the Public Utility Districts (PUDs) in Washington State and they have been gracious enough to invite me to be the keynote speaker at their convention. These are the rural electric companies that serve much of the state outside of the major cities.

Whenever there is any listing of the fastest Internet speeds in the country the areas served by some of these PUDs show up among the fastest places because many of the PUDs have invested heavily in fiber. But they have a unique business plan because there is a legal restriction in the state that prohibits PUDs from being in the retail telecom business. This has forced them into operating open access networks where they build the fiber network and let other companies provide services.

No two of the PUDs have gone about this wholesale business in exactly the same way, and so together they provide multiple experiments on ways to operate a wholesale open access network. I know several of the PUDs well and they have one universal problem: no large, well-financed service provider has agreed to offer service on their networks. No big cable companies or telcos or anybody you ever heard of wants to serve the many customers on these fiber networks. There are a handful of connections sold to companies that serve large businesses, like Zayo and Sprint, but no big company wants to serve smaller customers.

What is lacking is vigorous competition on their networks from multiple companies willing to serve residents and small businesses. And that is what open access is supposed to bring. Instead, most of the retail service on these networks is provided by local ISPs who took advantage of the opportunity to reach more customers. In many cases the local ISPs were so small and undercapitalized that the PUDs had to assist them to expand onto their networks.

There are not many other open access networks in the US. One of the largest was in Provo, which struggled with the model and eventually sold its network to Google. I was privy to Provo’s books and they could not find a business plan that would make the network cash flow. But if we look outside the US there is another great example of how open access can work if done right. Europe has a number of cities that have built fiber networks and invited ISPs and others to serve customers. In Europe this has been a big success because numerous service providers show up to provide service. Some of these providers were the former state monopoly companies that were unleashed to compete after the formation of the European Union. But there are also new competitors there akin to our CLECs and ISPs.

The big difference between the US and Europe is that here none of the incumbent competitors are willing to operate on somebody else’s network. I can’t think of one example in the US of a large cable company competing against another one. And there is very little competition between the big telcos other than some fierce competition for giant government and business accounts. Here in the US the PUDs have only been able to attract small local ISPs to operate on their networks. For the most part these ISPs do a good job, but they are small and have the problems that all small telecom companies have.

Many of the PUDs are in the uncomfortable position of only having one real service provider on their network. Should the owner of that business die or just go out of business, a PUD could see most of its network go dark and all of the residents and businesses in their towns lose their fast Internet.

Anybody who understands telco finances instantly understands why this model is so hard to make work. A company must spend a lot of money to build a fiber network and then can charge only relatively small fees to others that use it. A typical revenue for wholesale access to a fiber network is in the range of $30 per customer per month and that is really not enough revenue to pay for building and operating the fiber network. By comparison, most triple play providers have an average revenue per customer north of $130 per customer per month.
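
A back-of-the-envelope comparison shows how thin the wholesale margin is. In the sketch below the $30 and $130 monthly figures come from the paragraph above, while the per-home construction cost and the operating margins are purely hypothetical placeholders used only for illustration:

COST_PER_HOME = 4000        # hypothetical all-in construction cost per connected home
WHOLESALE_PER_MONTH = 30    # typical wholesale access fee (from the text above)
RETAIL_PER_MONTH = 130      # typical triple-play retail revenue (from the text above)

def years_to_recover(cost, monthly_revenue, margin):
    # Years of revenue needed to recover construction, ignoring interest and churn.
    return cost / (monthly_revenue * margin * 12)

# Assumed margins left over after operating costs -- illustrative only.
print(f"Wholesale: {years_to_recover(COST_PER_HOME, WHOLESALE_PER_MONTH, 0.5):.0f} years to recover construction")
print(f"Retail:    {years_to_recover(COST_PER_HOME, RETAIL_PER_MONTH, 0.33):.0f} years to recover construction")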

The PUDs built the fiber because they are in rural areas where nobody else was going to do it. Their communities have already benefitted tremendously from the fiber. But they have their work cut out to keep this going, and I am sure they will figure out a way to do so.

A Solution for Net Neutrality?

Today Mozilla filed comments with the FCC with a clever solution that would fix the net neutrality fiasco. Attached is the Mozilla filing. I call the solution clever, because if the FCC wants to solve net neutrality Mozilla has shown them a path to do so.

Mozilla has asked to split Internet traffic into two parts. First is the traffic between ISPs and end-user customers. Mozilla is suggesting that this part of the business can remain under the current regulatory rules. The second portion is the traffic between ISPs like Comcast and AT&T and content providers like Facebook, NetFlix, etc. Mozilla recommends that the FCC reclassify this traffic as transport under Title II of the Communications Act.

The current dilemma we are facing with net neutrality is that the FCC lacked the courage to classify the Internet network as a common carrier business. Instead, in 2002, when broadband was growing explosively, the FCC classified all Internet traffic as an information service. And that decision is why we are even having the debate today about net neutrality. If the FCC had originally decided to regulate the Internet, then it would have full authority to enforce the net neutrality rules it passed a few years ago.

But even in 2002 the FCC was a bit cowed by the political pressure put on it by lobbyists. The argument at the time was that the FCC needed to keep hands off the burgeoning Internet so as not to restrict its growth. It’s hard for me to see how classifying the Internet business as a common carrier business would have changed the growth of the Internet, and I believe it all boiled down to the fact that the cable companies did not want to be further regulated by the FCC.

The net neutrality rules written a few years ago by the FCC basically say that ISPs have an obligation to deliver all packets on the Internet without discrimination. Mozilla is suggesting that there is an additional legal obligation between ISPs and content providers to deliver their traffic without discrimination.

This argument might seem a bit obscure to somebody not in the industry, but it removes the dilemma of not being able to regulate the traffic between ISPs and content providers. The suggested change is to not classify data packets at the carrier level as information services, but to recognize them by their normal network function: transporting data from one place to another. Today transport is regulated in the sense that if a carrier sells a data pipe of a certain amount of bandwidth to another carrier, it is obligated to deliver the bandwidth it has charged for. By putting the gigantic data pipes that extend between companies like NetFlix and Comcast under the transport regime, the FCC would treat Internet traffic like any other data pipe.

This change makes a lot of sense from a network perspective. After all, it’s hard to think of the transaction where NetFlix hands a huge data pipe to Comcast or AT&T as an information service. Comcast is doing no more than taking the data on that pipe and moving it where it is supposed to go. That is the pure definition of transport. It only becomes an information service on the last mile of the network where the data traffic is handed off to end-user customers. There are already millions of other data circuits today that are regulated under the transport rules. It makes logical sense to say that a 10 gigabit Internet circuit is basically the same, at the carrier level, as a 10 gigabit circuit carrying voice or corporate data. Data pipes are data pipes. We don’t peer into other data pipes to see what kind of traffic they are carrying. But by classifying the Internet as an information service, that is exactly what we do with those circuits.

This idea gives the FCC an out if they really want net neutrality to work. I personally think that Chairman Wheeler is thrilled to death to see net neutrality being picked apart since he spent years lobbying against it before taking the job. So I am going to guess that the Mozilla suggestion will be ignored and ISPs will be allowed to discriminate among carriers, for pay. I hope he proves me wrong, but if he ignores this suggestion then we know he was only paying lip service to net neutrality.

Will We Be Seeing Real Artificial Intelligence?

I have always been a science fiction fan and I found the controversy surrounding the new movie Transcendence to be interesting. It’s a typical Hollywood melodrama in which Johnny Depp plays a scientist who is investigating artificial intelligence. After he is shot by anti-science terrorists his wife decides to upload his dying brain into their mainframe. As man and machine merge they reach the moment that AI scientists call the singularity, when a machine becomes aware. And with typical Hollywood gusto this first artificial intelligence goes on to threaten the world.

The release of this movie got scientists talking about AI. Stephen Hawking and other physicists wrote an article for The Independent after seeing the movie. They caution that while developing AI would be the largest achievement of mankind, it also could be our last. The fear is that a truly aware computer will not be human and that it will pursue its own agenda over time. An AI will have the ability to be far smarter than mankind and yet contain no human ethics or morality.

This has been a recurrent theme in science fiction, starting with Robby the Robot up through HAL in 2001, Blade Runner and The Terminator. But when Hawking issues a warning about AI one has to ask if this is moving out of the realm of science fiction into science reality.

Certainly we have some very rudimentary forms of AI today. We have Apple’s Siri and Microsoft’s Cortana that help us find a restaurant or schedule a phone call. We have IBM’s Deep Blue that beat the best chess players in the world, and Watson, which won at Jeopardy and is now making medical diagnoses. And these are just the beginning; numerous scientists are working on the next breakthroughs in machine intelligence that will help mankind. For example, a lot of the research into how to understand big data is based upon huge computational power coupled with some way to make sense out of what the data tells us. But not all AI research leads to good things, and it’s disconcerting to see that the military is looking into building self-aware missiles and bombs that can seek out their targets.

One scientist I have always admired is Douglas Hofstadter, the author of Gödel, Escher, Bach: An Eternal Golden Braid, which won the Pulitzer Prize in 1980. It’s a book I love and one that people call the bible of artificial intelligence. It’s a combination of exercises in computing, cognitive science, neuroscience and psychology, and it inspired a lot of scientists to enter the AI world. Hofstadter says that Siri and Deep Blue are just parlor games that overpower problems with sheer computational power. He doesn’t think these kinds of endeavors are going to lead to AI, and he believes we won’t get there until we learn more about how we think and what it means to be aware.

With that said, most leading scientists in the field are predicting the singularity anywhere from 20 to 40 years from now. And just about everybody is sure that it will happen by the end of this century. Hawking is right that this will be the biggest event in human history to date; we will have created another intelligence. Nobody knows what that means, but it’s easy to see how a machine intelligence could be dangerous to mankind. Such an intelligence could think circles around us and could compete with us for our resources. It would likely put most of us out of work since it would do most of the thinking for us.

And it will probably arise without warning. There are numerous paths being taken in AI research and one of them will probably hit pay dirt. Do we really want a smart Siri, one that is smarter than us? My answer to that question is, only if we can control it. However, there is a good chance that we won’t be able to control such a genie or ever put it back into its bottle. Add this to the things to worry about, I guess.

The Cost of Bond Financing

I have worked on fiber projects where the project has a choice to finance something through municipal bonds or through commercial loans. Such projects involve a government entity as well as a commercial partner. These public private partnerships are becoming more common as cities are looking for fiber and commercial companies are looking for help getting projects financed.

I have always told people that financing through municipal bonds is the most expensive kind of debt possible. People at first don’t believe this until I show them. After all, the interest rates on municipal bonds are generally a lot lower; in today’s market, and depending upon the rating of the bonds involved, you see 4% or 4.5% interest on bonds versus 7% to 10% interest on the equivalent commercial debt. And so people assume that municipal financing is a better deal.

In fact, around the country when the large incumbents try to pass laws that make it hard for municipalities to get into the fiber business, they generally list the ability to obtain municipal financing as one of the big advantages that municipalities have over commercial firms. However, as the following numbers show, this is not true. Consider a project that is going to build a $30 million fiber network. The project is also going to ask for $2.3 million in working cash to cover operating expenses. The following shows the financing using a revenue bond versus a commercial loan.

                        Revenue Bond     Commercial Loan
Assets to be Built       $30,000,000         $30,000,000
Fees                        $900,000            $100,000
DSRF                      $2,700,000                   -
Bond Insurance              $300,000                   -
Capitalized Interest      $6,500,000                   -
Construction Interest              -          $1,600,000
Working Capital           $2,300,000          $2,300,000
Total Loan               $42,700,000         $34,000,000
Interest Rate                   4.50%               7.00%
Term (years)                      30                  15
Annual Payment            $2,621,419          $3,733,017
Total Outlay             $78,642,566         $55,995,259

The first obvious difference is that you have to borrow a lot more money with a bond. Here are some of the reasons:
• Bonds require you to take the money in a lump sum and then pay interest on the full amount of borrowing during the time the project is being constructed. Further, bonds generally require the project to capitalize interest, that is, to borrow up front the money needed to make the first three years of bond payments. In contrast, a commercial loan generally uses construction financing, meaning you draw the money as needed and only pay interest on what you have borrowed.
• Revenue bonds generally require a Debt Service Reserve Fund (DSRF) which puts one year of debt payment into escrow as a hedge against the project having trouble making the bond payments.
• Bonds often also require bond insurance, which is a policy that will make a full annual payment to bond holders should the bonds default.
• Finally, there are huge fees associated with floating bonds. There are many attorneys involved as well as substantial payments to bond trading desks for selling the bonds.

In this example, the bond debt is $8.7 million higher than the equivalent commercial debt. Bonds typically have lower interest rates and longer terms than commercial debt, and in this example that means the annual payments are $1.1 million lower per year. But there is a penalty for financing anything over a long term (like your home mortgage): you pay a lot more out over the life of the loan. In this example, the total cash outlay is $22.6 million higher for the bond debt, which is roughly a 40% cash premium for using bonds.
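
The annual payments and total outlays in the table above fall out of the standard level-payment amortization formula. The short sketch below (a simple reproduction, using only the loan amounts, interest rates, and terms shown in the table) recomputes them:

def annual_payment(principal, rate, years):
    # Level-payment amortization: principal * rate / (1 - (1 + rate) ** -years)
    return principal * rate / (1 - (1 + rate) ** -years)

scenarios = {
    "Revenue Bond":    (42_700_000, 0.045, 30),
    "Commercial Loan": (34_000_000, 0.070, 15),
}

for name, (principal, rate, years) in scenarios.items():
    payment = annual_payment(principal, rate, years)
    print(f"{name}: annual payment ${payment:,.0f}, total outlay ${payment * years:,.0f}")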

Municipal entities generally use bonds for several reasons. First, bonds rarely require any equity and the borrower can borrow 100% of the cost of the project. But the main reason that municipalities use bonds is that they are comfortable with this kind of financing and they don’t know anything else.

The problem this causes is that everything that a government builds in this manner costs more than if a commercial entity built the same project. I said the above example was for a fiber network, but it could just as well have been for a water processing plant, a new high school, a new courthouse or any other municipal project.

We have an infrastructure crisis in this country, and all of our governments added together can borrow only a small percentage of the money needed to build and fix everything that is needed. So we need to abandon the bond model of financing a lot more often and start looking at public private partnerships as a way to get things done.