An Effective Federal Broadband Program, Part 3

This is the third in my series of blogs looking at the best way to administer a federal broadband construction program. Since there is talk of an infrastructure program that might include money for broadband, I hope that the folks at places like the NTIA are giving these issues some thought. The last time around the stimulus grants caught them and the whole industry by surprise. But this time, with some advance thought and planning, we can do better and get more bang from any federal dollars. After all, if there is a broadband program, it ought to have the number one goal of bringing broadband to as many people as possible. Following are some additional thoughts on structuring a federal program:

Consider Local Conditions More. The stimulus grants included a simplistic formula that offered different levels of grant funding to unserved and underserved communities. We need to get more sophisticated this time around and recognize that the cost of broadband networks has a lot more to do with terrain and density than with whether customers are served or unserved. There is a huge difference in the cost to reach an unserved customer in the open plains of the Midwest compared to Appalachia. And other local conditions, like the state of poles, can make a big difference in cost. The CAF II funding took a stab at these differences by using proxy cost models to try to reflect the relative cost to construct in different parts of the country. But even those models are too simplistic, and we can do better.

This also means that there should be no predetermined formula that determines the amount of matching funds available for any project. Sparsely populated areas might require more than a 50% federal match to make the numbers work. I know it's difficult not to be formulaic, but ideally each proposal for funding should be analyzed on its own and the appropriate funding award made according to the circumstances.

Be Open to Funding All Qualified Providers. The stimulus grants (particularly the ones awarded by the RUS) had a built-in bias toward giving the money to existing RUS borrowers. For broadband that basically means small telcos and some electric coops. If we want to get broadband to the most rural places, then anybody willing to step up to the plate with a good business plan and some experience needs to have an equal chance. This might mean ISPs, municipalities, cooperatives, cable companies or fiber overbuilders. There is angst among smaller carriers that any federal funding will go to the largest telcos and that smaller providers won't get an opportunity to try for the money, as happened with CAF II.

It Takes Time to Have Shovel-Ready Projects. At any given point in time there are not many shovel-ready projects positioned to take funding immediately. My fear is that any federal program is going to come with a built-in clock ticking and will try to give out the money in a relatively short amount of time, as was done with the stimulus grants. It can easily take a year to create a shovel-ready project, even for a community that is highly motivated. There are a lot of steps that must be completed before submitting a grant application. And if there is a requirement that the matching funding be in place in order to participate, then that time frame can easily be a lot longer. So my hope is that any program gives the industry enough time to get ready. If the funds are going to be awarded within a year, it's going to be a disaster and a lot of bad projects will get funded just because they were able to scrape together a funding request quickly. This can be successful if broadband money is awarded over a two- to four-year period rather than all at once. The longer the time frame, the better the proposed projects will be.

Don't Break the System. There are a limited number of firms available to help put together business plans and make engineering estimates. If a federal program tries to give out a lot of money too quickly, there are not enough qualified engineers and financial consultants available to get the work done – and it's not easy for these firms to staff up with people who have the necessary existing knowledge. We also saw shortages of fiber cable and electronics right after the stimulus plan. All segments of the industry are staffed and geared to an anticipated level of demand, and it's hard for the whole industry to pivot and react quickly to a massive new demand for services and components.

Make the Grant Forms Understandable. I have been doing telecom accounting since the 1970s and there were things on the stimulus grant forms that I didn't understand. Bring in a panel of industry experts early to make sure that the forms used to ask for money are written in a way that the industry understands. A format that asks for financial input in the manner that the industry keeps its books will provide a lot more consistency between grant requests.

Control of the Internet

If you follow presidential politics you may have heard a few candidates claim that the US government is giving away control of the Internet. This one puzzled me, and it turns out they are talking about the transition of control of the DNS function from the US government to a non-profit international body. This is something that has been in the works for decades.

The issue involves DNS, or the Domain Name System. This is the system that matches the name of a web site with an IP address. It allows you to go to the amazon.com website by typing the name "amazon.com" into your browser instead of having to know the numerical IP address for Amazon.
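As a rough illustration of that lookup (my own sketch, not something from the original post), here is a minimal example that asks the local DNS resolver for the address behind a name, which is the same step a browser performs before it can connect:

```python
# Minimal sketch of a DNS lookup: turn a human-readable name into the
# numeric IP address that routers actually use. The printed address will
# vary depending on which server the resolver returns.
import socket

hostname = "amazon.com"                       # the name you type into a browser
ip_address = socket.gethostbyname(hostname)   # ask the DNS resolver
print(f"{hostname} resolves to {ip_address}")
```

Without that resolution step you would have to know and type the numeric address yourself.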

DNS is essential to ISPs because it tells them how to route a given request on the web. There is one master file of all worldwide web names and the associated IP addresses. And obviously somebody has to be in charge of that directory to add, delete and make changes to web names and IP addresses.

After the early days of the Internet this function went to a group called IANA, the Internet Assigned Numbers Authority. This group was run by a small number of staffers and academics, with help from some of the early web companies – all techies who only wanted to make sure that the burgeoning web worked well. And although it rarely exerted any control, the NTIA (National Telecommunications and Information Administration), a part of the Department of Commerce, loosely oversaw the group and held veto power over anything done by IANA.

This power was rarely exercised, but there were many around the world who were uncomfortable with the US government being in charge of a vital web function. There was a push for an international group to take over the DNS function, and in 1998 the function was transferred to ICANN, the Internet Corporation for Assigned Names and Numbers. ICANN brought in Board members from around the world, and since then the group has effectively operated by international consensus. But the NTIA still maintained veto power over the group's decisions.

But a transition to a fully international ICANN with no ties to the US government has been planned since the group was founded, and on October 1 that transition took place: ICANN is now run solely by an international Board, without oversight from the US government.

Just a few weeks before the planned transfer, four states sued in US District Court in Texas to stop it. Their argument was that the directory of IP names and addresses belonged to the US and could not be given away without approval from Congress.

The opponents of the suit argued that not turning over control of ICANN was a much bigger threat, because it might lead to other countries developing their own DNS databases – and the ability of anybody in the world to reach any web address using the same nomenclature is vital to the concept of an open and free Internet. Interestingly, it was this same concept a century ago – that anybody with a telephone ought to be able to call any other telephone number in the world – that was a driving principle in creating an efficient worldwide telephone network.

The suit was processed quickly and the judge came down on the side of the open Internet and the transition to ICANN. In the end this fight was more about politics than anything substantial. The DNS database is nothing more than the equivalent of a gigantic white pages listing of every address on the Internet. All that really matters is that this database be kept up to date and be available to the whole world. ICANN has had an international board of techies since 1998 and this transition was planned long ago. So there is no threat of the US losing control of the Internet. Folks who saw the headlines can sleep well knowing that this issue was about politics and not about a real threat.

Government and the Digital Divide

There were two interesting announcements from politicians in the last week concerning the digital divide. First, there was an announcement from President Obama saying that he wants to connect 20 million more Americans to broadband by 2020. Then Greg Abbott, the governor of Texas, announced that he wants to connect all of the 5.2 million schoolchildren in Texas to the Internet by 2018.

President Obama’s announcement was accompanied by a plan called ConnectALL. The plan was prompted in part by a recent study that shows that households making less than $25,000 per year are half as likely to have broadband as households that make more. The plan makes a number of specific proposals for things the federal government can do to increase broadband penetration rates:

  • The primary tool proposed is to revise the Lifeline program that subsidizes telephone service for low-income households and redirect the $1.2 billion spent annually on that program to subsidize broadband connections instead. This is something that is already underway at the FCC, and the proposed rules on how it might work are expected out later this year.
  • The plan also includes an initiative to improve digital literacy skills. It would engage a number of volunteer and non-profit organizations to make this a priority, including AmeriCorps volunteers as well as organizations like the Corporation for National and Community Service and the Institute of Museum and Library Services. The plan would also promote more computer skill training at two-year colleges.
  • The plan would promote the reuse of computers and similar equipment no longer needed by the federal government.
  • The plan would direct the NTIA to get more involved in supporting community broadband planning, and would bring in a number of non-profits and philanthropic groups to help with this effort.
  • The plan also calls for ISPs to offer more low-priced products for low-income households.

The Texas governor has not yet released any details of how he might go about connecting all schoolchildren to broadband in such a short period of time. The only solution I can imagine happening that quickly would be some sort of cellular plan just for kids to get connected to school servers. 2018 is practically right around the corner in terms of solving broadband issues.

These kinds of announcements always sound great. Certainly both politicians have identified real issues. It's becoming quite clear that poor households are increasingly finding broadband unaffordable. But one has to ask how much success the federal plan might really have. Certainly subsidizing Internet connectivity for low-income households will bring some new households onto the Internet. But you need to ask how much of an incentive $10 per month is for a home that can't afford broadband today.

Certainly the $1.2 billion per year in Lifeline funding can reach 20 million people – that amount will provide cheaper broadband to 10 million homes. But you would have to think that a lot of those homes are already receiving this same subsidy today for their home phone, and when a household swaps a phone subsidy for a broadband subsidy they are no better off in terms of total telecom spending. They will just have swapped a $10 per month discount from one bill to another.
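For readers who want to check that arithmetic, here is a quick back-of-envelope sketch (the $10 monthly subsidy is the figure discussed above; the two-people-per-household number is simply what is implied by pairing 10 million homes with 20 million people):

```python
# Back-of-envelope check of the Lifeline math described above.
annual_fund = 1_200_000_000     # $1.2 billion per year
monthly_subsidy = 10            # $10 per household per month

households = annual_fund / (monthly_subsidy * 12)
people = households * 2         # the 20 million figure implies roughly 2 people per home

print(f"Households covered: {households:,.0f}")   # 10,000,000
print(f"People reached:     {people:,.0f}")       # 20,000,000
```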

And all of the other proposed solutions sound wonderful on paper – but will they work to get more people on the Internet? I know that computer literacy training can work well if done right. I have one client who has been holding training sessions for customers for well over a decade, and over the years they have brought a lot of elderly residents in their community onto the Internet. But they say that it takes a major time commitment for each potential customer and a concentrated effort for this to work – they will often work with a given customer for many months before that person is comfortable enough to buy Internet at their home.

And none of the federal ideas really fix the underlying problem of affordability. The Lifeline program will reduce the price of broadband by $10 per month, but in homes that are surviving on jobs that pay $12 per hour or less, broadband at any price is hard to afford. I certainly don't have an answer to this problem, but there are other ideas that I think ought to be considered as well. For example, $1.2 billion per year could supply a lot of broadband by building a huge number of neighborhood WiFi transmitters that could bring cheap or free Internet to many homes at the same time. I've always thought that the cities looking to provide free WiFi broadband are on the right track, because that brings broadband to the neediest households without the paperwork and expense that comes with subsidy programs.

The last item on the list above has the most promise. A lot of good could come from pushing the major ISPs to offer a $10 or $20 broadband alternative. But this was forced onto Comcast a number of years ago, and they largely shirked the responsibility and provided low-priced broadband to very few homes.

I’ve been skeptical for years that the Lifeline program makes a lot of difference. It probably did when the program first started in 1985 and the typical phone bill was under $20. But the $10 discount that was a lot in 1985 is worth a lot less now. It just doesn’t feel like enough of an incentive to make the difference the government is hoping for.

Getting Access to Existing Fiber

Frontier, the incumbent in West Virginia that bought the property from Verizon, is fighting publicly with Citynet, the biggest competitive telco in the state, about whether Frontier should have to share dark fiber.

Dark fiber is just what it sounds like – fiber that has not been lit with electronics. Most fibers that have been built have extra pairs that are not used. Every fiber provider needs some extra pairs for future use in case some of the existing lit pairs go bad or get damaged too badly to repair. And some other pairs are often reserved for future construction and expansion needs. But any pairs above some reasonable safety margin for future maintenance and growth are fiber pairs that are likely never going to be used.

The FCC has wrangled with dark fiber in the past. The Telecommunications Act of 1996 included language that required the largest telcos to lease dark fiber to competitors. The FCC implemented this a few years later, and for a while other carriers were able to lease dark fiber between telephone exchanges. But the Bell companies attacked these rules continuously and got them so watered down that it became nearly impossible to put together a network this way. It is still possible to lease dark fiber using those rules, but only if somebody is determined enough to fight through a horrid ordering process run by a phone company that has no intention of leasing the fiber.

The stimulus grant rules also required that any long-haul fibers built with free federal money must provide for inexpensive access to competitors willing to build the last mile. I don’t know the specific facts of the Citynet dispute, but I would guess that the stimulus fiber is part of what they are fighting over.

The stimulus grants in West Virginia are among the oddest and most corrupt of all of the stimulus grants that were awarded. The grant originally went to the State of West Virginia to build a fiber line that would connect most counties with a fiber backbone. There were similar fiber programs in other states. But in West Virginia, halfway through construction, the network was just 'given' to Verizon, which was the phone company at the time. The grant was controversial thereafter. For instance, the project was reduced from 915 miles to 675 miles, yet the grant was not reduced from the original $42 million. That put the final grant cost at a whopping $57,800 per mile, compared to similar stimulus grants that cost $30,000 per mile.

According to the federal rules under which the fiber was built, Citynet and any other competitor is supposed to get very cheap access to that fiber if they want to use it for last-mile projects. If they don't get reasonable access, the grant rules allowed for an appeal to the FCC or the NTIA. However, the stimulus grants were not specific about whether this was to be dark fiber or bandwidth on lit fiber.

But this fight raises a more interesting question. Almost every long-haul fiber that has been built contains a lot of extra pairs. As I noted in another recent blog, most rural counties are already home to half a dozen or more fiber networks, almost all of which contain unused fiber.

We have a rural bandwidth problem in this country because it's relatively expensive to build fiber in rural places. Perhaps if the FCC really wants to solve the rural bandwidth shortage they ought to take a look at all of the dark fiber that is already sitting idle there.

It would be really nice if the FCC could force any incumbent – be that a cable company, telco, school system, state government, etc. – that has dark fiber in rural counties to lease it to others for a fair price. This could be made to apply only to places where a lot of households don't have access to FCC-defined broadband.

We don't actually have a fiber shortage in a lot of places – what we have instead is a whole lot of fiber built on public rights-of-way that is not being used and is not being made available to those who could use it. It's easy to point the finger at companies like Frontier, but a lot of the idle fiber sitting in rural places has been built by government entities, like school districts or departments of transportation, that are not willing to share it with others. That sort of gross waste of a precious resource is shameful, and there ought to be a solution that makes truly idle fiber available to those who would use it to bring broadband to households that need it.

Is the Internet Already Broken?

I've always been interested in the people who run the Internet behind the scenes. The process is known as Internet governance, and it's not the kind of topic that makes for many news articles, but the governance process has gotten us to the Internet we have today, which is very impressive. But there are changes coming in that governance that have some people worried.

Last year it was announced that the National Telecommunications and Information Administration (NTIA), a US government agency, was going to relinquish its oversight of the global Internet naming authority ICANN (the Internet Corporation for Assigned Names and Numbers). ICANN is the private nonprofit organization that oversees how domains are named and assigned, and until now the US has had formal oversight of the process.

Adding to this, there is a huge amount of concern worldwide about how the Internet is being used to spy on governments and people everywhere. Edward Snowden showed that the NSA is basically spying on everybody. Since then it's been revealed that many other governments are doing the same sort of thing.

Last month one of my blogs had a poll that showed that people in the US don’t like being spied upon, but that as a whole we think it’s okay to spy on everybody else. As you can easily imagine, all of those other people don’t think that is a very comforting idea. And so we now have a number of countries looking for ways to somehow build a firewall around the data originating in their country.

As the NTIA transitions out of Internet governance, there is a worldwide scramble to figure out what will replace it. The latest buzzword associated with this effort is 'multi-stakeholder internet governance', meaning the discussions are about how the concerns of each country are to be heard in the process. There is a lot of talk going on about ruling by consensus. And this makes a lot of technology experts uneasy, an unease that is easy to understand when you look at how other multinational consensus-based efforts at places like the UN actually function.

The open nature of the Internet as we know it today is based upon the strong views of the tech people who built it: that it ought to be open and free whenever possible. And so we ended up with this wonderful free-for-all we call the web, where ideas and content of all varieties are available to all. And those tech people are rightfully concerned about handing decisions off to bureaucrats who won't care about what works best but who will bring other agendas into the governance process.

Governments around the globe differ widely in what they want their citizens to see or not see on the web. Even a country as close to us culturally as England has some very different Internet policies and has built screens and firewalls that stop citizens from viewing pornography and a long list of other types of content. At the extreme end of that range are places like North Korea, which doesn't let the average citizen see the Internet at all.

And so many of the folks who have been governing the Internet behind the scenes are wondering if we have already broken the Internet as it was originally structured. This issue is not so readily apparent to Americans, since we filter very little of the Internet here other than the effort that ISPs make to block malware-generating sites.

But much of the rest of the world has already started down the path of walling themselves off from us, and this trend is building momentum. We probably will reach multinational consensus on the easy stuff – how to name web sites and how to route things. But one can legitimately ask if the Internet is already broken when there are already so many countries that block their citizens from using large chunks of what we Americans think of as the Internet.

Those Damned Statistics

One of my biggest pet peeves in life is the misuse of statistics. I am a math guy and I sometimes tackle math problems just for the fun of it. I understand statistics pretty well and my firm performs surveys. I think I disappoint a lot of my clients when I try to stop them from interpreting the results of a survey to prove something that the responses really don't prove. Surveys are a really useful tool, but too often I see survey results used to support conclusions they don't actually support.

A week ago the NTIA (National Telecommunications and Information Administration) released their latest poll looking at broadband usage in the US. The survey asked a lot of good questions and some of the results are very useful. For example, they show that overall broadband penetration in the US is up to 72% of households. But even that statistic is suspect, as I will discuss below.

The problem with this survey is that they didn't ask the right questions, and this largely invalidates the results. The emphasis of this particular survey was to look at how people use cellphones for data access. And so they asked about the various activities that people now use their phones for, such as browsing the web or reading email. And as one would expect, more people are using their cellphones for data, largely due to the widespread introduction of smartphones over the last few years.

There is nothing wrong with any of the individual results. For example, the report notes that 42% of phone users browse the web on their phone compared to 33% in 2011. I have no doubt that this is true. It's not the individual statistics that are a problem, but rather the way the statistics were used to reach conclusions. In reading this report one gets the impression that cellphone data usage is just another form of broadband and that using your cellphone to browse the web is more or less the same as browsing off a wired broadband connection.

The worst example of this is in the main summary, where the NTIA concluded that "broadband, whether fixed or mobile, is now available to almost 99% of the U.S. population". This implies that broadband is everywhere, and with that statement the NTIA is basically patting itself on the back for a job well done. But it's a load of bosh and I expect better from government reports.

As I said, the main problem with this report is that they didn’t ask the right questions, and so the responses can’t be trusted. Consider data usage on cellphones. In the first paragraph of the report they conclude that the data usage on cellphones has increased exponentially and is now deeply ingrained in the American way of life. The problem I have with this conclusion is that they are implying that cellphone data usage is the same as the use of landline data – and it is not. The vast majority of cell phone data is consumed on WiFi networks at work, home or at public hot spots. And yes, people are using their cellphones to browse the web and read email, but most of this usage is carried on a landline connection and the smartphone is just the screen of choice.

Cellular data usage is not growing exponentially, or maybe just barely so. Sandvine measures data usage at all of the major Internet POPs, and they show that cellular data is growing at about 20% per year, or doubling roughly every four years, while landline data usage is doubling every three years. I trust the Sandvine data because they look at all of the usage that comes through the Internet and not just at a small sample. The cell carriers have trained us well to go find WiFi. Sandvine shows that on average a landline connection today uses almost 100 times more data than a cellphone connection. This alone proves that cellphones are no substitute for a landline.
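For anyone who wants to check that conversion (my own arithmetic, not Sandvine's methodology), a steady annual growth rate translates into a doubling time of ln(2) / ln(1 + rate):

```python
# Convert a constant annual growth rate into an approximate doubling time.
import math

def doubling_time(annual_growth_rate):
    """Years needed for usage to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

print(f"20% per year doubles in ~{doubling_time(0.20):.1f} years")  # ~3.8 years
print(f"26% per year doubles in ~{doubling_time(0.26):.1f} years")  # ~3.0 years
```

So 20% annual growth means a doubling roughly every four years, while a doubling every three years implies annual growth somewhere in the mid-20% range.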

I have the same problems with the report when it quantifies the percentage of households on landline broadband. The report assumes that if somebody has a cable modem or DSL they have broadband, and we know that for large parts of the country having a connection is not the same thing as having broadband. They don't count dial-up as broadband, but when they say that 72% of households have landline broadband, what they really mean is that 72% of homes have a connection that is faster than dial-up.

I just got a call yesterday from a man on the eastern shore of Maryland. He lives a few miles outside of a town and he has a 1 Mbps DSL connection. The people a little further out than him have even slower DSL or can only get dial-up or satellite. I get these kinds of calls all of the time from people wanting to know what they can do to get better broadband in their community.

I would challenge the NTIA to go to rural America and talk to people rather than stretching the results of a survey to mean more than they do. I would like them to tell the farmer who is trying to run a large business with only cellphone data that he has broadband. I would like them to tell the man on the eastern shore of Maryland that he and his neighbors have broadband. And I would like them to tell all of the people who are about to lose their copper lines that cellular data is the same as broadband. Because in this report that is what they have told all of us.

Finding a Broadband Partner

The NTIA issued a notice last week asking whether it should continue the BroadbandMatch website tool. This tool was created during the stimulus grant process, and the original goal was to connect potential partners for applying for or implementing the broadband grants. And the process worked. One of the components of the grants was the requirement for matching funds, and there were many grant applicants with a great idea who had to find a partner to supply the matching funds. A significant percentage of the stimulus grants involved multiple parties, and many of them found their partners using this tool.

On the NTIA tool a company would describe what they were trying to do and the kind of partner they were looking for. And the main reason this worked was that the government was giving away billions of dollars for fiber construction, so a lot of companies were looking for a way to get in on the action. Many of the companies involved in the grant process were new companies formed just to get the grants. The NTIA tool gave companies that were not historically in the telecom business a way to find potential partners.

The NTIA asks if it should keep this service going, and if so how it ought to work. I will be the first to say that I was surprised the tool was even still around, since it was clearly designed to bring parties together to make the stimulus grants work. The only way a tool like this can work now is if everybody in the industry knows about it and thinks to look there when they are interested in making an investment.

But I am going to guess that if I didn't know this tool was still active, hardly anybody else does either. It was great for the purpose it was designed for, but one has to ask if this is going to be a place where companies look when they are seeking a partner. It has been my experience that outside of that grant process, which was very public, most people want to keep the process of forming new ventures as quiet as possible to avoid tipping off the competition too early. And so, without the billions of public dollars that made the grants attractive, I can't see this tool being of much interest.

But this leads me to ask how a company can find a partner for a new telecom venture. The most common type of partnership I see is one made between people with technical expertise looking for investors and people with cash looking for opportunities. So how do these kinds of partners find each other?

At CCG we have helped numerous carriers find partners and the following, in our experience, is what has worked and not worked:

  • Put out a formal request for a partner. This may mean issuing an RFP or an RFI, or advertising somewhere to find interested parties. I have not found this process to be particularly fruitful, because it normally doesn't uncover any potential partners that you didn't already know about.
  • Get to know your neighbors better. I have found that most partnerships end up being made by people in the same geographic area. It is not uncommon for the parties to not know each other well before the partnership, and sometimes they are even competitors. But people in your region are far more likely to understand the potential of local opportunities.
  • Don’t be afraid to cross the line. Commercial CLECs and independent telephone companies are usually dismayed by municipalities that get into the telecom business. But generally those cities are just hungry for broadband and in almost every case they would prefer that a commercial provider come and build the infrastructure in their community. So crossing the line and talking to municipalities might uncover the best local partnership opportunities. If a town wants broadband badly enough (and many of them do) then they might be willing to provide concessions and cash to make it work.

Of course, this doesn’t even begin to answer the question of how to make a partnership work, which I will address in later blogs this week.