AT&T and the NSA

It’s been revealed recently that AT&T has been the largest partner of the NSA in collecting data and spying on the Internet. A number of leaks from the Edward Snowden documents point to AT&T, and a lot of other evidence points to them as well.

It appears that AT&T has been a willing partner in surveillance activities since 1985, so this is not a new relationship, but one that has been ongoing for decades. AT&T’s role also looks far larger than that of any other NSA partner.

The surveillance goes beyond just telephone records – something that started in a major way in the years after 9/11. All large phone companies have been required under the Patriot Act to submit phone records, and so naturally AT&T complied just like the other big carriers.

But it appears that the NSA, with AT&T’s help, has been spying on Internet traffic for a very long time. AT&T certainly is a natural partner in this because they are one of the few companies that has always had a reason to be present at every major Internet hub in the country.

I have been a customer of AT&T’s for a long time. While their cellular service is significantly overpriced, it works in all of the places I have lived and traveled and I can’t recall ever having had a problem with them as a service provider.

But I wonder how much I should be bothered about this revelation. Part of me says that what AT&T did all these years was a good thing. Clearly the US security system must have a way to keep tabs on suspects in a world where terror is so prevalent. But today you have to wonder how much good all of this spying does. It seems like most of the ‘terrorist’ plots that have been uncovered in the US over the last decade have been misguided people who didn’t have the wherewithal to actually do any real harm. Actual terrorists are sophisticated enough to know that everything is being watched and certainly they have developed ways to avoid surveillance.

And then there is the realization that the NSA hasn’t just been watching the bad guys, but they have been spying on all of us. I don’t have anything in particular to hide, but I still don’t like the idea of the government watching me or the rest of us. And that is mostly because the government is not some nebulous force for good but is rather made up of regular people who are undoubtedly going to abuse the power this surveillance gives them. It’s inevitable that there will be abuses of power when regular people are given access to far more knowledge about people than they should have.

And so I’ve been considering whether I should boycott AT&T as a quiet protest against what they have done and continue to do. But while they have been the largest partner in helping the government spy on us, they are certainly not the only one. I have to imagine that the other large carriers all have some role in this as well. For instance, it’s clear in Snowden’s material that Verizon has been involved, just not to the extent of AT&T. In the carrier world our choices are limited and I doubt that any of the big carriers are totally clean in this, which would make a boycott somewhat hollow.

So I have really mixed feelings about this. I’ve read all of the science fiction stories that paint a picture of how surveillance eventually works to strangle any society. We have plenty of evidence of how this might look by looking back at the Soviet bloc or looking at places today like North Korea. It would be naïve to think that something like that couldn’t happen here. There would be an uprising in the US if the government tried to impose harsh rules on us in one big swoop, but if they just chip away at freedoms a little bit, day by day, we can eventually end up with a very restricted society. And surveillance is the number one tool that would allow the government to gain that kind of control. In some ways it feels like we have already started down that path.

FCC Orders Rules for Copper Retirement

The FCC in Docket FCC 15-97 issued some new guidelines for telcos that are going to cut customers off copper or impair legacy services. The same order also asks further questions in the form of a Further Notice of Proposed Rulemaking. This docket is part of the ongoing effort that has been called the IP Transition, where the national goal is to transition from the traditional PSTN to an all-IP network.

One of the primary requirements of the order is notice to customers. Notice comes in two types – notice to end-user customers and notice to other carriers. The new requirement is to give a three-month notice to residential customers who are going to lose copper and a six-month notice to business customers. Phone companies are allowed to retire and remove copper with no notice as long as no customer service is discontinued, reduced, or impaired.
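
As a sanity check on those notice windows, here is a minimal sketch of how a carrier might compute the earliest allowable retirement date. The 90- and 180-day figures are my rough stand-ins for the order’s three- and six-month periods, so treat the output as illustrative rather than a legal reading of the rule.

```python
from datetime import date, timedelta

# Illustrative stand-ins for the order's three- and six-month notice periods.
NOTICE_DAYS = {"residential": 90, "business": 180}

def earliest_retirement(notice_sent: date, customer_type: str) -> date:
    """Earliest date copper could be retired after notice goes out."""
    return notice_sent + timedelta(days=NOTICE_DAYS[customer_type])

print(earliest_retirement(date(2015, 9, 1), "business"))  # 2016-02-28
```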

The order clarifies that ‘retirement’ includes de facto retirement, where a carrier changes the service so that legacy products are no longer available. This means that a phone company can’t arbitrarily cut customers from analog copper services to a digital equivalent without supplying this notice. Companies that are upgrading customers from copper to fiber need to take note of this and be prepared to notify their customers in a timely manner before the transition.

The FCC has taken note that there are legacy services that can be impacted by removing copper or from shifting to IP. For example, there are some burglar alarm services that still run off copper phone lines, although most providers of these services can readily switch over to an IP connection. There are also some old fax machines that cannot easily be made to work on VoIP and work only on copper. But for most homes, moving from a copper to a newer network has very little practical effect, except perhaps for the fact that their phones will no longer be powered by the network, which the FCC addressed in another Docket.

There can be a much bigger impact on businesses. I’ve worked with clients recently that are still using numerous ISDN connections and T1s provided over copper. There are also still huge numbers of business key systems and PBXs that are fed by T1s and PRIs. When a telco discontinues these services, a business often faces a big expenditure. This could entail buying a new phone system, new phones, and perhaps even finding a new provider if the phone company can’t supply the new connection they are going to need.

The biggest issue I see with getting rid of copper is where the phone company doesn’t have an alternate landline network ready for the transition. It doesn’t seem like a big issue to me when a company like Verizon wants to move customers from copper to FiOS. There have already been tens of millions of customers who have changed from copper to either FiOS fiber or to a cable company network who have experienced and accepted the required changes.

But AT&T has said that they want to walk away from millions of rural copper customers. That would force customers to migrate to either the cable company or to cellular wireless. This could be a huge problem for business customers because there are still a lot of business districts that have never been wired by the cable companies. And even where a business can change to a cable company network, they are not always going to be able to buy the services they want from the cable company. For example, those businesses might be using trunks or Centrex today that aren’t supported by their cable provider. These businesses are going to be facing an immediate and expensive upgrade cost to keep the kind of service they have always had.

And of course, once you get outside of towns there generally is no alternative to copper other than wireless. As bad as rural DSL can be, having a 1 Mbps DSL connection is still better than having to use cellular data for a home broadband connection. At least DSL has unlimited usage, while most cellular data plans have tiny data caps. And unfortunately, there are many rural areas where cell phone coverage is poor or non-existent. Cutting down copper in those areas would basically cut homes and businesses off from any communications.
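
The arithmetic behind that comparison is worth spelling out. Even a slow DSL line running around the clock can move far more data in a month than a typical cellular cap allows; the 5 GB cap below is an illustrative figure, not one from any particular carrier.

```python
# How much data a 1 Mbps DSL line can move in a month, running flat out.
mbps = 1
seconds_per_month = 30 * 24 * 3600
gb_per_month = mbps / 8 / 1000 * seconds_per_month  # megabits -> gigabytes
print(f"~{gb_per_month:.0f} GB/month at 1 Mbps")    # ~324 GB
print("versus a typical cellular cap of ~5 GB/month")
```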

Too Much Fiber?

When communities consider building fiber, one of the first questions they ask me is how much fiber already exists in their community and how they can take advantage of it. The bad news I almost always have to give them is that their community probably contains several existing fiber networks that will be of little or no use to them. It seems there is a lot of fiber in the world that is not being put to good use.

So what do I mean by this? What I have found is that many communities have numerous existing fiber networks that have been built for one specific purpose and which can’t be used for anything else. Here are some examples:

  • K-12 Schools. School districts often own fiber networks to connect all of their schools.
  • Colleges. Colleges will often be on a different network than the other schools.
  • Traffic Lights. A number of cities now have fiber systems that feed traffic lights.
  • State Highways. State highway departments often have fiber systems for cameras and electronic message boards.
  • Federal Highways. Federal highway agencies build fiber for the same reasons.
  • Commercial networks. It’s more understandable why a network built by a telco, cable company, wireless company, or CLEC might not be available to a city, but most cities today contain a significant amount of fiber built by these companies.

I first ran into this issue in the late 90s when a city in Virginia asked me this question. I was helping them design a fiber network that would connect all of their government buildings. In doing so I discovered that there was already a fiber network built to traffic lights that probably already covered 80% of the network they were going to need – and they already owned it. But in looking deeper, we found that the traffic light network had been built with funds from the state highway department and that it had a prohibition in the funding language against sharing the network for other uses, including other uses by the city. That network was basically off-limits for any other use.

When you consider that building fiber ranges in price from $25,000 per mile for aerial construction on poles, to $75,000 per mile for buried fiber in most places, to as much as $150,000 per mile in urban downtowns, it’s crazy to think that such money has been spent without considering all of the other benefits the outlay could have created.
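
To make those per-mile numbers concrete, here is a back-of-envelope cost sketch using the ranges above. The 40-mile route mix is hypothetical.

```python
# Per-mile costs from the text: aerial on poles, buried, and urban downtown.
COST_PER_MILE = {"aerial": 25_000, "buried": 75_000, "urban": 150_000}

def build_cost(miles_by_type: dict) -> int:
    """Total construction cost for a route broken out by build type."""
    return sum(COST_PER_MILE[kind] * miles for kind, miles in miles_by_type.items())

# A hypothetical 40-mile city ring: 25 miles aerial, 12 buried, 3 downtown.
print(f"${build_cost({'aerial': 25, 'buried': 12, 'urban': 3}):,}")  # $1,975,000
```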

I still see this all of the time, and it is very common for government-built fiber to be off limits to all commercial uses. But surprisingly there are often also prohibitions against other municipal uses. I can understand restrictions against commercial uses, even if I don’t like them. The fear is always there that when the government and commercial entities work together it creates a chance for corruption. But this kind of fear should not be a reason to automatically write off the opportunity for public-private partnerships.

I’ve always found that commercial companies are glad to share the cost of building a new fiber route. In the commercial world companies routinely share fibers and they typically create a clear division of the use of fiber pairs on a new route when multiple companies agree to share in the build costs. Governments could save a fortune if they would join into this well-established commercial practice of building fiber for more than one company.

But restrictions on government-owned fiber that preclude other parts of the government from using it are just wrong. When highway departments or universities or other big agencies build fiber and then don’t let other government agencies benefit from the expenditure, something is very wrong and we have let bureaucracy override common sense. I often hear excuses for the practice, such as the need for security, and frankly all such excuses are bosh.

I’ve told cities that there are a few solutions when they run into this problem. One is to create a huge public stink so that the agency that won’t share the fibers might be shamed into doing the right thing. A longer-term fix is to take full control of their rights-of-way. For example, a city can require that anybody who digs a ditch in the ground must include empty conduit, which will create a lot of opportunity for cheap fiber over time. But the best fix is for somebody in the city to act entrepreneurially, get to know the fiber providers in town, and develop partnerships with them. That is actually easier to do than you might think.

The Connect America Fund Dilemma

I doubt that this is what the FCC had in mind, but they are creating an impediment to building new rural networks with the Connect America Fund. I know that sounds exactly the opposite of what they are intending, but consider the following.

The large telcos get first crack at taking the Connect America Funding in their service territories. Frontier and Fairpoint, for example, have already claimed this money for a lot of their rural service territory. The other large companies must elect this by the end of this month. In the places where they take the funding a large telco will get support for seven years to help pay for broadband upgrades in those areas.

Most of the places that are covered by the Connect America Fund have either abysmal broadband or no broadband at all. Where there is any semblance of broadband, customers are on very slow rural DSL, generally 1 Mbps or slower, down to speeds close to dial-up. Customers can also get satellite data, and, to my continuing surprise, many rural households are making do with cellphone data and the associated tiny data caps.

The large telcos are almost universally going to use the Connect America Fund money to upgrade DSL. In order to do that they will have to extend fiber further into the rural areas and then place rural DSLAMs in cabinets that are closer to customers.

That sounds good on the surface and a lot of rural people are going to get faster Internet service. So where is the dilemma? The dilemma is two-fold. First, the incumbents have up to seven years to build all of the new infrastructure. Households at the far end of that timeline are going to find seven years an interminable wait.

But the real dilemma comes in how this affects rural communities that are looking at their own broadband solutions. Most of the DSL built under the Connect America Fund is going to deliver download speeds of 10 Mbps or less, which is not even broadband by the FCC’s definition. And not every customer in these areas will get that much speed – many of them are going to live at the ends of the new DSL routes and will still get very slow speeds.

The dilemma is that for areas without any broadband today, customers are going to find 10 Mbps to be wonderful. If your house has been living with dial-up or cellular data, then this is going to feel great, particularly since the usage will not be capped. You’ll be able to watch Netflix for the first time and partake in a lot of things you couldn’t do before on the Internet.

But it is not going to take too many years until those speeds feel as slow as dial-up feels today. And this is going to be the last upgrade these areas are ever going to get from the big telcos. And the copper is going to keep aging and the DSL will get worse and worse over time. So while most urban areas today already have download speeds far faster than 10 Mbps, these rural areas are going to be stuck at 10 Mbps while the rest of the world gets faster and faster every year. When other homes in the US have 100 Mbps or a gigabit connection, these rural areas are going to be stuck with something far slower. There will be many future applications that need the higher bandwidth, and so the rural areas will again be shut out from what everyone else has.

But the real killer is that any area getting these funds is going to have a much harder time justifying building a fiber network that is faster than the DSL. I’ve helped rural areas get fiber networks, and those business plans often need 60% or more of the homes in an area to take service in order to work. By creating this bandaid approach, the FCC’s program means that there will be just enough people who are happy with this faster DSL that these areas will probably not be able to get the support needed for a community-based solution. While the FCC has good intentions, they are going to be damning a lot of US counties to having crappy DSL for decades to come using copper wires that are already ancient today. The Connect America Fund money should have been used only for building real broadband rather than letting the big telcos put a bandaid on an aging copper network. The FCC is going to feel good about bringing broadband to rural America, when in fact they will have shut large chunks of the country off from getting real broadband.
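
To see why take rates matter so much, here is a toy break-even calculation. All of the inputs are hypothetical, and it counts revenue only – real business plans also carry operating costs and debt service, which is why they often need the 60% or more mentioned above.

```python
# Toy break-even: what fraction of homes passed must subscribe for gross
# revenue to cover construction cost over a payback window. Revenue only;
# operating costs and debt service would push the real number much higher.
def breakeven_take_rate(cost_per_home_passed, monthly_revenue, payback_years=15):
    revenue_per_subscriber = monthly_revenue * 12 * payback_years
    return cost_per_home_passed / revenue_per_subscriber

# Hypothetical: $4,000 per home passed, $70/month broadband, 15-year payback.
print(f"{breakeven_take_rate(4_000, 70):.0%}")  # ~32%, before any operating costs
```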

FCC Establishes Guidelines for Back-up Power for Voice

In perhaps one of the oddest rulings I have ever seen out of the FCC, they just ordered a very specific set of rules about providing back-up power for telephone lines that are not powered by copper.

I see this as odd for two reasons. First, this is fifteen years too late. There are tens of millions of customers that are served without backup power today. Customers that get phone service from cable companies, fiber networks, and other VoIP systems don’t have backup power unless the provider has gone out of their way to provide it.

Second, this is obviously the FCC’s way of making it harder for the large telcos to knock people off of copper networks. But in doing so the FCC is punishing the rest of the industry by adding new rules and new costs.

These new rules seem like a solution without a problem. Why do I say that? We no longer have a world full of the old Western Electric telephones that are powered by the copper network. A phone that has any features, which most modern phones do, must be plugged into home power to work. Further, there have been tens of millions of customers who have elected to take phone service from cable companies and fiber providers, which do not provide backup. And there has not been a huge outcry from these many customers over the last ten years about lack of power backup. The main reason for that is probably that the vast majority of homes have a cellphone today and don’t rely on their home telephone as a lifeline.

Here is what the FCC ordered (a sketch of the compliance timeline follows the list):

  • The ruling only covers residential fixed voice services that do not provide line power (which is done by telephone copper). This does not apply to business customers.
  • This must be implemented within 120 days by large companies and within 300 days by companies with less than 100,000 domestic retail subscriber lines.
  • The back-up power must include power for all provider-furnished equipment and anything else at the customer location that must be powered to provide 911 service.
  • From the effective date, companies must describe the following to each new customer, and annually to every existing customer:
    • The solutions offered by the company to provide 8 hours of backup for phone service, including the cost and availability;
    • Description of how the customer’s service would be affected by loss of power;
    • Description of how to maintain the provided backup solution and the warranties provided by the company;
    • How the customer can test the backup system;
  • Within three years of the effective date of the order a provider must offer a back-up solution that is good for 24 hours and follows the above rules.
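
Here is a minimal sketch of that phase-in, with the thresholds taken from the bullets above; the dates it produces are only as good as my reading of the order.

```python
from datetime import date, timedelta

def compliance_dates(effective: date, subscriber_lines: int) -> dict:
    """Phase-in per the bullets above: 120 days for large providers,
    300 days under 100,000 lines, and a 24-hour solution at three years."""
    initial_days = 120 if subscriber_lines >= 100_000 else 300
    return {
        "offer 8-hour backup": effective + timedelta(days=initial_days),
        "offer 24-hour backup": effective.replace(year=effective.year + 3),
    }

print(compliance_dates(date(2015, 8, 1), 50_000))
```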

This seems like something that should have been addressed in 2000; it is far too late to be putting rules in place now. The order merely adds regulatory cost to every provider without any real benefit to customers. In a lot of networks, if the neighborhood loses power, so does the service provider. If such a network is down then no amount of power at the home is going to provide voice service. And there are networks that are going to require a very expensive solution for providing 24-hour back-up, if one is even reasonably affordable at all.

The order doesn’t say the back-up solution has to be affordable, and the cynical me says that this is an opportunity for the phone companies to also go into the solar power business. Perhaps the solution they should offer for providing 911 backup is to sell a home a $20,000 solar power system. I know the FCC wouldn’t see the humor in that, but this order is so far out in left field that I have a hard time taking it seriously.

Large Telcos and DSL

There has been a spate of articles recently talking about how the number of cable customers at the large cable companies took their first big dip last quarter. This was the first time the cable industry as a whole saw a significant overall customer loss, and it raises the question of whether cord cutting is real.

But there was another significant statistic in these same press releases. AT&T and Verizon together lost 474,000 DSL customers in the second quarter of 2015. The two made up some of these losses by adding 313,000 data customers to their FiOS and U-verse networks, so certainly some of the losses are offset by customers who shifted from DSL to something faster.

But this continues the trend of the two largest telcos shedding DSL customers. The quarterly losses just keep growing, and this is the first time the number has approached half a million customers.

Verizon has made it clear for years that they have no love for their copper networks. They have been selling significant chunks of the older networks to Frontier. They have been pestering the FCC for years to be able to turn down the copper in neighborhoods where they already have FiOS fiber.

Perhaps more surprising is that Verizon recently sold a significant number of FiOS customers to Frontier, and I have speculated before that Verizon doesn’t want to stay in any landline business. When you read their annual reports, any mention of their landline business is buried deep inside and they obviously have put all of their emphasis on the wireless business.

AT&T is a bit more perplexing. They have not been selling copper customers. But they have told the FCC a number of times that they would like to walk away from millions of customers on rural copper networks. AT&T recently promised the FCC as part of the DirecTV deal that they would aggressively add new broadband customers. While they have insinuated to the FCC that the new customers would all be on fiber, I would not be surprised to see a lot of them on U-verse.

Many people speculate about why AT&T bought DirecTV. My guess is that they want to get out of the business of delivering video over wires. U-verse becomes a much better data product if it doesn’t have to carry video, since all of the bandwidth can be used for data. There must already be a lot of current U-verse customers bumping up against their bandwidth and wanting faster data connections.

It’s also interesting that AT&T hasn’t divested of rural copper networks in the same manner as Verizon. Again, I am only speculating, but my guess is that they don’t want those networks to be revitalized and then compete against their wireless networks. I think AT&T has a long term plan to serve rural areas with wireless only.

The one shame about cutting down the copper networks, particularly in urban and suburban neighborhoods, is that those networks could be upgraded relatively inexpensively with G.fast to deliver much faster speeds. CenturyLink just announced that they are testing 100 Mbps copper in Salt Lake City. Some of the copper networks in Europe are doing this with even faster speeds and the technology is generally referred to there as fiber-to-the-curb.

But obviously both companies have decided that G.fast is not a technological path they want to follow, and both are going to be aggressively decommissioning copper over the next five years.

I don’t feel too bad about a customer who is told they have to move from a copper network to a FiOS fiber network. But I am really worried about rural customers when somebody cuts down the only telecommunications wire to their home. At that point those folks are going to be paying cellphone prices for both voice and data, and for some millions of them there is not enough coverage to provide those services over cellular. I predict we are going to be cutting customers off from communications and moving parts of the country back seventy-five years. I hope I am wrong.

Russia and the Internet

We’ve all known for a long time that the Chinese have their own version of the Internet within the country. The Golden Shield, which the west has dubbed the “Great Firewall of China”, is a huge government apparatus that closely monitors and edits everything that happens on the Chinese Internet.

And now we are perhaps seeing Russia starting down a similar path. There is a new Russian law that takes effect on September 1 that is going to start fundamentally changing the way the Internet works in that country. The law basically requires that anybody who obtains information online from a Russian citizen must store that data on servers that are physically located in Russia.

This law was ostensibly created to protect the country against the spying of the US government and the commercial tracking done by US corporations. But of course, this also provides a great tool for the Russian government to monitor everything going in and out of the country.

American companies like Google and Facebook are going to have to locate servers in Russia and abide by the Russian rules if they want to have Russian citizens using their services. Some of them will certainly do that, but you have to wonder in the future how many start-ups will make the effort to do this, and over time one would expect Russia to get more and more separated from the US Internet companies.

Probably of more concern are the various European companies that have a lot more Russian users than do the US companies. This change effectively walls Russia off from the rest of the world including its nearby neighbors and trading partners.

It’s unlikely for now that the Russians will go as far as the Chinese. The Chinese completely censor large parts of the content on the web including pornography, anything pro-democracy, religious content such as anything having to do with the Dalai Lama, anything having to do with Taiwan, anything having to do with protests inside the country, and anything else they decide to block. But still, for the Russians to know that their content is not leaving the country they are going to have to look at everything closely.

One would assume that the Russians will use the same techniques used by the Chinese to enforce the new law. This includes such things as IP blocking, DNS filtering and redirection, URL filtering, packet filtering, VPN recognition and blocking, and active IP probing.
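
To illustrate just one of those techniques, here is a minimal sketch of DNS filtering: a resolver consults a blocklist and returns a sinkhole address for banned domains so the sites simply appear unreachable. The domain names are made up.

```python
import socket

# Hypothetical blocklist; real deployments use feeds of thousands of domains.
BLOCKED = {"banned-example.com"}

def resolve(domain: str) -> str:
    """Answer a DNS query, sinkholing anything on (or under) the blocklist."""
    if domain in BLOCKED or any(domain.endswith("." + b) for b in BLOCKED):
        return "0.0.0.0"  # sinkhole: the site appears not to exist
    return socket.gethostbyname(domain)  # otherwise resolve normally

print(resolve("banned-example.com"))  # 0.0.0.0
```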

I just wrote last week how the basic architecture of the Internet promotes freedom. While this architecture was originated largely in the US, over time much of the rest of the world has joined into the governance of the web and the basic architecture is now accepted by most countries.

But obviously Russia, China, and a few other countries have a very different view of what the web ought to be, and largely for totalitarian purposes of controlling their citizens. Anybody who has read any science fiction, even back to Orwell’s 1984, understands how the Internet could easily be turned into a tool of control.

What we are likely to see with Russia is the same thing we see today with China. Outside companies often come to China to create a presence and to expand their footprint, but over time many of them leave in protest against the control they are subjected to.

It’s unlikely that Russia and China will have much influence in changing the web architecture for everybody else. What is more likely is that their citizens will not partake in the newest innovations on the web, for the good or bad they will create. But most countries already today understand how important the web is for their industries and for most countries that’s a good enough reason not to tinker with something that works.

New Technology for August 2015

This is my monthly look at new technologies that might eventually impact our industry.

Small Chip from IBM. IBM and a team of partners including Samsung, GlobalFoundries, and the State University of New York at Albany have made a significant leap forward by developing a computer chip with features measuring 7 nanometers, or billionths of a meter. That’s half the size of other cutting-edge chips in the industry. Experts are calling IBM’s new chip a leap that is two generations ahead of the current chip industry. IBM is also introducing a 10 nanometer chip.

IBM’s trial chip contained transistors just to prove the concept and so the chip wasn’t designed for any specific purpose. But this size breakthrough means that the industry can now look forward to putting much greater computer power into small devices like a smart watch or a contact lens.

A chip this small can be used in two different ways. It can reduce power requirements in existing devices, or it could be used to greatly increase computational power using the same amount of chip space.

IBM has contracted with GlobalFoundries to build the new chip for the next ten years. This should provide significant competition to Intel since currently nobody else in the industry is close to having a 7 nanometer chip.

Cheaper Batteries. Yet-Ming Chiang of MIT has developed processes that will significantly reduce the cost of building batteries. Chiang has not developed a new kind of battery, but instead has designed a new kind of battery factory. The new factory can be built for a tenth of the price of an existing battery factory which ought to result in a reduction of battery prices by about 30%.

This is an important breakthrough because there is a huge potential industry for storing electric power generation offline until it’s needed. But at today’s battery prices this is not really practical. This can be seen by looking at the price of Elon Musk’s new storage batteries for solar power – they are priced so that socially conscious rich people can use the technology, but they are not cheap enough yet to make this a widespread technology that is affordable for everybody.

A 30% reduction in battery costs starts to make lithium-ion batteries competitive with fossil fuel power. Today these batteries cost about $500 per kilowatt-hour, which is four times the cost of using gasoline. Chiang’s goal is to get battery costs down to $100 per kilowatt-hour.
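
The arithmetic in that paragraph is easy to check. The numbers below come straight from the text; the “parity point” is only the rough figure implied by the four-times-gasoline comparison.

```python
# Figures from the paragraph above: ~$500/kWh today, a 30% factory-driven
# reduction, a $100/kWh goal, and the "four times gasoline" comparison.
today = 500                  # $/kWh for lithium-ion batteries
print(today * (1 - 0.30))    # 350.0 -> cost after the 30% reduction
print(today / 4)             # 125.0 -> rough gasoline-parity point, which
                             # Chiang's $100/kWh goal would comfortably beat
```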

Metasheets. A metasheet is a material that will block a very narrow band of radiation but let other radiation pass. Viktar Asadchy of Aalto University in Finland has developed a metamirror that will block specific radiation bands and will reflect the target radiation elsewhere while letting other radiation pass.

This is not a new concept, but attempts to do this in the past have usually bounced the target radiation back at the source. This breakthrough will let the target radiation be bounced to a material that can absorb it and radiate it as heat.

This could be a big breakthrough for numerous devices by creating small filters that allow the dissipation of dangerous radiation from chips, radios, and other devices. This could result in far safer electronics, cut down on interference caused by stray radiation, and make many electronic components function better.

A New ‘Do Not Track’ Policy

The Electronic Frontier Foundation (EFF) released a new version of ‘Do Not Track’ which is supposed to provide stronger protection for Internet users. This is something that consumer advocates have been pushing for a long time, so the question is: what does this new standard provide for the average Internet user?

To confuse matters a bit, the EFF is not the only group working on this issue. The W3C group that controls the standards for most Internet protocols is also working on its own version of Do Not Track. But regardless of which of these efforts becomes the new standard there are serious questions about how effective this might be in the marketplace.

Neither of these groups can impose Do Not Track rules on Internet companies, and so compliance with any new standard is voluntary and one has to wonder who is going to implement it. There is a current Do Not Track standard that very few in the industry are following. For instance, the search engine I normally use, DuckDuckGo, follows the current standard and doesn’t track what people search for on the web. You can count on two hands the other companies that currently publicly agree not to track their users.

This is another one of the big tug-of-wars going on in the industry. There are a lot of people who don’t like the idea of web companies tracking their every move and then selling that data to others. A lot of people find targeted ads creepy and feel like the big web companies are spying on them.

And to a large degree they are. Companies like Google and Facebook and many others make a lot of money from advertising and from selling data about their customers to others. These companies feel that if you come to their site that you have waived privacy for what you do on their platform. Big data is perhaps the biggest money maker on the web, and having flocks of people opt out of being tracked would significantly reduce revenues for a lot of web companies.

Here are a few of the major points of the new policy; a sketch of how a DNT request looks on the wire follows the list. Honoring a DNT request means:

  • Not collecting information from the user and not placing tracking cookies except with specific permission;
  • Not retaining details of the interaction with the user except in those few cases where data retention is required by law;
  • Information needed to complete a transaction, such as an address or credit card number, is retained only until the transaction is complete;
  • Users can be given the option to have web sites remember their data. This might be convenient for places where somebody shops regularly;
  • While these rules aren’t binding, existing law says that if a company says they will not track you they must live up to that commitment.
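
Mechanically, Do Not Track is just a one-character HTTP header; what is at stake in these policies is whether sites honor it. Here is a minimal sketch, using Flask for brevity, of a site that skips its tracking cookie when the header is present. The cookie name and value are illustrative, not from the EFF policy.

```python
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("hello")
    # The browser signals the preference with "DNT: 1"; honoring it is voluntary.
    if request.headers.get("DNT") != "1":
        resp.set_cookie("visitor_id", "abc123")  # illustrative tracking cookie
    return resp
```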

It will be interesting to see if this new round of Do Not Track gets any more industry buy-in than the last version. There certainly is a significant portion of Internet users who would opt out of being tracked if that was possible. However, there is a good chance that a lot of the industry will only give lip service to any voluntary guidelines. They might not send specific ads to somebody who says they don’t want to be tracked but would likely otherwise track them like everybody else.

It would take a change of law to make this mandatory, probably an action by Congress – there certainly are a number of consumer privacy laws that have been enacted, such as the laws that protect medical records. I find it unlikely that big companies like Facebook and Google and many others are going to voluntarily offer this to users. Offering it costs money, and the loss of advertising and data revenues would cause a big hit to the bottom line of these companies. They are already seeing big revenue hits from ad blocking, and this could be an even bigger one.

To some degree consumers who really care about their privacy have options. They can use web sites today that promise to not track them. But almost all ecommerce is tracked and today there are not many places you can go on the web that aren’t tracked. Certainly almost all social media sites are tracked. I know I get anywhere from 50 to 200 tracking cookies on my computer each day from fairly light browsing, so there are a lot of companies out there trying to find out more about us.

Taking Federal Broadband Funding

The USDA recently announced a new round of loan financing for the Rural Broadband Loan and Loan Guarantee Program as authorized by the 2014 Farm Bill. The loans are administered by the Rural Utilities Service (RUS), a part of the USDA.

The loans are available to bring or improve broadband in areas where at least 15% of the households do not have broadband today. The loans can be used to build technologies that are as slow as the old FCC definition of broadband – 4 Mbps download and 1 Mbps upload, although the RUS will strongly encourage building technology capable of meeting the new broadband definition of 25 Mbps down and 3 Mbps up. The projects can range between $100,000 and $20 million.
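
A minimal sketch of those eligibility screens, with the thresholds from the paragraph above; the actual program rules have many more conditions than this.

```python
def eligible(pct_without_broadband: float, project_cost: int,
             down_mbps: float, up_mbps: float) -> bool:
    """Rough screen: 15% of households unserved, $100k-$20M project size,
    and at least the old 4/1 Mbps definition (25/3 is the encouraged target)."""
    return (pct_without_broadband >= 15
            and 100_000 <= project_cost <= 20_000_000
            and down_mbps >= 4 and up_mbps >= 1)

print(eligible(20, 2_500_000, 25, 3))  # True
```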

Over the years I have helped numerous clients acquire these loans, but I have seen more and more reluctance to use them in recent years for a variety of reasons. Following are some of the issues my various clients have with this loan program:

Slow Response Time. I don’t know what the current backlog is, but there have been times over the last five years when a loan application might wait 18 months or more for a decision from the RUS. Those kinds of wait times might have been acceptable back in the days of all-regulated telephony, when companies worked slowly on five and ten year capital plans. But the world has gotten more competitive for everybody and nobody is willing to wait that long for a yes or no answer on a major capital program.

Paperwork. The loans take a lot of paperwork. The application itself is like writing a book and my firm has historically charged up to $20k for writing one of these applications – it’s that much work. And the paperwork doesn’t stop with the application. Once you’ve taken the loan there is major annual compliance paperwork that can overwhelm the staff of smaller borrowers.

Engineering. The loan applications for larger projects must be signed by a professional engineer, and this means that projects must be nearly fully engineered just to apply for funding. That differs from the rest of the industry where projects typically are done with ‘pre-engineering,’ which means that an engineer has made a very good estimate of the cost of the project, and in my experience those pre-engineered estimates are usually pretty reliable.

Extra Costs. Sometimes the loans require extra steps that are not required for other financing. For example, I’ve seen federally-funded loans require an expensive environmental study. Nobody else ever does this because fiber is almost always built into existing public rights-of-way, which by definition have already been cleared for these purposes. Depending on the size of the loan there can also be some kind of customer survey required.

Mostly Still for Regulated LECs. Most of the loans still go to regulated telephone companies for a variety of reasons. For instance, the projects usually require 10% to 20% equity from the borrower and also a first lien against the assets built with the loan. These requirements have largely stopped government entities from using these loans. Another issue is that these loans carry covenants that can be burdensome. As an example, there might be limits on dividends that can be paid to company owners while one of these loans is outstanding.

Rates are Not that Attractive. There have been times in the past when the RUS interest rates were significantly lower than commercial bank rates and thus were very attractive. But with today’s low interest rates there is currently not a lot of difference between the government rates and commercial rates. By the time you factor in all of the extra costs of applying for and complying with these loans, the RUS loans might be more expensive.
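
A simple way to see that point is to compare all-in costs rather than rates. The sketch below uses simple interest and entirely hypothetical numbers, but it shows how application and compliance costs can flip a small rate advantage.

```python
def all_in_cost(principal, rate, years, upfront=0, annual_compliance=0):
    """Crude all-in borrowing cost: simple interest plus fees (no discounting)."""
    return principal * rate * years + upfront + annual_compliance * years

# Hypothetical $5M, 20-year loans: RUS at 3.8% with application and compliance
# costs, versus a commercial bank at 4.0% with neither.
rus = all_in_cost(5_000_000, 0.038, 20, upfront=60_000, annual_compliance=15_000)
bank = all_in_cost(5_000_000, 0.040, 20)
print(f"RUS:  ${rus:,.0f}")   # $4,160,000
print(f"Bank: ${bank:,.0f}")  # $4,000,000
```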

At times in recent years the RUS has built up billions in uncommitted funds because not enough borrowers have been interested in the money. Over the last decade I have helped more clients refinance RUS loans with other lenders than I have helped get new RUS loans. I’ve read other articles that say that the RUS is too conservative. That may or may not be true, because for the carriers I know it’s generally one or more of the above factors that have turned them off government money.

I don’t want to sound like I am trashing the program, because RUS loans have helped to fund many worthwhile projects. But a borrower needs to weigh all of their options and consider all of the costs of borrowing money from different sources. Borrowing money is about a whole lot more than just the interest rate, and you need to take all of the other aspects of any loan into consideration.