The Future According to CenturyLink

Recently the CFO of CenturyLink, Stewart Ewing, spoke at the Bank of America / Merrill Lynch 2014 Global Telecom and Media Conference. He had some interesting things to say about the future of CenturyLink that contrast with what other large carriers like AT&T and Verizon have been saying.

The most interesting thing he had to say is that CenturyLink sees the future of broadband in the landline link to a home. He cannot foresee 4G wireless as a substitute for a landline connection, and he doesn’t see wireless delivering enough bandwidth in coming years as demand at homes keeps growing. Already today the average CenturyLink residence uses slightly less than 50 gigabytes of data per month, and that is far above the data caps for 4G plans. He doesn’t think cellular can deliver the needed speeds, and unless the cellular business model is drastically changed, it’s too expensive and capped at levels that are far too low.
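As a rough sanity check on that number, here is a back-of-envelope look at how quickly ordinary streaming piles up against a monthly cap. The bitrate and viewing hours below are my own illustrative assumptions, not CenturyLink figures:

```python
# Rough, illustrative math: how household streaming alone approaches 50+ GB per month.
# The stream bitrate and viewing hours are assumptions, not CenturyLink data.

STREAM_MBPS = 2.0          # typical standard-definition adaptive stream
HOURS_PER_DAY = 2.5        # assumed household viewing time
DAYS_PER_MONTH = 30

seconds = HOURS_PER_DAY * DAYS_PER_MONTH * 3600
gigabytes = STREAM_MBPS * seconds / 8 / 1000   # megabits -> megabytes -> gigabytes

print(f"Streaming alone: ~{gigabytes:.0f} GB per month")
# Roughly 68 GB before web browsing, software updates or a second viewer -
# well past the caps common on 2014-era 4G data plans.
```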

So CenturyLink plans to continue to upgrade its rural plant. About two thirds of CenturyLink’s customers can get 10 Mbps or higher today and the company is working to make that available everywhere. Contrast this to AT&T and Verizon. They have both told the FCC that they have plans to convert millions of rural lines to 4G LTE. I have written about this many times and see it as one of the biggest threats on the horizon to rural broadband. LTE is a great product when you want a burst of fast data to your cell phone. But the LTE network is not designed to serve multiple video streams to large numbers of households. 4G is also capable of some fairly impressive speeds that are significantly in excess of 10 Mbps, but those speeds drop quickly as you move away from a cell site.

It’s fairly obvious that AT&T and Verizon favor LTE since it is in their own best economic interest - their wireless operations dwarf their landline businesses. Nobody can argue with a straight face that LTE is the best way to provide data to customers on either a performance or a cost basis. Cellular coverage is still poor in a lot of rural America, so forcing people off of copper and onto wireless will degrade the ability of many rural households to get broadband. But these two companies have a big financial incentive to move people from low-priced landlines to expensive cellular connections. It makes me think that if the FCC really cares about rural America it ought to divest the landline business from AT&T and Verizon to remove the wireless bias.

CenturyLink says they are worried about the FCC changing the definition of rural broadband to be higher than 10 Mbps. They say that speed is difficult to achieve in their DSL plant and that they are far more comfortable with a definition of around 6 Mbps. It’s honestly refreshing to hear a telco executive tell people the truth for a change. The other big telcos spew a lot of rhetoric to sway the FCC or to assuage the public and it’s unusual to hear somebody tell the unvarnished truth to the public.

Those who follow my blog know that I promote a high definition of broadband. Households want the ability to stream multiple videos simultaneously. And you can expect in just a few years for there to be a much greater demand for HD quality video and a climbing demand for 4K video. The average urban household that has choice is already finding 10 Mbps to be far too slow. Just this week Verizon increased its minimum FiOS speeds to a symmetrical 35 Mbps. I know this is a really big technological expectation for CenturyLink and other rural telcos still using copper, but the definition of broadband needs to keep pace with what the normal household wants to buy, and that number is going to keep climbing year after year. If we don’t set the bar high then rural places are going to fall further behind the speeds available in cities.

CenturyLink does expect to continue to expand the footprint of its Prism TV product. This is a bonded-pair DSL product that can deliver up to 50 Mbps for customers located reasonably close to a central office. CenturyLink has made this available to over 2 million customers and plans to make it available to 300,000 more in 2015.

Interestingly, CenturyLink does not plan to expand WiFi hotspots. Some other carriers seem to be in a race to put in hotspots, but CenturyLink cannot see a way to monetize the effort. Of course, CenturyLink will put a hotspot in for a business that asks for one, but they don’t intend to build hotspots of their own. I have also written about this topic several times. Outside of captive audiences like airports and airplanes, nobody has been able to get much money from selling Internet access by the hour. And while the giant cellular carriers benefit greatly from more WiFi, I have yet to hear of a deal where they are paying somebody to install public hotspots. Comcast says they have installed hundreds of thousands of hotspots and they recently announced that they are turning the wireless modems of all of their residential customers into hotspots. But to me that seems more like a move that is going to antagonize the public with little obvious monetary benefit. I think CenturyLink is being prudent to stay away from this business until somebody shows a way to make money with it.

Cracking Down on USF Fraud

Last week the FCC announced what it is calling a USF Strike Force that will be “dedicated to combating waste, fraud and abuse” in the Universal Service Fund (USF). This Strike Force will be headed by Loyaan Egal, formerly a senior assistant US attorney in the fraud and public corruption section of the US Attorney’s Office for the District of Columbia.

They are not just talking about doing audits and getting refunds of improper charges to the USF. They are going to partner with the FCC’s Office of the Inspector General and the US Department of Justice to prosecute any unlawful conduct they find.

It is, of course, always a good idea for any federal program to make sure that the funds it hands out are going where they are supposed to go. But that is not the only motivation in this instance. The FCC recently announced that it was going to dedicate an additional $2 billion over two years towards expanding broadband into schools. This is not new funding; instead the FCC wants to find this money within the existing USF. Part of that funding is going to come from redefining what schools are eligible to receive out of the Schools and Libraries Fund. For example, some schools today get reimbursed for phone lines and pagers, and that is expected to go away. But the FCC hopes to find the rest of the money by aggressively seeking out and ending abuse in the USF.

To put this into perspective, let’s look at the size of the USF in 2013. Last year the fund collected over $8.3 billion. Those funds were used as follows:

  • $4.17 billion – Rural High Cost Program
  • $1.80 billion – Lifeline Program
  • $2.20 billion – Schools and Libraries Program
  • $0.09 billion – Rural Health Care Program
  • $0.07 billion – Pilot Program for Health Care
  • $0.11 billion – Administrative Expense

The FCC probably needs to find $500 million per year in cost savings to fund its school initiative. We won’t know the exact number until the FCC puts out more detail on the amended reimbursement rules for the Schools and Libraries Program. But in order to find $500 million the investigators will have to determine that roughly 6% of the monies being paid out of the USF today are fraudulent. That is a tall order. Do they really think there is that much fraud in the program, and if so, why haven’t they acted before now?
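The 6% figure is simple arithmetic against the 2013 totals above; note that the $500 million annual target is my own working estimate of what the school initiative requires:

```python
# Quick check of the 6% figure: the assumed savings target as a share of 2013 USF payouts.
usf_2013_collections = 8.3e9      # total fund collections, from the table above
annual_savings_target = 500e6     # assumed annual target for the school initiative

share = annual_savings_target / usf_2013_collections
print(f"About {share:.1%} of the fund would have to be found in waste and fraud")  # ~6.0%
```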

There have been some recent headlines about fraud in the Lifeline Program. This program pays up to $9.25 per month towards a phone for low-income customers. Just recently the FCC and FBI announced that three Florida men had defrauded the Lifeline program of $32 million by charging the fund for imaginary customers. One would have to assume that is a multi-year number, and it will take a slew of additional examples of fraud like this to uncover $500 million per year.

Anybody who receives USF ought to be nervous about this investigation, even if they are following the rules to a tee. One can expect audits and questionnaires asking recipients to prove their eligibility for USF funds. Over the years I have reviewed the USF filings from numerous firms that draw from the fund and it is not unusual to find errors in their filings, due mostly to not understanding the filing rules. The rules for being eligible for USF are complex, particularly in the High Cost Fund, and companies routinely hand this work off to outside consultants. I would be at least a little nervous that an error in my filing could be interpreted as fraud.

I also have one last word of advice to the FCC. Take a look at the $110 million a year it costs to administer these funds. While that is only 1.3% of what is collected, it just seems like this function could be done for a lot less than $110 million per year.

Something New

I often report on new scientific breakthroughs, but today’s blog is about a few new technological applications that will benefit our industry. I rarely go a day without seeing some new innovation, and I thought the following were the most interesting ones I’ve seen lately.

First, Apple is working on a fuel cell for portable devices. This would be a big breakthrough for laptops, tablets and other sizable portable devices because fuel cells have some distinct advantages over the batteries we use today. For one thing, the chemicals that power a fuel cell can be refilled, so the battery could theoretically be kept going for a very long time.

This is important because laptop batteries today are considered toxic waste. Fuel cells would cut down on pollution by eliminating some of the nastier metals used in today’s batteries. By lasting longer they would also cut down on the huge number of batteries we are burning through as a society. For instance, Apple could develop a universal tablet battery that you would keep as you move through newer generations of devices. Our devices today are not particularly green, and disposing of them is creating a challenge for landfills and groundwater, mostly due to the batteries. So small fuel cells would be a big step towards greener devices.

Next, the Korean manufacturer LG has announced a TV screen that can be rolled up like a poster. There have been flexible TVs around for a few years, but LG has made a screen that can be rolled up into a 1.5 inch tube. This will drastically change the supply chain for TVs. Today they are expensive to ship and store because of their size, and they often get damaged in transit. But TVs this flexible can be sent by UPS in a tube.

The LG technology can also produce transparent TVs. This opens up a whole new world of applications for TVs and monitors. For instance, you could put a TV on a bathroom mirror to watch while you brush your teeth. They could go on any window or wall and would disappear when not being used as TVs. I remember reading science fiction books many years ago that predicted there would be TVs everywhere in the future, and with this technology that might finally be possible. These screens also advance the trend of separating the TV electronics from the screens. We will be able to put screens anywhere, controlled by the same centralized smart box.

LG says they will be able to make a transparent 60-inch flexible TV capable of 4K quality by 2017. But the promise of this technology is not just for giant TVs, but also for little TV screens that can be put anywhere – in the bathroom, kitchen, shop, garage – wherever somebody wants to watch a screen. The biggest outcome of cheap TVs everywhere would be an explosion in the demand for bandwidth. It’s not hard to picture households wanting to have ten football games on at the same time on Sundays.

Google has announced a new feature for Android that allows devices in proximity to each other to connect automatically. They are calling this technology Nearby. Any device using this technology would seek out and find other nearby devices and connect to enable communication. This has a lot of applications. For example, when friends or family meet, their phones could automatically sync up and update calendars or whatever else they want to share. This technology might be the platform that lets stores contact shoppers as they pass through the store to offer specials or point out items of interest. And for the Internet of Things this is a handy way to make the smartphone the controller of other devices. Whenever you walk into a room in your house your phone would instantly be talking to all of the Nearby devices there.

Nearby would do this by automatically turning on Bluetooth, WiFi and the microphone as needed. There are some privacy concerns about this capability, and certainly there will be apps that let each user set the degree to which they are willing to be open to others and control who is able to connect to them. But Google is counting on most people wanting an interactive shopping experience, and that is probably what they see as the biggest commercial application of the technology. Google has been looking for a way to compete with Amazon in the shopping arena, and this might be that platform. Where Amazon dominates the online shopping experience, Google could come to dominate the in-store shopping experience.

 

What’s the Truth About Netflix?

Clearly a lot of customers around the country are having trouble with Netflix. The latest round of finger pointing is going on between Verizon, Netflix and some intermediate transport providers.

Netflix uses adaptive streaming for its standard quality video, and this only requires about 2 Mbps at the customer end to get the quality that Netflix intends for the video. HD videos require more bandwidth, but customers are complaining about standard video. A Netflix download requires a burst of data up front so that the movie can load ahead of the customer, but after that it stays steady at the 2 Mbps rate and the download even pauses when the customer pauses. It’s getting hard to find an urban ISP that doesn’t deliver at least that much speed, so one would assume that any customer who subscribes to at least 2 Mbps data speeds should not be having trouble watching Netflix.
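To make that burst-then-steady pattern concrete, here is a minimal sketch of how an adaptive player behaves. The buffer target, link speed and one-second timestep are illustrative assumptions on my part, not Netflix’s actual player logic:

```python
# Minimal sketch of why a 2 Mbps stream looks like a burst followed by a steady trickle.
# All of the numbers are illustrative; this is not Netflix's actual player algorithm.

STREAM_MBPS = 2.0        # bitrate the player has settled on
BUFFER_TARGET_SEC = 30   # how far ahead the player tries to stay
LINK_MBPS = 10.0         # what the access line can deliver

def download_rate(buffered_sec: float) -> float:
    """Rate the player pulls at, given how much video is already buffered."""
    if buffered_sec < BUFFER_TARGET_SEC:
        return LINK_MBPS   # burst: fill the buffer as fast as the line allows
    return STREAM_MBPS     # steady state: only replace what is being watched

buffered = 0.0
for t in range(60):                            # simulate one minute in one-second steps
    rate = download_rate(buffered)
    buffered += rate / STREAM_MBPS - 1.0       # seconds of video gained minus one second played
    if t in (0, 5, 10, 30):
        print(f"t={t:2d}s  rate={rate:4.1f} Mbps  buffer={buffered:5.1f}s")
```

Once the buffer is full, a steady 2 Mbps is all the connection needs to sustain, which is why a fast access line struggling with standard-definition Netflix points to a problem somewhere other than the last mile.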

But they are. On their blog Verizon talks about a customer who has a 75 Mbps product and who was not getting good Netflix quality. On that blog Verizon says that it checked every bit of its own network for possible choke points and found none. For those not familiar with how networks operate, a choke point is any place in a network where the amount of data traffic passing through could be larger than the capacity at the choke point. In most networks there are generally several potential chokepoints between a customer and the outside world. In this blog Verizon swears that there is nothing in its network for this particular customer that would cause the slowdown. They claim that the only thing running slow is Netflix.

This is not to say that there are no overloaded chokepoints anywhere in Verizon networks. It’s a big company and with the growth of demand for data they are bound to have choke points pop up – every network does. But one would think that their fiber FiOS network would have few chokepoints and so it’s fairly easy to believe Verizon in this instance.

Verizon goes on to say that the problem with this Los Angeles customer is either Netflix or the transit providers who are carrying Netflix traffic to Verizon. Verizon is not the only one who thinks it’s the transit interface between the networks. Here is a long article from Peter Sevcik of NetForecast Inc. that shows what happened to the Netflix traffic at numerous carriers both before and after Netflix started peering directly with Comcast. This data shows that traffic got better for everybody else immediately upon the Comcast transition, which certainly indicates that the problem is somewhere in the transit between Netflix and the ISPs.

Verizon says the problem is that Netflix, or the intermediate carriers, don’t want to buy enough bandwidth to eliminate chokepoints. Sounds like a reasonable explanation for the troubles, right? But then Dave Schaeffer, the CEO of Cogent, came forward and pointed the finger back at Verizon. He agrees that the problem is in the interface between Cogent and Verizon, but claims this is Verizon’s fault since Verizon won’t turn up additional ports to relieve the traffic pressure.

So now we are back to square one. The problem is clearly in the interface between Verizon and carriers like Cogent, but they are blaming each other publicly, and none of us outside of this squabble are going to know the truth. Very likely this is a tug-of-war over money, which would fall in line with complaints made by Level 3, which says that Verizon is holding traffic hostage to extract more money from the transit carriers.

The FCC is looking into this and it will be interesting to see what they find. It wouldn’t be surprising if there is a little blame on both sides, which is often the case when network issues devolve into money issues. Carriers don’t always act altruistically and sometimes these kinds of fights almost seem personal at the higher levels of the respective companies. The shame from a network perspective is that a handful of good technicians could solve this problem in a few hours. But in this case even the technicians at Verizon and the transit carriers might not know the truth about the situation.

FCC to Fund Experimental Broadband Projects

Last Friday the FCC voted to establish a $100 million fund to provide one-time grants for what it is calling experimental rural broadband projects. The announcement was high-level and we’ll have to wait a bit to see the specifics. Grant filings will be due within 90 days of the release of the final rules. The FCC hopes to award all of the grants by the end of this year.

Here are a few things that can be gleaned from the high-level release:

  • $75 million of the funds will be awarded based upon the ability of projects to deliver at least 25 Mbps download and 5 Mbps upload. $15 million will be awarded to projects in high cost areas that must deliver at least 10 Mbps download and 1 Mbps upload. And $10 million will be awarded for extremely high cost areas that must also deliver 10 Mbps download and 1 Mbps upload.
  • The awards will be made by comparing the amount of support requested per passing to the cost calculated for that area by the CAF cost model (see the sketch after this list). Those willing to take the least amount of money compared to the modeled costs should win the grants.
  • Those willing to serve Indian tribal areas will get a 25% bidding credit.
  • There will be some sort of cap set on the amount of any given award.
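Here is a minimal sketch of how that kind of scoring could work in practice. The dollar figures, the scoring formula and the way the 25% tribal bidding credit is applied are all my own illustrative assumptions, not the FCC’s actual rules:

```python
# Illustrative sketch of ranking grant requests against the CAF cost model.
# Figures and the credit treatment are assumptions, not the FCC's actual formula.

from dataclasses import dataclass

@dataclass
class Application:
    name: str
    requested_per_passing: float   # dollars of support requested per home passed
    caf_model_cost: float          # CAF cost model estimate for the same area
    tribal_area: bool = False

    def score(self) -> float:
        """Lower is better: requested support as a fraction of modeled cost."""
        ratio = self.requested_per_passing / self.caf_model_cost
        if self.tribal_area:
            ratio *= 0.75          # assumed effect of the 25% bidding credit
        return ratio

apps = [
    Application("Co-op A", requested_per_passing=800, caf_model_cost=3200),
    Application("Telco B", requested_per_passing=1500, caf_model_cost=3000),
    Application("Tribal C", requested_per_passing=1200, caf_model_cost=3000, tribal_area=True),
]

for app in sorted(apps, key=Application.score):
    print(f"{app.name}: score {app.score():.2f}")   # the lowest scores would win funding
```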

Here are a few of the things I can glean from these rules so far:

  • Nobody should expect these grants to pay for most or all of a broadband project. This is not going to be anything like the Stimulus grants. Some of those grants were for large amounts and paid for a substantial amount of construction. While $100 million may sound like a lot of money, expect the FCC to spread this money to a lot of projects in a lot of states to cover a wide range of technologies.
  • You are not going to get a lot per customer. These grants are going to reward those who can pay for most of the cost of a project on their own. So think of these grants as providing a little bit of assistance to construct a broadband project.
  • You better already have a project in mind, because 90 days is not a lot of time to understand the filing rules and to complete a grant application.
  • Most of this money, except for the portion aimed at very high cost places, must go to projects that can deliver at least 25 Mbps download to all of the customers in the proposed service areas. That is going to eliminate a lot of potential projects like point-to-point WiFi networks or even cellular 4G projects that might deliver that much bandwidth to a few customers close to a tower but a lot less bandwidth to those further away. This makes the grant a real technological challenge, since there aren’t a lot of technologies other than fiber or a coaxial cable network that can deliver that much bandwidth to everybody. But the awards are not going to be nearly big enough to fund building fiber. The FCC is walking a tightrope between wanting high bandwidth and expecting filers to pay for most of the project, and that combination is going to be hard for a lot of filers to meet.
  • Like any federal monies, these grants will come with a lot of paperwork. It’s one thing to accept the paperwork burden for tens of millions of dollars in stimulus grants, but you need to consider whether that burden will be worth it for a $1 million or smaller grant.
  • Because of the quick nature of the process and because the awards will only fund a portion of construction costs, these grants are going to favor incumbent providers who can submit projects that are already in their capital budgets. Since incumbents were already planning on paying the full cost of a project it will be easy for them to just take a little assistance.

The rules should be issued soon, and once we see the detailed rules we will understand more about who should and should not bother with this process. There were over 1,000 entities that expressed an interest in these grants at the beginning of the year. I am going to guess that a significant percentage of them will find that either they or their projects won’t qualify for the grants.

I hope my caution about the grant process doesn’t come across as too negative, but I have learned from experience that free money is not really free and usually comes with a lot of strings. So before spending money to file these grant requests make sure that you qualify and that you are requesting substantially less than the CAF cost model projects for your study area. There are plenty of folks out there who will be glad to charge you for filing a grant request even if you have little or no chance of winning one.

It’s Not Your Father’s 911

Since its inception in the late 1960s and widespread deployment in the 1980s we have all come to take 911 for granted. No matter where you are in the US, if you dial 911 you expect to be connected to an emergency center for police and fire service.

All telephone providers in the US are required by FCC rules to connect a caller to the proper 911 center based upon the caller’s location. These 911 centers are referred to as Public Safety Answering Points (PSAPs). PSAPs are operated by counties, cities or regional authorities. They vary in sophistication from large 911 centers in major cities that have hundreds of operators to small rural systems where the calls get routed to the local sheriff’s office and involve almost no technology.

I have recently seen two different sets of headlines that put 911 back in the news. The first was about the emergence of text-to-911, where texting 911 will connect you to the closest PSAP. This grew out of the movement to create next generation 911, which has the goal of allowing voice, text or video emergency calls from any communications device using IP. Historically 911 has been limited to voice calls made from landlines or cellphones, except for calls made by deaf and hearing-impaired people using teletypes and other similar devices.

In 2013 the largest wireless carriers began testing text-to-911 with some large urban PSAPs. People can text 911 and be connected to their PSAP, which will then respond to them via text. The genesis of this upgrade is to provide 911 access from anywhere for the hearing-impaired, who today can only do this using special devices. But texting to 911 would be available to anybody.

The FCC issued a policy statement in January of this year saying that every wireless carrier should provide text-to-911 service, although it is not yet mandatory. The FCC also mandated that the wireless carriers send back a ‘bounce-back’ message to the sender if they are unable to deliver the text to a PSAP. Without that return message a person would assume that the text message successfully got to 911. Both the FCC and the PSAPs encourage people to make the call by voice whenever possible and to only use text when there is no other alternative.

There was also some more disturbing recent news about 911. The FCC recently released data showing that in 2013, 90% of the 911 calls in Washington DC that originated from wireless devices did not deliver the precise location of the caller. This is chilling for several reasons. First, a large percentage of the population now only uses cell phones, so this is their only way to call 911. And second, not everybody knows their address when they call. If the caller is a child or a tourist they might not have any idea of their location. And sometimes callers who are in danger call 911 and can’t speak, and rely on 911 knowing where they are.

Mobile 911 determines a caller’s location using triangulation. This means that the 911 PSAP is able to ping back to the cell phone and see the location of several nearby cell towers. By looking at the relative strengths of those ping-backs the system was historically able to pinpoint a caller within 50 to 100 feet, often closer.
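As a simplified illustration of the geometry involved, here is a small trilateration sketch: given three towers with known coordinates and rough distance estimates (derived in practice from signal strength or timing), the caller’s position falls out of the math. The coordinates and distances are made up, and real E911 location determination is far more involved:

```python
# Simplified trilateration: locate a caller from three towers and distance estimates.
# Coordinates and distances are invented for illustration only.

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three circles |p - pi| = ri by linearizing pairwise."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise gives two linear equations in x and y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Towers at known coordinates (meters) and estimated distances to the caller.
print(trilaterate((0, 0), 500, (800, 0), 500, (400, 600), 300))   # -> (400.0, 300.0)
```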

But this system was established when there were relatively few cell towers, so it was fairly easy to locate a caller. Today there is a proliferation of cellular transmitting devices in the network, particularly in urban areas. The cell phone companies are reported to be installing millions of mini-cell sites this year – sites which act as cell towers but cover a much smaller area like part of a stadium, a busy street or a commuter bridge. Additionally, anybody can buy a cell phone booster. These are essentially cellular repeaters with a short range and are used to bring strong outside signals into the inside of a building.

But to a PSAP all of these devices look enough like cell towers to cause confusion in the triangulation algorithms. And so, where mobile 911 was once fairly accurate, it is now a jumbled mess in urban areas where there is a proliferation of transmitting devices. I am sure there is a technological solution to this, but it is going to require the cell phone carriers to start over and find a new way to locate a cell phone in an urban environment.

While the headline of 9 out of 10 calls being inaccurate sounds scary, the reality is that the lack of precise data didn’t affect most of these calls. Otherwise we’d be seeing a lot of shocking headlines. Remember that in most cases the 911 PSAP speaks to the caller, who can verify their location. And even when the mobile 911 system is not entirely accurate it probably gets close enough to be effective most of the time. But I remember the headlines in the early 80s when several people having heart attacks died because they called 911 from a payphone and didn’t know their location. I hope this latest report prompts the FCC and the cell companies to find a solution before we go back to those headlines again.

Securing the IoT

I read this week that a security company was able to hack into somebody’s WiFi network through a smart LED light bulb. This obviously points out a flaw in that particular brand of lights, but it highlights a much larger issue. How are we going to secure the Internet of Things?

Estimates vary widely, but by 2020 there are expected to be many billions of Internet-connected devices. Many of these devices will have been designed for a given purpose, but many will just be things to which we have added a cheap sensor. The vast majority of IoT devices will have little or no protection against online attacks. So the IoT is going to create billions of insecure endpoints in all of our networks.

Many of these devices will have very tiny and primitive processors incapable of running the kinds of security protection we use today, such as antivirus and anti-malware software. The devices are going to be built by a multitude of different companies and have a wide array of capabilities and vulnerabilities. And unless some standard is developed, the devices will use a multitude of different protocols such as Zigbee, WebHooks and IoT6. Then again, perhaps we don’t even want one standard, because that could make the whole world susceptible to a single effective virus.

Unlike today’s viruses, which cause computer and network problems, an IoT attack will be able to inflict real-world damage. The obvious examples always cited are attacks against insulin pumps or pacemakers. But damage can come from anywhere when hackers can address cars, heating and air conditioning systems, water systems and door locks.

There haven’t been many publicized hacks against IoT devices yet, mostly because hackers have so many other lucrative places to attack us. But I just read this month how hackers gained access to some electric company grids through their smart metering systems. It won’t take a lot of playing inside an electric network to cause real harm to generators, substations or transformers.

There are some proposed solutions to some of these problems. For example, smartphones and tablets today have elements like the SIM or a Trusted Execution Environment (TEE) that are secure cores out of the reach of hackers. We can load credentials into those safe environments, which lets us create a true identity for the device that can be validated by the rest of the network. The more sophisticated IoT devices could deploy the same sort of technology.

We can do something similar for ‘dumber’ devices using something akin to the chip-and-PIN systems that are used in Europe to protect credit cards. Those technologies allow banks to establish the identity of the person trying to complete a transaction.
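The common idea behind both the TEE and the chip-and-PIN card is a challenge-response check against a credential the device never reveals. Here is a minimal sketch of that idea; real secure elements use hardware-protected keys and far stronger protocols than this toy example:

```python
# Toy challenge-response identity check, sketching the idea behind secure credentials.
# A real secure element keeps the key in hardware; this is illustration only.

import hashlib
import hmac
import os

device_secret = os.urandom(32)        # would live inside the secure element / chip

def device_respond(challenge: bytes) -> bytes:
    """What the device computes inside its protected environment."""
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

def network_verify(challenge: bytes, response: bytes, known_secret: bytes) -> bool:
    """The network, holding the same credential, checks the device's answer."""
    expected = hmac.new(known_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)
print(network_verify(challenge, device_respond(challenge), device_secret))   # True
```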

But getting protection into the IoT is going to require both standards and compliance by manufacturers. Consider the American banking system, which has not implemented the same safety standards as Europe even after tens of millions of credit card numbers and PINs have been stolen multiple times. Having security standards will only work if the people making the IoT devices spend the money to implement the technology, and there will be plenty of manufacturers who cut corners on security to save money.

Further, many of the IoT technologies being contemplated involve swarms of very small sensors connected in clouds and used to monitor our environment. Whether they are deployed in our bloodstream to look for signs of illness or deployed in nature to watch endangered species, these devices will be so tiny that it will be impossible to add sophisticated security software.

Obviously solutions will be developed because the public will demand it. But before that happens I envision some dramatic and very public cases where hacking kills people or causes other real damage. This doesn’t have to be anything sophisticated. Turning toasters on to full heat overnight might burn down houses. Locking everybody in a town out of their houses by hacking into smart door locks would wake up the public to the dangers of the IoT. I fear we are in for some bumpy roads before we figure out how to do this right.

Funding Broadband to Schools

FCC Chairman Tom Wheeler recently announced that he was going to try to funnel $5 billion over the next five years into upgrading the bandwidth inside schools. He is proposing to do this as part of the E-Rate program by changing the things that fund will pay for. That raises the question of how and why the FCC has the funds available to pay for this sort of expenditure. So following is a bit of a primer on the E-Rate program.

The E-Rate program is part of the federal Universal Service Fund (USF). Since the 1960s there has been a universal service fund, originally administered by AT&T, that provided money to support the expansion of telephony into rural places. This was funded by a small surcharge on interstate long distance calls.

But when AT&T was broken up the funding for USF got murky, so Congress changed the administration of this funding as part of the Telecommunications Act of 1996. The Act directed the FCC to create the Universal Service Administrative Company (USAC), which collects and disburses funding for universal service. USAC is still funded based upon interstate telecommunications. Most telcos pass these fees along to customers on their bills, although this is not mandatory and companies could absorb the cost themselves.

The Universal Service Fund has four major components. The High Cost Fund pays to support providing telephony in rural places where the costs per customer are much higher than average. The Low Income Fund pays for some of the installation fees and also some of the monthly fees for telephone lines for low-income subscribers. The Rural Health Care Fund provides subsidies for rural tele-health and tele-medicine. And the Schools and Libraries Fund provides subsidies for Internet access and telecommunications services and infrastructure for schools and libraries.

The USF is undergoing major change due to changes ordered by the FCC in Docket 11-171 in 2011. The High Cost Fund is being transitioned into the Connect America Fund and will be used to support rural broadband instead of rural telephony.

The Schools and Libraries Fund is commonly referred to as the E-Rate program. The program was started in 1997 when the FCC determined that “telecommunications services, Internet access, and internal connections, including installation and maintenance,” for schools and libraries were eligible for discounted rates. The E-Rate program will pay from 20% to 90% of the cost of eligible services based upon the poverty level and the urban or rural nature of the population served by a given school. The fund pays the neediest schools first and works its way through the list of applicants each year until it runs out of funding.
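The mechanics of the discount are straightforward, as the short example below shows. The 90% rate at the top of the scale is real, but the $10,000 eligible cost is just an illustration:

```python
# Illustrative arithmetic only: how a discount-rate reimbursement works.
# The $10,000 eligible cost is an example, not actual E-Rate data.

def e_rate_split(eligible_cost: float, discount_rate: float):
    """Split an eligible cost between the fund and the school at a given discount rate."""
    fund_share = eligible_cost * discount_rate
    return fund_share, eligible_cost - fund_share

fund, school = e_rate_split(10_000, 0.90)   # a high-poverty school at the top rate
print(f"Fund pays ${fund:,.0f}, school pays ${school:,.0f}")   # Fund pays $9,000, school pays $1,000
```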

What the FCC is doing as part of Chairman Wheeler’s announcement is looking at the definition of what is eligible for reimbursement from the E-Rate program. These definitions haven’t been updated for a long time, and the fund still pays for things like pagers and telephone lines (although one has to imagine the payments for pagers must be very small).

The FCC now wants to divert some of the fund to help pay for the basic infrastructure at schools to deliver broadband to the classrooms. President Obama has announced a policy goal, referred to as ConnectED, of bringing faster broadband to all of the schools in the country. It sets a near-term goal of bringing 100 Mbps connections to schools and a longer-term goal of bringing gigabit speeds.

The FCC is responding to those policy goals with this change in E-Rate funding. In Docket WC 13-184 the FCC had looked at some of these issues and noted that there is a challenge in getting bandwidth from the wiring closet of a school into the classrooms. The FCC now wants some of the E-Rate funds to be used to rewire schools or to deploy other technologies that can bring bandwidth to where students can use it. It certainly is important for this fund to keep up with the times. It makes a lot more sense to use these funds to improve bandwidth at schools than to continue paying for telephone service and T1 lines.

CenturyLink and the Cloud

I don’t write many positive articles about the largest US telcos. This is mostly because these are the competitors for most of my ISP clients, but also because the big companies are on the wrong side of issues like net neutrality and privacy. It’s generally pretty easy to find things to dislike about any one of the big carriers.

But I have to say that I am impressed with CenturyLink’s foray into cloud computing. They got into it early and they have carved out a decent market niche. Cloud services are already a huge business and will grow much bigger. I read a recent statistic that only about 13% of US corporate data today is stored in the cloud. That leaves a lot of room for industry growth.

The two big giants of the cloud storage industry are Amazon and Google. In fact, Amazon is so large that it reportedly has five times more data center capacity today than the next 14 competitors combined, including Google. But I have also read reviews that describe Amazon as the ‘Walmart of cloud storage’. They are cheap – they have lowered cloud data storage prices 42 times since they started. But they are also somewhat generic, which comes from having a suite of products that tries to satisfy everybody.

But companies like CenturyLink and Peak 10 have created a niche in the cloud computing market by offering customized services. For example, Peak 10 has concentrated on the medical and the gaming industries. CenturyLink cut its teeth on providing services to governments and other large businesses.

There are several components to cloud computing – data storage, transport, computing power and software centralization. Amazon has clearly moved ahead of everybody else in storage capacity, but one has to wonder if this is a long-term advantage. It appears that data storage is moving towards being free, or nearly free. Obviously the upcoming Internet of Things is going to put more pressure on storage capacity, but the dropping price of data storage is what has led to the repeated Amazon price cuts.

CenturyLink competes much better in the transport arena. The company acquired Qwest, itself the product of the merger of Qwest and US West, which brought significant fiber assets throughout the US and the hemisphere. They continued to expand their fiber after the merger and most of the US is now close to CenturyLink fiber. While transport prices have dropped, particularly on the major intercity routes, transport to smaller markets is still a very lucrative business, and having fiber in those markets gives CenturyLink an advantage in many regions.

Amazon also has the edge today in computing power by virtue of owning so many data center assets. Amazon is not ahead only by virtue of the sheer number of data center computing assets; they have also been working feverishly on building faster and more energy-efficient servers and switches. This gives them a temporary market advantage, but these kinds of advantages usually don’t last long. There is a major industry shift towards software defined networking, and this is going to result in cheap data center routers and switches for everybody.

I wrote this blog as an example that it’s possible for a company to reinvent itself. I don’t think anybody has been thought of as stodgier than CenturyLink over the last decade. While Verizon and AT&T were adding data customers, CenturyLink struggled with old copper. But CenturyLink is now a player in cloud products and has recently launched credible new business lines by building fiber-to-the-home networks and rolling out its Prism DSL product, which is similar to AT&T’s U-verse. I hold CenturyLink out as an example to my clients. If CenturyLink can take the steps needed to stay relevant decades from now, then so can any other ISP.