Who’s Going to Get the Experimental Grants?

When the announcement came out for the FCC’s experimental grants I wrote that many of the entities who had expressed interest in those grants were going to be disappointed. Now that I have looked at the rules more deeply, let me talk about who is going to get these grants, or better yet, who is not going to get them.

Let’s first look at the largest $75 million portion of the grant. As you go through the rules you can quickly see who is not going to qualify for the money:

  • Must propose to deploy a network capable of delivering 100 Mbps downstream / 25 Mbps upstream, while offering at least one service plan that provides 25 Mbps down / 5 Mbps up to all locations within the selected census blocks. This eliminates WISPs and cellular carriers.
  • Grants are available only for areas that are not served today with broadband of at least 3 Mbps down / 768 kbps up. This eliminates cable companies that already offer DOCSIS 1.0 or higher cable modems, and it stops these grants from being used anywhere cable modems exist today.
  • Can only be used to serve in price-cap areas. This eliminates anybody that wants to serve an area served by a smaller independent telco.
  • Must be able to serve whole census blocks. Census blocks very often don’t follow geographic or political boundaries, so meeting this criterion might require building to an area that is very expensive or outside the planned territory.
  • Must submit three years of audited financials within 10 days and an irrevocable letter of credit within 60 days of award. This eliminates any start-up that is not already funded, and may make it hard for start-ups to qualify even with funding in hand. The feds were very disappointed that so much stimulus money was returned by start-ups who never got their matching funds.
  • This is a reverse auction, meaning that funds will go to entities that ask for the least amount of money. This eliminates anybody who isn’t willing to pay for the bulk of a project themselves. And just as a note, this is not a straight dollar comparison, but rather compares the amount of the grant requested to the theoretical cost of serving an area as calculated by the federal high-cost model.
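As a rough sketch of how such a reverse auction might rank bids (the bid data and scoring rule here are illustrative assumptions, not the FCC’s actual cost model), bids can be scored by the ratio of subsidy requested to the cost-model estimate for the same area and funded in ascending order until the budget runs out:

```python
# Hypothetical reverse-auction ranking. Bids are scored by the ratio of
# grant requested to the cost-model estimate for the area, then funded
# cheapest-ratio-first until the budget is exhausted. All figures below
# are made up for illustration.

def rank_bids(bids, budget):
    """bids: list of (name, requested, model_cost); returns funded names."""
    scored = sorted(bids, key=lambda b: b[1] / b[2])  # lowest subsidy ratio first
    funded = []
    for name, requested, _cost in scored:
        if requested <= budget:
            funded.append(name)
            budget -= requested
    return funded

bids = [
    ("Telco A", 2_000_000, 10_000_000),  # asks for 20% of model cost
    ("CLEC B",  4_500_000,  9_000_000),  # asks for 50% of model cost
    ("Muni C",  1_500_000,  3_000_000),  # asks for 50% of model cost
]
print(rank_bids(bids, budget=4_000_000))
```

Note how a deep-pocketed bidder asking for only a fraction of the project cost wins regardless of the merit of the underlying project, which is exactly the dynamic described above.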

So who is left? This leaves existing companies with good financial positions who are willing to build fiber, or a few types of point-to-point wireless technology, to rural areas served by large telcos where no existing service reaches 3 Mbps download today. And to win in the reverse auction, a company is going to have to pay for a lot of the project itself. This probably means telcos, CLECs or existing muni businesses that already have plans to build into rural areas and who would be glad to get part of such a project covered by a grant. That is a pretty small universe of companies, and the bulk of this money is likely to go to telcos.

These rules probably eliminate 95% of the entities that filed an interest in these grants this past spring. I hate to use the word sham, but in that process everybody was told that these grants were going to reward good ideas and that filers should be creative. And they were. There were numerous communities that want to build gigabit networks and bring themselves up to par with metro areas. But this auction coldly rewards the company willing to take the least amount of subsidy and continues until all of the money is awarded. There is zero reward in the process for creativity, need or anything other than asking for a low amount of grant dollars per passing.

The other $25 million of the grants are a little easier to qualify for. The big difference is that those networks don’t have to be capable of 100 Mbps or offer speeds of 25 Mbps download. Instead there must be products of at least 10 Mbps down and 1 Mbps up along with a cap of at least 100 GB of usage with a latency better than 100 msec. The 100 GB cap is probably going to eliminate satellite providers.

But these grants still require that the areas be rural with no existing service. And companies still need to be able to provide an irrevocable letter of credit within 60 days of award, eliminating start-ups. But this smaller pile of grants will be available to WISPs and cellular companies along with anybody already eligible for the $75 million grants. I would expect most of this money to go to wireless companies willing to pay for the bulk of projects themselves.

Politics and Municipal Competition

Not long ago I wrote a blog that looked in amazement at how political the issue of net neutrality had become, and how it was being debated more along partisan lines than on the merits of the issue. And I noted that it was the biggest political issue I could remember during my career in the industry, which started back in the 70s.

And now, in a very short period of time national politics has entered our industry again on the issue of allowing municipal competition in broadband. I find this issue interesting because it looks at state barriers to competition, and states vary widely on how they handle the issue today. The press reports that there are 20 states that have a ban on municipal competition, or else rules that are draconian enough to effectively stop it.

This issue has been political for years at the state level. There is a group, the American Legislative Exchange Council (ALEC) that writes and promotes various laws that support conservative policies. These laws are then introduced into state legislatures whenever the environment seems ripe. ALEC has been pushing anti-municipal broadband legislation for several decades. In recent years we’ve seen bans enacted against municipal broadband in North Carolina in 2011 and South Carolina in 2012. Just this year an ALEC bill was introduced in Kansas.

But the fight has now moved from the state legislatures to the FCC. In the last week there were two petitions filed at the FCC by existing municipal fiber systems in Chattanooga, Tennessee and Wilson, North Carolina. These petitions ask the FCC to remove barriers in those states that prohibit those municipal fiber businesses from expanding outward to serve other communities. The FCC accepted these petitions and has asked for comments by August 29th. This probably means that the FCC will consider granting the petitions, which is in line with statements made all year by FCC Chairman Tom Wheeler who says that there should be no restrictions on municipalities from building fiber networks.

The FCC granted the hearing of these petitions in the face of open political opposition. On July 16 the House of Representatives voted to strip the FCC of any authority to allow communities to pursue broadband businesses. The bill passed 223 – 200, with Republicans voting 221 – 4 in favor. Just like with net neutrality, this issue is heavily partisan, with Republicans staunchly against municipal competition while Democrats, while not so staunch, seem these days to be for whatever the Republicans are against.

And so, just like net neutrality, this debate has left the arena of public discourse and now is highly partisan. This seems odd to me since there is already a lot of municipal competition in the states that allow it. It’s been reported that there are over 150 communities that have built and are operating fiber networks to customers. And there are hundreds more that have built fiber networks to serve their own government, schools and even sometimes large businesses. And so we have many examples of how municipal competition works and what it means to a community. This is not a national fight on whether cities can get into the fiber business, but rather we are debating whether states can prohibit it.

It’s interesting that the Chattanooga petition to the FCC quotes Republican Senator Trent Lott, speaking at the time of the Telecommunications Act of 1996, saying that the Act would help communities to compete in the telecom space. So this hasn’t always been so clearly partisan, but like many issues it is now clearly divided along party lines.

I don’t like this political fight any more than I like the one on net neutrality. Call me old fashioned, but I would rather see topics in our industry decided on their merits rather than divided straight along party lines. There certainly are arguments to be made on both sides. But to me, the argument that trumps them all is that there are tens of thousands of small rural communities that don’t have sufficient broadband. And in most cases there is nobody lining up to build broadband networks in these places.

I say that we should let localities decide on their own what is in their best interest. Fiber networks and Internet access are growing into a natural utility like water lines and electric lines. Communities without broadband are at risk of withering away and becoming irrelevant. I look at this issue in the context of what happened with electricity a hundred years ago. At that time big companies scrambled to build electric networks in all of the major cities. But rural America was left behind, and many small towns decided to electrify themselves in order to stay relevant and to be places where people wanted to live. Broadband is this century’s electricity. In those places where no incumbent steps up to bring broadband, the local community needs to have the right to do it on its own.

My Meeting With the FCC

Last week I went to an ex parte meeting with the FCC on behalf of Pulse~Link, Inc. They are a California company that has developed practical applications of ultrawideband technology. The purpose of the meeting was to let the FCC technical staff know of the existence of this technology because it solves some of the issues they are currently actively engaged with.

For those of you who have never been to the FCC, it is an interesting process. In a post-9/11 world the front of the building is blocked by concrete barriers. When you go through security they not only check your ID, but they give you a badge with your picture on it. Whenever you officially meet with FCC staff or Commissioners to discuss anything industry related it’s considered an ex parte meeting. This means that a formal letter must be filed soon after the meeting to document who you met with and what you discussed. I like this process because it stops the large companies from lobbying staff without at least documenting the meeting. I have gathered some interesting information for my blog from the memos generated by ex parte meetings.

But this is a technology blog, so let me talk about the technology. Ultrawideband (UWB) is a technology that the FCC promoted for wireless applications in 2002 in Docket 02-48. Ultrawideband works at very high frequencies and is similar to 802.11ad in the wireless world. But Pulse~Link has been able to take this technology and put it onto any wired medium – coaxial cable, telephone copper or even electric wires. They are marketing this as a chipset, under the brand name CWave, that can be used in a wide variety of existing devices.

The most exciting uses of the technology are on coaxial cable, and it is these uses that got me to this meeting. I see this as a disruptive and transformational technology. The best way to think of UWB is as DSL over coax. It can provide a separate and distinct Ethernet path at frequencies far above the cable TV and data signals normally sent over coax, without any interference.

And the amount of data is significant. The chipset Pulse~Link has today has a raw data rate of 1.3 Gbps, or an effective throughput of about 450 Mbps. Today Pulse~Link can pair two of these chips to create nearly a full gigabit of extra bandwidth on a piece of coax. That is impressive and has a lot of immediate applications, but the real promise of the technology is the upgrade path. There is already a chip in the lab working at a raw rate of 5.4 Gbps. And by 2017 the company expects to have a chip that can operate at over 12 Gbps.
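A quick back-of-the-envelope check on these figures: since two paired chips yield nearly a full gigabit, the effective per-chip throughput works out to roughly 450 Mbps, or about 35% of the 1.3 Gbps raw rate (the implied overhead is derived from the quoted numbers, not from any published spec):

```python
# Consistency check of the quoted Pulse~Link figures. The efficiency
# (effective throughput divided by raw rate) is inferred from the
# numbers in the text, not taken from a datasheet.
raw_gbps = 1.3
effective_mbps = 450
efficiency = effective_mbps / (raw_gbps * 1000)  # effective share of raw rate
paired_mbps = 2 * effective_mbps                 # two chips bonded together
print(f"efficiency = {efficiency:.0%}, paired = {paired_mbps} Mbps")
```

If the same roughly one-third efficiency holds for the lab chips, a 5.4 Gbps raw rate would deliver close to 2 Gbps of effective throughput per chip, which is what makes the upgrade path so interesting.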

The immediate uses for the technology are for applications where additional bandwidth on coaxial cable can have an immediate market use. Consider the following:

  • Distribute Increased Bandwidth in Schools. CWave can be used on the existing coaxial cable found in most schools to distribute large bandwidth to classrooms. One of the greatest challenges for schools identified in the FCC WC 13-184 docket is how to distribute big bandwidth from the wiring closet to the classroom. CWave can distribute large-pipe Ethernet at a fraction of the cost of newly wiring or rewiring schools with category 5 cable. CWave can help to stretch Schools and Libraries funding to cover a lot more schools.
  • Higher Data Capacity on Cable Systems. CWave operates in RF spectrum on coax that is normally considered unusable for traditional cable TV. This ability enables a second Ethernet path on existing HFC plant that does not interfere with existing programming or data. CWave offers an affordable alternative for rural and small-town cable systems to upgrade to gigabit data speeds at costs far lower than taking the traditional migration path through DOCSIS 3.0 and 3.1.
  • Competition in MDUs. Because CWave operates above the frequency used by traditional cable TV, a service provider can use the technology over existing coaxial cable to offer broadband services inside MDUs without physically disturbing the incumbent cable provider. Almost all fiber over-builders, including Google, Verizon, CLECs and municipalities, have encountered roadblocks in seeking to serve MDUs due to the high cost of rewiring. CWave can help bring competitive high-bandwidth services to this significant demographic.
  • Bandwidth in Hospitals. Hospitals require ever-increasing bandwidth to support new medical equipment and technologies. Because of the complexity of the structures, hospitals are notoriously expensive to rewire. CWave can distribute bandwidth inside hospitals, particularly the rural hospitals that are not wired with category 5 cable.

The Broken Cable Industry

Last week Mediacom filed a petition asking the FCC to open a rulemaking on regulating programmers. The FCC is under no obligation to act on this request, but I hope that they do.

Mediacom makes a point that I totally agree with, which is that the relationship between programmers and distributors is broken. The industry has evolved to the point where the programmers dictate all of the terms for carrying their content. In the early days of the industry the power was mostly with the cable companies. The FCC changed that with the Cable Act of 1992, which required cable companies to carry local networks. But over the time since then the industry has tilted nearly 100% in favor of the programmers.

The power of the programmers is due in large part to the fact that the ownership of content has become concentrated in the hands of a few companies. There are six corporations that together own most of the programming that people think of as part of the cable TV experience. This has happened through mergers and acquisitions, and the situation might get even worse if Rupert Murdoch and News Corp are successful in buying Time Warner.

Condensed media power has resulted in the following:

  • Programmers dictate what content cable companies must buy. Each of the six major programmers owns some ‘must see’ channels but each also owns many other channels. Most programming contracts require a cable company to take everything offered by a programmer in order to get the must see channels.
  • Additionally, the programmers dictate where many of the channels must be placed in the line-up. This is why the line-ups of different cable companies look so eerily alike – they don’t have a lot of options about how to create packages from the content they buy. I have one client who recently got into the cable business. They hoped to have a basic lineup of around 60 channels, but the programming contracts instead forced them into carrying nearly 85 channels.
  • The programming rate increases are accelerating. Some of the recent rate increases I have seen are mindboggling. Earlier this year Viacom forced a nearly 100% rate increase over five years upon several cable providers. But the largest rate increases have been for local network channels. Five or six years ago these channels (ABC, CBS, FOX and NBC) were free or nearly free for cable companies. Now most of my clients are paying between $8 and $10 per customer per month just for these networks.
  • To make matters worse for cable companies, the programmers turn around and make a lot of their content available online on a delayed basis to companies other than the cable companies. This means the cable companies pay to carry the content and then see customers watching it somewhere else (with somebody else’s ads).

All of this translates into runaway inflation in cable rates. I’ve seen various industry estimates that average cable rates by 2020 could be around $200 per month. The high cost of cable is obviously affecting households and is starting to drive people to find an alternative to the big cable packages.
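Those $200-by-2020 estimates are consistent with the kinds of increases described above. As a rough sanity check (the starting bill and growth rate here are illustrative assumptions, not industry data), an average bill in the mid-$80s compounding at around 15% per year lands near $200 in six years:

```python
# Back-of-the-envelope projection of average cable rates. The starting
# bill ($85/month in 2014) and 15% annual increase are illustrative
# assumptions chosen to show how quickly compounding gets to ~$200.
start_bill = 85.0
annual_increase = 0.15
years = 2020 - 2014
projected = start_bill * (1 + annual_increase) ** years
print(f"Projected 2020 bill: ${projected:.0f}/month")
```

The point of the exercise is that compounding does the damage: even rate increases that look modest in any single year roughly double the bill within six or seven years.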

This is not to say that the cable companies don’t also have policies that are making cable too expensive. For example, they keep increasing rates on set-top box rentals and cable modems while also introducing other mysterious charges onto cable bills.

Mediacom has asked the FCC to explore this issue and to find a fix. They call this a broken industry and they are right. The constant large rate increases cannot be sustained and every rate increase is driving more customers away from traditional cable TV.

Mediacom is asking for common sense fixes. They are asking to be allowed to unbundle the programming packages and buy only those channels they want and to then be able to package content to customers in ways that they want to buy it. These changes will require FCC intervention, because right now all of the terms are being dictated by the programmers. Every company that offers cable would benefit by these changes and if the FCC opens a docket I will be making comments.

If the FCC fails to act, I predict that within a few years the industry will implode. There will come a point where households abandon cable in sufficient numbers to put that part of the industry into a death spiral. It’s a tricky set of issues, and one wonders how much authority the FCC has to make drastic changes without some assistance from Congress. But if nothing is done then the industry is headed over a cliff in the not too distant future.

Net Neutrality Comments at the FCC

The FCC’s Net Neutrality docket got over 1 million comments, most from ordinary Americans who are worried about the large ISPs and web companies colluding to restrict or hijack their Internet experience. I read through some of these comments and people are universally worried about companies like Comcast and Google getting together to limit what they can do on the web. The public does not want to see a network provider have the ability to slow down their Internet experience or to dictate which web sites they can use.

Obviously I didn’t read all of the comments in this docket and one has to wonder if anybody at the FCC can or will read it all. That’s a tall task. But I did look at the comments of the larger carriers and web companies to see what they have to say. There were no surprises with the big ISPs on one side of the issue and almost everybody else on the other.

AT&T is in favor of no additional regulation of the Internet, meaning they would be free to prioritize traffic if they wish. This could obviously make them a lot of money. AT&T says if there must be regulation they would prefer it come through Section 706 of the Telecommunications Act, the section that directs the FCC to promote broadband deployment. AT&T is totally against having a Title II classification of the Internet as a common carrier business. And not surprisingly, AT&T is not in favor of regulating data for wireless carriers.

Comcast is also against Title II classification as a common carrier and they prefer no regulation at all. Comcast says that they are already a good web citizen and don’t need to be regulated, but even if they were there would be loopholes that would allow carriers like them to discriminate. This seems like an odd argument to make from a company that wants approval for a giant merger. Comcast says that if there is regulation that it should also apply to public Wi-Fi and mobile broadband.

Verizon had the longest comments I saw. Verizon believes the best solution is the least amount of regulation possible. They think the market will control carriers because customers won’t accept being throttled. Verizon says the real threat to the Internet comes from companies like Google, Netflix and Amazon. And obviously they are very much against Title II regulation.

On the other side of the argument is, well, just about everybody else except a few other cable companies. There were a few filings that represented groups of Internet-based companies. The Information Technology Industry Council represented companies like Apple, Facebook, Google, Intel, Microsoft, Yahoo and many others. They argue that the FCC needs to put in rules to protect consumers, but also to protect both small and large web-based companies. They are not in favor of Title II regulation but instead would like to see something similar to the rules that were vacated by the courts.

The Internet Association represents Amazon, eBay, Expedia, Facebook, Google, LinkedIn, Twitter, Netflix, Yahoo, Yelp and many others. As you might have noticed, Google and Yahoo are in both industry groups. This group also doesn’t support full Title II regulation but thinks that the FCC needs to find ways to stop the ISPs from discriminating and wants the FCC to support application-agnostic network management. They want the same rules to apply to wireless carriers.

Netflix is at the core of a current battle over network neutrality. Netflix is about the only big tech company I could find in favor of Title II regulation. They think anything short of full Title II reclassification will just be asking for another court battle that the FCC will eventually lose.

One has to wonder if the volume of public comments means anything. It’s clear where the public stands on this issue and people are afraid that the Internet is going to change to their detriment. They already see the ongoing battle between Verizon and Netflix and they don’t want to see a future where their web experience is dependent upon how ISPs and content providers are getting along. When they buy an amount of bandwidth from an ISP they want whatever fits into that bandwidth to work.

The Future According to CenturyLink

Recently the CFO of CenturyLink, Stewart Ewing, spoke at the Bank of America / Merrill Lynch 2014 Global Telecom and Media Conference. He had some interesting things to say about the future of CenturyLink that contrast with what other large carriers like AT&T and Verizon have been saying.

The most interesting thing he had to say is that CenturyLink sees the future of broadband in the landline link to a home. He cannot foresee 4G wireless as a substitute for a landline connection. He doesn’t see wireless delivering enough bandwidth in coming years as demand at homes keeps growing. Already today the average CenturyLink residence uses slightly less than 50 gigabytes of data per month, and that is far above the data caps for 4G plans. He doesn’t think cellular can deliver the needed speeds, and unless the cellular model is drastically changed, it’s too expensive and capped at really low levels.

So CenturyLink plans to continue to upgrade its rural plant. About two thirds of CenturyLink’s customers can get 10 Mbps or higher today and the company is working to make that available everywhere. Contrast this to AT&T and Verizon. They have both told the FCC that they have plans to convert millions of rural lines to 4G LTE. I have written about this many times and see it as one of the biggest threats on the horizon to rural broadband. LTE is a great product when you want a burst of fast data to your cell phone. But the LTE network is not designed to serve multiple video streams to large numbers of households. 4G is also capable of some fairly impressive speeds that are significantly in excess of 10 Mbps, but those speeds drop quickly as you move away from a cell site.

It’s fairly obvious that AT&T and Verizon favor LTE since it is in their own best economic interest – their wireless operations dwarf their landline businesses. Nobody can argue with a straight face that LTE is the best way to provide data to customers on either a performance or a cost basis. Cellular coverage is still poor in a lot of rural America, so forcing people off of copper and onto wireless will degrade the ability of many rural households to get broadband. But these two companies have a big financial incentive to move people from low-priced landlines to expensive cellular connections. It makes me think that if the FCC really cares about rural America they ought to be divesting the landline business from AT&T and Verizon to remove the wireless bias.

CenturyLink says they are worried about the FCC changing the definition of rural broadband to be higher than 10 Mbps. They say that speed is difficult to achieve in their DSL plant and that they are far more comfortable with a definition of around 6 Mbps. It’s honestly refreshing to hear a telco executive tell people the truth for a change. The other big telcos spew a lot of rhetoric to sway the FCC or to assuage the public and it’s unusual to hear somebody tell the unvarnished truth to the public.

Those who follow my blog know that I promote a high definition of broadband. Households want the ability to stream multiple videos simultaneously. And you can expect in just a few years for there to be a much greater demand for HD quality video and a climbing demand for 4K video. The average urban household that has choice is already finding 10 Mbps to be far too slow. Just this week Verizon increased its minimum FiOS speeds to a symmetrical 35 Mbps. I know this is a really big technological expectation for CenturyLink and other rural telcos still using copper, but the definition of broadband needs to keep pace with what the normal household wants to buy, and that number is going to keep climbing year after year. If we don’t set the bar high then rural places are going to fall further behind the speeds available in cities.
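A simple sizing exercise shows why 10 Mbps falls short for a streaming household. The per-stream bitrates below are typical streaming-service recommendations, used here as assumptions:

```python
# Rough household bandwidth sizing. The per-stream bitrates (3 Mbps SD,
# 5 Mbps HD, 25 Mbps 4K) are typical published streaming recommendations,
# used here as illustrative assumptions.
STREAM_MBPS = {"sd": 3, "hd": 5, "4k": 25}

def household_demand(streams):
    """streams: list of quality labels; returns total Mbps required."""
    return sum(STREAM_MBPS[s] for s in streams)

# Two HD streams plus one 4K stream already exceeds a 10 Mbps connection
# several times over, and lands right around Verizon's new 35 Mbps floor.
print(household_demand(["hd", "hd", "4k"]))
```

And this counts only video: add web browsing, gaming, backups and software updates happening at the same time and the gap between 10 Mbps and real household demand only widens.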

CenturyLink does expect to continue to expand the footprint of its Prism TV product. This is a bonded-pair DSL product that can deliver up to 50 Mbps for customers somewhat close to a central office. CenturyLink has made this available to over 2 million customers and plans to make it available to 300,000 more in 2015.

Interestingly, CenturyLink does not plan to expand WiFi hotspots. Some other carriers seem to be in a race to put in hotspots, but CenturyLink cannot see a way to monetize the effort. Of course, CenturyLink will put a hotspot in for a business that asks for one, but they don’t intend to build hotspots of their own. I have also written about this topic several times. Nobody who isn’t serving a captive audience, such as at an airport or on an airplane, has been able to get much money from selling Internet by the hour. And while the giant cellular carriers benefit greatly from more WiFi, I have yet to hear of a deal where they are paying somebody to install public hotspots. Comcast says they have installed hundreds of thousands of hotspots, and they recently announced that they are turning the wireless modems of all of their residential customers into hotspots. But to me that seems more like a move that is going to antagonize the public greatly with little obvious monetary benefit. I think CenturyLink is being prudent to stay away from this business until somebody shows a way to make money with it.

Cracking Down on USF Fraud

Last week the FCC announced what it is calling a USF Strike Force that will be “dedicated to combating waste, fraud and abuse” in the Universal Service Fund (USF). The Strike Force will be headed up by Loyaan Egal, formerly a senior assistant US attorney in the fraud and public corruption section of the US Attorney’s Office for the District of Columbia.

They are not just talking about doing audits and getting refunds of improper charges to the USF. They are going to partner with the FCC’s Office of the Inspector General and the US Department of Justice to prosecute any unlawful conduct they find.

It is, of course, always a good idea for any federal program to make sure that the funds it hands out are going to rightful recipients. But that is not the only motivation in this instance. The FCC recently announced that it is going to dedicate an additional $2 billion over two years towards expanding broadband into schools. This is not new funding; instead they want to find this money within the existing USF. Part of that funding is going to come from redefining what schools are eligible to receive out of the Schools and Libraries Fund. For example, some schools today get reimbursed for phone lines and pagers, and that is expected to go away. But the FCC hopes to find the rest of the money by aggressively seeking out and ending abuse in the USF.

To put this into perspective, let’s look at the size of the USF in 2013. Last year the fund collected over $8.3 billion. Those funds were used as follows:

  • $4.17 billion – Rural High Cost Program
  • $1.80 billion – Lifeline Program
  • $2.20 billion – Schools and Libraries Program
  • $0.09 billion – Rural Health Care Program
  • $0.07 billion – Pilot Program for Health Care
  • $0.11 billion – Administrative Expense

The FCC probably needs to find $500 million per year in cost savings to fund its school initiative. We won’t know the exact number until they put out more detail on the amended reimbursement rules for the Schools and Libraries Program. But in order to find $500 million the investigators will have to determine that 6% of the monies being paid out of the USF today are fraudulent. That is a tall order. Do they really think there is that much fraud in the program, and if so, why haven’t they acted before now?
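The arithmetic behind that 6% figure is straightforward (the $500 million target is the estimate discussed above; the fund size is the 2013 collection figure):

```python
# Share of the USF that would have to be found fraudulent to fund the
# schools initiative: the $500M/year savings target divided by the
# $8.3B the fund collected in 2013.
fund_total = 8.3e9       # 2013 USF collections
savings_needed = 0.5e9   # roughly $500 million per year
share = savings_needed / fund_total
print(f"{share:.1%} of all USF payouts")
```

Six cents of fraud on every dollar paid out would make the USF one of the leakiest programs in the federal government, which is why the target seems so ambitious.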

There have been some recent headlines about fraud in the Lifeline Program. This program pays up to $9.25 per month towards a phone for low income customers. Just recently the FCC and FBI announced that three Florida men had defrauded the Lifeline program for $32 million by charging the fund for imaginary customers. One would have to assume that is a multi-year number and it will take finding a slew of additional examples of fraud to uncover $500 million per year.

Anybody who receives USF ought to be nervous about this investigation, even if they are following the rules to a tee. One can expect audits and questionnaires asking recipients to prove their eligibility for USF funds. Over the years I have reviewed the USF filings of numerous firms that draw from the USF, and it is not unusual to find errors in their filings, due mostly to not understanding the filing rules. The rules for USF eligibility are complex, particularly in the High Cost Fund, and companies routinely hand this work off to outside consultants. I would be at least a little nervous that an error in my filing could be interpreted as fraud.

I also have one last word of advice to the FCC. Take a look at the $110 million a year it costs to administer these funds. While that is only 1.3% of what is collected, it just seems like this function could be done for a lot less than $110 million per year.

Something New

I often report on new scientific breakthroughs, but today’s blog is about a few new technological applications that will benefit our industry. I rarely go a day without seeing some new innovation, and I thought the following were the most interesting ones I’ve seen lately.

First, Apple is working on a fuel cell for portable devices. This would be a big breakthrough for laptops, tablets and other sizable portable devices because fuel cells have some distinct advantages over the batteries we use today. For one thing, the fuel-generating chemicals in a fuel cell can be refilled, so the battery could theoretically be kept going for a long time.

This is important because laptop batteries today are considered toxic waste. Fuel cells would cut down on pollution by eliminating some of the nastier metals used in today’s batteries. By lasting longer they would also cut down on the huge number of batteries we burn through as a society. For instance, Apple could develop a universal tablet battery that you would keep as you move through newer generations of devices. Our devices today are not particularly green, and disposing of them is creating a challenge for landfills and groundwater, mostly due to the batteries. Small fuel cells would be a big step towards greener devices.

Next, the Korean manufacturer LG has announced a TV screen that can be rolled up like a poster. There have been flexible TVs around for a few years, but LG has made a screen that can be rolled into a 1.5-inch tube. This will drastically change the supply chain for TVs. Today they are expensive to ship and store due to their large size, and they often get damaged in transit. But a TV this flexible can be sent by UPS in a tube.

The LG technology can also produce transparent TVs. This opens up a whole new world of applications for TVs and monitors. For instance, you could put a TV on the bathroom mirror to watch while you brush your teeth. They could go on any window or wall and would disappear when not being used as TVs. I remember reading science fiction books many years ago that predicted there would be TVs everywhere in the future, and with this technology that might finally be possible. These screens also advance the trend of separating the TV electronics from the screens. We will be able to put screens anywhere, all controlled by the same centralized smart box.

LG says they will be able to make a transparent 60-inch flexible TV capable of 4K quality by 2017. But the promise of this technology is not just giant TVs, but also little TV screens that can be put anywhere – in the bathroom, kitchen, shop, garage – wherever somebody wants to watch a screen. The biggest outcome of cheap TVs everywhere would be an explosion in the demand for bandwidth. It’s not hard to picture households wanting to have ten football games on at the same time on Sundays.
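To put a rough number on that bandwidth explosion, here is a quick sketch; the per-stream bitrates are my own illustrative assumptions, not figures from LG or anyone else:

```python
# Rough downstream demand for a household with screens everywhere.
# Per-stream bitrates (Mbps) are illustrative assumptions.
BITRATE_MBPS = {"SD": 2, "HD": 5, "4K": 15}

def household_demand_mbps(streams):
    """Total downstream Mbps for a dict of {quality: stream count}."""
    return sum(BITRATE_MBPS[quality] * count
               for quality, count in streams.items())

# Ten football games at once on a Sunday: say eight HD and two 4K.
print(household_demand_mbps({"HD": 8, "4K": 2}))  # 70 Mbps
```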

Google has announced a new feature for Android that allows devices in proximity to each other to connect automatically. They are calling this technology Nearby. Any device using this technology would seek out other nearby devices and connect to enable communication. This has a lot of applications. For example, when friends or family meet, their phones could automatically sync up and update calendars or whatever else they want to share. This technology might be the platform that lets stores contact shoppers as they pass through the store to offer specials or point out items of interest. And for the Internet of Things this is a handy way to make the smartphone the controller of other devices. Whenever you walk into a room in your house your phone would instantly be talking to all of the Nearby devices there.

Nearby would do this by automatically turning on Bluetooth, WiFi and microphones as needed. There are some privacy concerns about this capability, and certainly there will be apps that let each user set the degree to which they are willing to be open to others, and control who is able to connect to them. But Google is counting on most people wanting an interactive shopping experience, and that is probably what they see as the biggest commercial application of the technology. Google has been looking for a way to compete with Amazon in the shopping arena, and this might be that platform. Where Amazon dominates the online shopping experience, Google could come to dominate the in-store shopping experience.


What’s the Truth About Netflix?

Clearly a lot of customers around the country are having trouble with Netflix. The latest round of finger pointing is going on between Verizon, Netflix and some intermediate transport providers.

Netflix uses adaptive streaming for its standard-quality video, and this only requires about 2 Mbps at the customer end to get the quality that Netflix intends for the video. HD video requires more bandwidth, but customers are complaining about standard video. A Netflix download requires a burst of data up front so that the movie can load ahead of the viewer, but after that it stays steady at the 2 Mbps rate, and the download even pauses when the customer pauses. It’s getting hard to find an urban ISP that doesn’t deliver at least that much speed, so one would assume that any customer who subscribes to at least 2 Mbps data speeds should not be having trouble watching Netflix.
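The download pattern described above can be sketched as a simple buffer model; the burst multiplier and buffer target below are illustrative assumptions, not Netflix’s actual parameters:

```python
# A streaming client bursts to fill a play-ahead buffer, then settles
# to a rate that just matches the video bitrate. Numbers are assumed.
VIDEO_BITRATE_MBPS = 2.0   # standard-quality stream
BUFFER_TARGET_SEC = 30.0   # seconds of video held ahead of playback

def requested_rate_mbps(buffered_sec, paused=False):
    """Downstream rate the client asks for at a given buffer level."""
    if paused and buffered_sec >= BUFFER_TARGET_SEC:
        return 0.0                     # download pauses with the viewer
    if buffered_sec < BUFFER_TARGET_SEC:
        return VIDEO_BITRATE_MBPS * 2  # burst phase: fill the buffer
    return VIDEO_BITRATE_MBPS          # steady state: match playback

print(requested_rate_mbps(5.0))                # 4.0 (initial burst)
print(requested_rate_mbps(30.0))               # 2.0 (steady state)
print(requested_rate_mbps(30.0, paused=True))  # 0.0 (viewer paused)
```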

But they are. On their blog Verizon talks about a customer who has a 75 Mbps product and who was not getting good Netflix quality. In that post Verizon says it checked every bit of its own network for possible choke points and found none. For those not familiar with how networks operate, a choke point is any place in a network where incoming traffic can exceed the capacity available to carry it onward. In most networks there are generally several potential choke points between a customer and the outside world. Verizon swears that there is nothing in its network for this particular customer that would cause the slowdown. They claim that the only thing running slow is Netflix.
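The choke point idea is easy to illustrate: end-to-end throughput for a flow is capped by the lowest-capacity hop along the path. The path and capacities below are made-up numbers, not anything from Verizon:

```python
# A single flow can only move as fast as the weakest hop on its path.
def bottleneck_mbps(link_capacities_mbps):
    """Maximum rate a flow can sustain across the whole path."""
    return min(link_capacities_mbps)

# Customer drop -> access network -> regional backbone -> peering port
path = [75, 1000, 10000, 1.5]   # an undersized peering port, in Mbps
print(bottleneck_mbps(path))    # 1.5: enough to ruin a 2 Mbps stream
```

Note that the customer’s 75 Mbps product is irrelevant here: the congested peering port at the far end still drags the stream below the 2 Mbps Netflix needs.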

This is not to say that there are no overloaded chokepoints anywhere in Verizon networks. It’s a big company and with the growth of demand for data they are bound to have choke points pop up – every network does. But one would think that their fiber FiOS network would have few chokepoints and so it’s fairly easy to believe Verizon in this instance.

Verizon goes on to say that the problem with this Los Angeles customer is either Netflix or the transit providers who are carrying Netflix traffic to Verizon. Verizon is not the only one who thinks it’s the transit interface between the networks. Here is a long article from Peter Sevcik of NetForecast Inc. that shows what happened to the Netflix traffic at numerous carriers both before and after Netflix started peering directly with Comcast. This data shows that traffic got better for everybody else immediately upon the Comcast transition, which certainly indicates that the problem is somewhere in the transit between Netflix and the ISPs.

Verizon says the problem is that Netflix or the intermediate carriers don’t want to buy enough bandwidth to eliminate choke points. Sounds like a reasonable explanation for the troubles, right? But then Dave Schaeffer, the CEO of Cogent, came forward and pointed the finger back at Verizon. He says that the problem is indeed in the interface between Cogent and Verizon. But Schaeffer claims this is Verizon’s fault, since they won’t turn up additional ports to relieve the traffic pressure.

So now we are back to square one. The problem is clearly in the interface between Verizon and carriers like Cogent. But they are blaming each other publicly. And none of us outside of this squabble are going to know the truth. Very likely this is a tug-of-war over money, and that would fall in line with complaints made by Level3, who says that Verizon is holding traffic hostage to extract more money from the transit carriers.

The FCC is looking into this and it will be interesting to see what they find. It wouldn’t be surprising if there is a little blame on both sides, which is often the case when network issues devolve into money issues. Carriers don’t always act altruistically and sometimes these kinds of fights almost seem personal at the higher levels of the respective companies. The shame from a network perspective is that a handful of good technicians could solve this problem in a few hours. But in this case even the technicians at Verizon and the transit carriers might not know the truth about the situation.