Do We Need International Digital Laws?

German Chancellor Angela Merkel said a few weeks ago that the world needs international regulation of digital technology, much like we have international regulation for financial markets and banking.

She says that without some kind of regulation, isolated ‘islands’ of bad digital actors can emerge that are a threat to the rest of the world. I am sure her analogy is a reference to the handful of islands around the globe that play the same maverick role in the banking arena.

We now live in a world where a relatively small number of hackers can cause incredible harm. For instance, while never definitively proven, it appears that North Korean hackers pulled off the major hack of Sony a few years ago. There are accusations across western democracies that the Russians have been using hacking to interfere with elections.

Merkel certainly has a valid point. Small ‘islands’ of hackers are one of the major threats we face in the world today. They can cause incredible economic harm. They threaten basic infrastructure like electric grids. They make it risky for anybody to be on the Internet at a time when broadband access is becoming an integral part of the lives of billions.

There currently aren’t international laws aimed at fighting the nefarious practices of bad hackers or at punishing them for their crimes. Merkel wasn’t specific about possible remedies. She said that the US and Germany have undertaken discussions on the topic but that they haven’t gone very far. There are certainly a few things that would make sense at the international level:

  • Make certain kinds of hacking an international crime so that hacker criminals can more easily be pursued across borders.
  • Create a forum for governments to better coordinate monitoring hackers and devising solutions for blocking or stopping them.
  • Make laws to bring cryptocurrency under the same international auspices as other currencies.

But as somebody who follows US telecom regulation in this blog, I wonder how fruitful such regulations might be. We now live in a world where hackers always seem to be one step ahead of the security industry that works to block them. The cat-and-mouse game between hackers and security professionals is constantly changing, and I have to wonder how any set of rules could be nimble enough to make any difference.

This does not mean that we shouldn’t have an international effort to fight against the bad actors – but I wonder if that cooperation might best be technical cooperation rather than the creation of regulations that might easily be out of date by the time they are signed into law.

Any attempt to create security regulations also has to wrestle with the fact that a lot of what we think of as hacking is probably government-sponsored cyberwarfare. How do we tell the difference between cyber-criminals and cyber-warriors? In a murky world where it’s always going to be hard to know who specifically wrote a given piece of code, I wonder how we can tell the criminal bad guys from the government bad guys.

I also see a dilemma in that any agreed-upon international laws must, by definition, filter back into US laws. We now have an FCC that is trying to rid itself of having to regulate broadband. Assuming that Title II regulation will be reversed, I have to wonder whether the FCC could even require ISPs to comply with any international laws at a time when there might not be many US laws that can be enforced on them.

It makes sense to me that there ought to be international cooperation in identifying and stopping criminal hackers and others who would harm the web. But I don’t know that there has ever been an issue where the governments of the world engage in many of the same practices as the bad actors – and that makes me wonder if there can ever be any real cooperation between governments to police or control bad practices on the web.

The Fastest ISPs

PC Magazine has been rating ISPs in terms of speed for a number of years. They develop their rankings based upon speed tests taken at their own speed test site, and about 124,000 speed tests went into this year’s rankings. The score for each ISP is a composite number based 80% on download speed and 20% on upload speed. To be included in the rankings an ISP needed to have 100 or more customers take the speed test.
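
For readers who want to see the arithmetic, below is a minimal sketch of a composite index with that 80/20 weighting. The function name and the sample speeds are mine, purely for illustration – PC Magazine doesn’t publish code and may aggregate or normalize individual tests differently before applying the weights.

```python
# A minimal sketch of an 80/20 composite speed index, assuming the
# weighting described above. Example speeds are made up for illustration.
def composite_score(download_mbps: float, upload_mbps: float) -> float:
    """Weight download speed at 80% and upload speed at 20%."""
    return 0.8 * download_mbps + 0.2 * upload_mbps

# A 60/5 Mbps cable connection versus a symmetrical 100 Mbps fiber connection.
print(composite_score(60.0, 5.0))     # 49.0
print(composite_score(100.0, 100.0))  # 100.0
```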

You always have to take these kinds of rankings with a grain of salt for several reasons. For example, a speed test doesn’t measure just the ISP but also the customer’s own setup. The time of day can affect a speed test, but the type of connection probably affects it the most. We know that a lot of people these days are using out-of-date or poorly located WiFi routers that lower the speeds seen at their computers.

Measured speeds also vary between the different speed tests. In writing this blog I took four different speed tests just to see how they compare. The one at the PC Magazine site showed my speeds at 27.5 Mbps down / 5.8 Mbps up. Ookla showed 47.9 Mbps down / 5.8 Mbps up. The Speakeasy speed test showed 17.6 Mbps down / 5.8 Mbps up. Finally, the test from Charter Spectrum, my ISP, showed 31.8 Mbps down / 5.9 Mbps up. That’s a pretty startling range of speeds measured just minutes apart – and it demonstrates why speed test results are not a great measure of actual speeds. I look at these results and have no idea what speed I am actually receiving. With that said, one would hope that any given speed test is at least somewhat consistent in measuring the differences between ISPs.
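
To put a number on that variability, here is a quick sketch that summarizes the four download readings quoted above. The figures come straight from the paragraph; nothing else is assumed.

```python
# The four download readings above, taken minutes apart on one connection.
downloads_mbps = {
    "PC Magazine": 27.5,
    "Ookla": 47.9,
    "Speakeasy": 17.6,
    "Charter Spectrum": 31.8,
}

values = sorted(downloads_mbps.values())
median = (values[1] + values[2]) / 2          # middle two of four readings
spread = max(values) / min(values)

print(f"min {min(values)} Mbps, max {max(values)} Mbps, median {median:.1f} Mbps")
print(f"the fastest reading is {spread:.1f}x the slowest")   # about 2.7x
```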

The results of the speed test ‘contest’ are broken into different categories of ISPs. For years the winner of the annual speed test among the large incumbents has been Verizon FiOS, but in this year’s test it fell to third in its group. Leading that category now, with a score of 91.3, is Hotwire Communications, which largely provides broadband to multi-tenant buildings. Second was Suddenlink at 49.1, with Verizon, Comcast and Cox close behind. The lowest score in the top 10 was Wow! at 26.7.

Another interesting category is the competitive overbuilders and ISPs. This group is led by Google Fiber with a score of 324.5. EPB Communications, the municipal network in Chattanooga, is second at 136.1. Also in the top 10 are companies like Grande Communications, Sonic.net, RCN, and Comporium.

PC Magazine also ranks ISPs by region, and it’s interesting to see how the speeds for a company like Comcast vary in different parts of the country.

Results are also ranked by state, and I find some of the numbers on this list startling. For instance, Texas tops the list with a score of 100.3, followed by South Dakota at 80.3 and Vermont at 70.6. If anything this goes to show that the rankings are not any kind of actual random sample – it’s impossible to think that this represents the true composite speeds of all of the people living in those states. The results of this contest also differ from those shown by others like Ookla, which looks at millions of actual connection speeds at Internet POPs. Consider Texas. Certainly there are fast broadband speeds in Austin, where Google Fiber has pushed all of the competitors to pick up their game. There are rural parts of the state with fiber networks built by telcos and cooperatives. But a lot of the state looks much like anywhere else, with a lot of people on DSL or using something less than the top speeds from the cable companies.

But there is one thing this type of study shows very well: over the years the cable companies have gotten significantly faster. Verizon FiOS used to be far faster than the cable companies and now sits in the middle of the pack with many of them.

This test is clearly not a statistically valid sample. And as I showed above with my own results from various speed tests, the results are probably not even very accurate. But ISPs care about these kinds of tests because being near the top of one of the charts gives them bragging rights. And, regardless of the flaws, one would think the shortcomings of this particular test are similar across the board, which means it does provide a decent comparison between ISPs. That is further validated by the fact that the results of this exercise are pretty consistent from year to year.

Industry Shorts, June 2017

Following are some topics I found of interest but which don’t justify a whole blog.

Amazon Bringing Alexa to Settop Boxes. Amazon has created a developer kit that allows any settop box maker to integrate its voice service Alexa. The Alexa voice platform currently powers the popular Echo home assistant. It’s also being integrated into some new vehicles, and Amazon has made it available for integration into a whole range of home automation devices. The Alexa platform is currently ahead of the competitors at Apple, Google and Microsoft, mostly because Amazon opened the product to developers, who have already created over 10,000 applications that work on the platform. Adding Alexa to a settop box could make it a lot easier to use the settop box as the hub for a smart home.

Comcast Tried to Shut Down anti-Comcast Website. LookingGlass Cyber Security Center, a vendor for Comcast, sent a cease-and-desist letter to the advocacy group Fight for the Future, which operates a website called comcastroturf.com. The advocacy group claims that Comcast has used bots to generate over half a million fake filings to the FCC in the network neutrality docket. These comments were all in favor of killing net neutrality, and the group claims that Comcast used real people’s names to sign the filings without their permission. The website allows people to see if their name has been used. The cease-and-desist demand was withdrawn after news of it got a lot of coverage on social media.

Net Neutrality Wins in Court. Not that it probably makes much difference now that the FCC is trying to undo Title II regulation, but the challenge filed by Verizon and other large ISPs against the FCC’s net neutrality decision was rejected at appeal. This affirms the ability of the FCC to use Title II rules for regulating broadband. The full U.S. Court of Appeals for the D.C. Circuit upheld an earlier court ruling that affirmed the FCC had the needed authority to implement the net neutrality decision.

Altice Buys Ad-Tech Company. Altice joins the other big ISPs that want to take advantage of the repeal of the new FCC privacy rules, which frees ISPs to monetize customers’ private data. Altice, now the fourth largest US cable company after its acquisition of Cablevision, has added the expertise to slice and dice customer data by paying $300 million for Teads, a company specializing in targeted advertising based upon customer-specific data.

Other large ISPs are already poised to take advantage of the new opportunity. For example, Verizon’s purchases of AOL and Yahoo bring this same expertise in-house. It has been widely speculated that the ISPs have been gathering customer data for many years and are sitting on a huge treasure trove detailing customers’ web browsing, on-line purchasing habits, email and text information and, for the wireless ISPs, the location data of cellphones.

Charter Rejects $100 Billion offer from Verizon. The New York Post reported that Charter rejected a purchase offer from Verizon. The Post reports that Charter thought the offer wasn’t high enough, and it also came with some tax implications that would complicate the deal. Whether this particular offer is real or not, it points to the continuing consolidation among the industry’s ISPs, cable providers and cellular companies. The current administration is reportedly not against large mergers, so there’s no telling what other megadeals we might see over the next few years.

Top 7 Media CEOs made $343.8 Million in 2016. The CEOs of CBS, Comcast, Discovery Communications, Disney, Fox, Time Warner and Viacom collectively made a record salary last year, up 21.1% from 2015. It’s interesting that at a time when the viewership of specific cable networks is dropping rapidly the industry would be rewarding its leaders so handsomely. But all of these companies are compensating for customer losses with continuing rate hikes for programming and most are having banner earnings.

Frontier Lays Off WV Senate President. Frontier just laid off Mitch Carmichael, the President of the Senate in West Virginia. This occurred right after the Senate passed a broadband infrastructure bill that was aimed at bringing more broadband competition to the state. The bill allows individuals or communities to create broadband cooperatives to build broadband infrastructure in areas with poor broadband coverage. Frontier is the predominant ISP in the state after its purchase of the Verizon property there. The West Virginia legislature is a part-time job that pays $20,000 per year and most legislators hold other jobs. West Virginia is at or near the bottom in most statistics concerning broadband speeds and customer penetration rates.

FCC to Investigate MDU Broadband

The FCC is launching an investigation into anticompetitive practices that are keeping broadband from coming to apartments and other multi-tenant buildings. They have issued a Notice of Inquiry in GN Docket 17-142 looking into the topic and are expected later this month to formally release it to the public. The docket specifically looks at barriers to competition in what the FCC is calling MTEs – multiple tenant environments – which include apartments, condominiums, shopping malls and cooperatively owned buildings.

This is not the first time that the FCC has tackled the topic. Back in 2008 the commission banned some contractual arrangements that gave incumbent ISPs unfair advantage over competitors. However, that order didn’t go far enough, and ISPs basically shifted to arrangements that were not banned by the FCC. The FCC is looking into the topic because it’s become obvious that exclusive arrangements are harming the introduction of faster broadband into a sizable portion of the market. There are cities where half or more of residents live in apartments and don’t have the same competitive choices as those living in single family homes.

The FCC has an interesting legal challenge in looking at this issue. This docket specifically looks at the potential for regulating broadband access in MTEs, something that the FCC has the authority to do under Title II regulation. But assuming that the FCC moves forward this year with plans to scrap Title II regulation they might also be eliminating their authority to regulate MTEs in the manner suggested by the NOI. If they decide to act on the issue it will be interesting to see how they define their authority to regulate anything that is broadband related. That might be our first glimpse at what a regulatory regime without Title II looks like.

Further, Chairman Ajit Pai has shown a strong preference for lightening the regulations on ISPs, and you have to wonder if he is willing to really tackle a new set of regulations. But he faces the dilemma confronted by all regulators: sometimes the market will not automatically produce results that are beneficial to society, and sometimes regulation is the only way to get corporations and others to behave in a way that benefits everybody. It’s clear that residents in MTEs have little or no competition and choice, and new regulations might be the only way to get it for them.

The NOI looks at specific issues related to MTE broadband competition:

  • It asks if the FCC should consider overriding state and local regulations that inhibit the deployment of broadband in MTEs. Some jurisdictions have franchising and other rules that make it hard for a smaller competitor to try to serve only MTEs or parts of markets.
  • It asks if the FCC should prohibit exclusive marketing and bulk billing arrangements by ISPs.
  • It asks if the FCC should prohibit revenue sharing and exclusive wiring arrangements with ISPs.
  • It asks if there are other kinds of non-contractual practices that should be prohibited or regulated.

The NOI is interesting in that it tackles all of the topics that the FCC left untouched in 2008. When that order came out I remember thinking about all of the loopholes the FCC had left available to ISPs that wanted to maintain an exclusive arrangement with apartment owners. For example, bulk billing arrangements are where a landlord buys wholesale connections from an ISP and then includes broadband or cable TV as part of the rent, at a mark-up. Landlords under such arrangements are unlikely to allow in another competitor since they are profiting from the exclusive arrangement. The FCC at the time didn’t feel ready to tackle the issues associated with regulating landlord behavior.

The NOI asks for comments on the non-contractual issues that inhibit competition. I’ve seen many such practices in the marketplace. For instance, a landlord may tell tenants that they are pro-competition and allow access to multiple ISPs, but then charge exorbitant fees to ISPs for getting access to the building, collocating electronics or running wiring. I can think of dozens of different roadblocks I’ve seen that effectively keep out competitors.

I am heartened a bit by this docket in that it’s the first thing this new FCC has done to solve a problem. Most of the work they’ve done so far has been to dismantle old rules to reduce regulation. There is nothing wrong with that in general, and I have my own long shopping list of regulations that are out of date or unnecessary. But there are industry issues like this one where regulation is the only way to provide a needed fix. It’s clear that the large ISPs and many landlords have no interest in bringing competition to their buildings. And if that is a goal the FCC wants to foster, then they are going to have to create the necessary regulations to make it happen – even if they would prefer not to regulate.

Means Testing for FCC Funding – Part II

Yesterday I wrote about the recent blog by FCC Commissioners Michael O’Rielly and Mignon Clyburn that suggests that there ought to be a means test for anybody accepting Universal Service Funds. Yesterday I looked at the idea of using reverse auctions for allocating funds – an idea that I think would only serve to shift broadband funds to slower technologies, most likely rural cellular service for broadband. Today I want to look at two other ideas suggested by the blog.

The blog suggests that rural customers ought to pay more for broadband since it costs more to provide broadband in sparsely populated areas. I think the FCC might want to do a little research and look at the actual prices charged today for broadband where commercial companies have built rural broadband networks. It’s something I look at all of the time and all over the country, and from what I can see the small telcos, cooperatives, WISPs and others that serve rural America today already charge more than what households pay for broadband in urban areas – sometimes considerably more. I am sure there are exceptions to this and perhaps the Commissioners have seen some low rural pricing from some providers. But I’ve looked at the prices of hundreds of rural ISPs and have never seen prices below urban rates.

The small rural ISPs have to make a commercial go of their broadband networks and they’ve realized for years that the only way to do that is to charge more. In most urban areas there is a decent broadband option starting around $40 per month, and you rarely see a price close to that in rural America. If you do see a low price in rural America it probably buys a very slow speed of perhaps a few Mbps, which certainly doesn’t compare to the 60 Mbps I get from Charter for $44.95 per month.

The issue of rural pricing does raise one policy issue. Historically the Universal Service Fund was used for precisely what this blog seems not to like – to hold telephone rates down in rural America so that everybody in the country could afford to be connected. That policy gave the country telephone penetration rates north of 98% for decades. I’m not advocating that USF funds ought to be used to directly hold down rural broadband rates, but it’s worth a pause to remember that this was the original purpose of the Universal Service Fund, and it worked incredibly well.

The second idea raised by the blog is that Universal Service Funds ought not be used to build broadband for wealthy customers. They suggest that perhaps federal funding ought not to be used to bring broadband to “very rich people who happen to live in the more rural portions of our nation.” The blog worries that poor urban people will be subsidizing ‘some of the wealthiest communities in America.’ I am sure that in making that statement the Commissioners have a few real-life communities in mind. But I work all over the country and there are not very many pockets of millionaires in rural America, except perhaps for farmers.

Farmers are an interesting case when it comes to broadband. By definition farmers are rural. But US agriculture is the largest industry in the country and the modern farmer needs broadband to be effective. We are headed soon towards a time when farm yields can increase dramatically by use of IoT sensors, farm robots and other high technology that is going to require broadband. I know that a lot of the rural communities that are clamoring for broadband are farming communities – because those farms are the economic engine that drives numerous counties and regions of the country. I don’t think it’s unreasonable if we are going to rethink policy to talk about bringing broadband to our largest industry.

The FCC blog suggests that perhaps wealthier individuals ought to pay for the cost of getting connected to a broadband network. It’s certainly an interesting idea, and there is precedent. Rural electric companies have always charged the cost of construction to connect customers that live too far from their grid. But with that said we also have to remember that rural electric grids were purposefully built to reach as many people as possible, often with the help of federal funding.

This idea isn’t practical for two reasons. First, it’s already incredibly hard today to finance a fiber network, and I picture the practical problem of somehow trying to get commitments from farmers or other wealthy individuals as part of the process of funding and building one. As somebody who focuses mostly on financing fiber networks, I think this would largely kill the funding of new networks. Getting the primary borrower and all of the ‘rich’ people coordinated in order to close a major financing would drive most lenders away – it’s too complicated to be practically applied. The FCC might want to consult with a few bankers before pushing this idea too far.

But there is a more fundamental issue, and the FCC blog touches upon it. I’m trying to imagine the FCC adopting a rule that would require people to disclose their income to some commercial company that wants to build a fiber network. I’m not a lawyer, but that sounds like it would bump up against all sorts of constitutional issues, let alone practical ones. For example, can you really picture having to report your income to AT&T? And then I go back to the farmers again. Farmers don’t make a steady income – they have boom years and bust years. Would we put them on or off the hook for contributing towards a fiber network based upon their most recent year of income?

I certainly applaud the Commissioners for thinking outside the box, and that is a good thing when it leads to discussions of ways to improve the funding process. I will be the first to tell you that the current USF distributions are not always sensible and equitable and there is always room for improvement. Some of the ideas suggested by the blog have been discussed in the past and it never hurts to revisit ideas. But what most amazes me about the suggestions made by this blog is that the proposed solutions would require a heavy regulatory hand – and this FCC, or at least its new Chairman, has the goal of reducing regulation. To impose a means test or income test would go in the opposite direction and would require a new layer of intrusive regulations.

Means Testing for FCC Funding – Part I

A recent blog by FCC Commissioners Michael O’Rielly and Mignon Clyburn asks if there should be a means test in federal high cost programs. This blog is something every telco, school, library or health care provider that gets any form of Universal Service funding needs to read.

There is already some means testing in the Universal Service Fund. For instance, the Lifeline program brings subsidized voice and broadband only to households that meet certain poverty tests. And the Schools and Libraries program uses a means test to make certain that subsidies go to schools with the most low-income students. The FCC blog talks about now applying a means test to the Universal Service Funds that are used to promote rural broadband. There are several of these programs, the biggest being the CAF II funding for large telcos and the ACAM program that helps small telcos expand rural broadband networks.

The blog brings up the latest buzzword at the FCC, which is the reverse auction. The FCC embraces the concept that there should be a competition for the federal money used to expand broadband networks, with the funding going to the carrier that is willing to accept the lowest amount of subsidy to expand broadband into an area. On the surface that sounds like a reasonable suggestion in that it would give money to the company that is the most efficient.

But in real-life practice reverse auctions don’t work, at least for building rural broadband networks. Today these FCC infrastructure programs are aimed at bringing broadband to places that don’t have it. And the reason they don’t have it is that the areas are largely rural and sparsely populated, meaning it’s costly to build broadband infrastructure there. In most of these places nobody is willing to build without a significant government subsidy because there is no reasonable business plan using commercial financing.

If there was a reverse auction between two companies willing to bring fiber to a given rural area, then in my experience there wouldn’t be much difference between them in terms of the cost to build the network. They have to deploy the same technology over the same roads to reach the same customers. One might be slightly lower in cost, but not enough to justify going through the reverse auction process.

And that is the big gotcha with the preference for reverse auctions. A reverse auction will always favor somebody using a cheaper technology. And in rural broadband, a cheaper technology means an inferior technology. It means using federal funding to expand DSL or cellular wireless as is being done with big telco CAF II money instead of building fiber, as is being done by the small telcos accepting ACAM money.

Whether intentional or not, the FCC’s penchant for favoring reverse auctions would shift money from fiber projects – mostly being done by small telcos – to the wireless carriers. It’s clear that building cellular technology in rural areas is far cheaper than building fiber. But to use federal money to build inferior technology means relegating rural areas to dreadfully inadequate broadband for decades to come.
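
To make the mechanics concrete, below is a hedged sketch of the selection rule being described: whoever asks for the least subsidy wins, with no weight given to the technology being offered. The carriers, speeds and dollar amounts are hypothetical, invented only to illustrate the point.

```python
# A sketch of a pure lowest-bid reverse auction, using made-up bids.
bids = [
    {"carrier": "Rural telco", "technology": "fiber-to-the-home",
     "subsidy_requested": 4_000_000, "typical_speed_mbps": 1000},
    {"carrier": "Cellular carrier", "technology": "fixed wireless",
     "subsidy_requested": 1_500_000, "typical_speed_mbps": 20},
]

# The auction looks only at the dollar figure, not at what is being built.
winner = min(bids, key=lambda bid: bid["subsidy_requested"])
print(f"Winner: {winner['carrier']} ({winner['technology']}), "
      f"{winner['typical_speed_mbps']} Mbps for ${winner['subsidy_requested']:,}")
```

With a rule like this the cheaper technology wins every time, which is exactly the concern raised above.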

Forget all of the hype about how 5G cellular is going to bring amazing broadband speeds – and I hope the FCC Commissioners have not bought into the cellular companies’ press releases. In rural areas fast 5G requires bringing fiber very close to customers – and that means constructing nearly the same fiber networks needed to bring fiber into homes. The big cellular companies are no more going to invest in rural 5G than the big telcos have ever invested in rural fiber. So a reverse auction would divert federal funds to Verizon and AT&T to extend traditional cellular networks, not to build super-fast wireless networks.

We already know what it looks like to expand rural cellular broadband. It means building networks that deliver perhaps 20 Mbps to those living close to cell towers and something slower as you move away from the towers. That is exactly what AT&T is building with its CAF II funding today. AT&T is taking $426 million per year for six years, or $2.5 billion in total, to expand cellular broadband in rural areas. As I’ve said many times in the past this is perhaps the worst use of federal telecom funding I have ever seen. Customers on these cellular networks are getting broadband on day one that is too slow and that doesn’t even meet the FCC’s current definition of broadband. And in the future these customers and rural communities are going to be light-years behind the rest of the country as household demand for broadband continues to grow at a torrid pace while they are stuck with an inadequate technology.

The FCC blog also mentions the concept of possibly redirecting future USF payments, and if I were a small telco that would scare me to death. It sounds like the FCC may consider redirecting the already-committed ACAM funding. Numerous small telcos just accepted a 10-year commitment to receive ACAM funding from the USF to expand broadband in rural areas, and many are already borrowing matching funds from banks based upon that commitment. Should that funding be redirected into a reverse auction, these small companies would not be able to complete their planned expansions, and if they have already borrowed money based upon the promise of ACAM funding they could find themselves in deep financial trouble.

New Technologies, June 2017

Following are some interesting new technologies I’ve run across recently.

WiFi Imaging. Cognitive Systems has a product called Aura that can detect motion inside a home using WiFi. The underlying technique, called Radio Frequency (RF) Capture, was developed a few years ago at MIT. The device senses subtle changes in wireless signals to determine if something is moving in the home. It can be set to different sensitivities so that it detects people but not animals. It can also be set to track specific cellphones so that you’ll know when a known person has entered or left the home. For now the device does not connect to external security services but instead sends a message to a smartphone.

Some German researchers at the University of Munich have already taken this same idea a lot farther. In a paper published in Physical Review Letters they describe a technique that uses WiFi to create 3D holographic images through walls. The lab unit they have built can detect objects down to about 4 centimeters in size. It scans ten times per second and can see the outlines of people or pets moving in another room. The technology is eerily reminiscent of the surveillance machine in The Dark Knight that Bruce Wayne destroys at the end of the movie because it was such a scary invasion of privacy.

Eliminating IoT Batteries. One of the scariest things about the exploding number of IoT devices is the need to power them, and the potentially huge waste, cost and hassle of needing batteries for tons of devices. Tryst Energy from the Netherlands has developed an extremely efficient solar device that needs only 200 lux of light for four hours per day to operate a small sensor that communicates over Bluetooth or WiFi. That is about the amount of light normally found under a desk. The device also ought to last for 75 – 100 years, opening up the ability to place small IoT sensors in all sorts of places to monitor things. When you consider the billions of devices expected over the next decade this could provide a huge boost to the IoT industry and also provide a green solution for powering tiny devices. The device is just going into production.

Bots Have Created Their Own Language. A team at OpenAI, the artificial intelligence lab founded by Elon Musk and Sam Altman, has published a paper describing how bots created their own language to communicate with each other. They accomplished this by presenting simple challenges that require collaboration to bots – computer programs that are taught to accomplish tasks. Bots are mostly being used these days to learn to communicate with people, but the OpenAI team instead challenged the bots to solve spatial problems such as devising a way to move together to a specific location inside a simulated world. Rather than tell the bots how to accomplish this, they simply required that the bots collaborate with other bots to accomplish the assigned tasks. What they found was that the bots created their own ‘language’ to communicate with each other and that the language got more efficient over time. This starts to sound like a bad sci-fi world where computers talk to each other in languages we can’t decipher.

Recycling CO2. Liang-shi Li at Indiana University has found a way to recycle CO2 for the production of power. He has created a molecule that, with the addition of sunlight, can turn CO2 from the atmosphere into carbon monoxide. The carbon monoxide can then be burned to create power, with the byproduct being CO2. If scaled up, this would provide a method of producing power that adds no net CO2 to the atmosphere (since it recycles the CO2). Li uses a nanographene molecule that has a dark color and absorbs large amounts of sunlight. The molecule also includes rhenium, which acts as the catalyst that turns nearby CO2 into carbon monoxide. He’s hoping to eventually accomplish this with magnesium, which is more easily obtained.

Liquid Light. It’s common knowledge that light usually acts like a wave, expanding outward until it’s reflected or absorbed by an object. But in recent years scientists have also discovered that under extreme conditions near absolute zero, light can act like a liquid and flow around objects and rejoin on the other side. The materials and processes used to produce this liquid light are referred to as Bose-Einstein condensates.

Scientists from CNR Nanotec in Italy, Ecole Polytechnique de Montreal in Canada, and Aalto University in Finland just published an article in Nature Physics showing that light can also exist in a ‘superfluid’ state in which it flows around objects with no friction. Of most interest is that this phenomenon can be produced at normal room temperature and air pressure. The scientists created the effect by sandwiching organic molecules between two highly reflective mirrors. They believe that the interaction of light with the molecules induces the photons to take on characteristics of the electrons in the molecules.

The potential uses for this technique, if perfected, are huge. It would mean that light could be made to pass through computer chips with no friction, meaning no creation of the heat that is the bane of data centers.

Latest Industry Statistics

The statistics are out for the biggest cable TV and data providers for the first quarter of the year and they show an industry that is still undergoing big changes. Broadband keeps growing and cable TV is starting to take some serious hits.

Perhaps the most relevant statistic of all is that there are now more broadband customers in the country than cable TV customers. The crossover happened sometime during the last quarter. This happened a little sooner than predicted due to plunging cable subscribers.

For the quarter the cable companies continued to clobber the telcos in terms of broadband customers. Led by big gains at Comcast and Charter, the cable companies collectively added a little over 1 million new broadband customers for the quarter. Charter led the growth with 458,000 new broadband subscribers, with Comcast a close second at 430,000.

Led by Frontier’s loss of 107,000 broadband customers, the telcos collectively lost 45,000 net customers for the quarter. Most of Frontier’s losses stem from its botched acquisition of the Verizon FiOS properties. Verizon lost 27,000 customers for the quarter, while AT&T U-verse was the only telco success, adding 90,000 new customers.

Looking back over the last year the telcos together lost 727,000 broadband customers while the cable companies together gained 3.11 million customers during the same period. The cable companies now control 63.2% of the broadband market, up from 61.5% of the market a year ago.

Overall the broadband market grew by 2.38 million subscribers over the year ending March 31. It’s a market controlled largely by the giant ISPs – the largest cable companies and telcos together account for 93.9 million broadband subscribers.

Cable TV shows a very different picture. The largest seven cable providers collectively lost 487,000 video subscribers for the quarter. That includes AT&T losing 233,000, Charter losing 100,000, Dish Networks losing 143,000, Verizon losing 13,000, Cox losing 4,000 and Altice losing 35,000. The only company to gain video subscribers was Comcast, which added 41,000.
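
As a quick sanity check, the sketch below adds up the quarterly changes quoted in this post (all values in thousands of subscribers). The numbers are taken directly from the preceding paragraphs; nothing else is assumed.

```python
# Reconciling the quarterly figures quoted above (thousands of subscribers).
telco_broadband = {"Frontier": -107, "Verizon": -27, "AT&T U-verse": +90}
print(sum(telco_broadband.values()))   # -44, close to the reported net loss of
                                       # 45,000 once the smaller telcos are included

video = {"AT&T": -233, "Charter": -100, "Dish Networks": -143,
         "Verizon": -13, "Cox": -4, "Altice": -35, "Comcast": +41}
print(sum(video.values()))             # -487, matching the reported 487,000 loss
```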

Total industry cable subscriber losses were 762,000 for the quarter, since smaller cable companies and telcos are also losing customers. That is five times larger than the industry loss of 141,000 in the first quarter of last year. The industry is now losing about 2.4% of the market per year, and that rate is clearly accelerating and will probably grow larger – the annual rate of decline is already significantly higher than last year’s rate of 1.8%.

At this point it’s clear that cord cutting is picking up steam and this was the worst performance ever by the industry.

The biggest losers have stories to explain their poor performance. Charter says it is doing better among its own historic customers but is losing a lot of customers from the Time Warner acquisition as it raises rates and does away with Time Warner promotional discounts. AT&T has been phasing out cable TV over its U-verse network, a DSL-based service with speeds as high as 45 Mbps that is proving inadequate to carry both cable TV and broadband together. Dish Networks has been bogged down in numerous carriage and retransmission fights with programmers and has had a number of channels taken off the air.

But even considering all of these stories it’s clear that customers are leaving the big companies. Surveys of cord cutters show that very few of them come back to traditional cable once they get used to getting programming in a different way.

What is probably most striking about the numbers is that the first quarter has historically been the best one for the cable industry – in recent years it has still seen customer gains even while other quarters trended downward. We’ll have to see what this terrible first quarter means for the rest of 2017.

Comparing Streaming and Broadcast Video

One thing that doesn’t get talked about a lot in the battle between broadcast TV and on-line video is video quality. For the most part today broadcast TV still holds the edge over on-line video.

When I think of broadcast TV over a cable system I can’t help but remember back twenty years ago when the majority of the channels on a cable system were analog. I remember when certain channels were snowy, when images were doubled with ghosts and when the first couple of channels in the lineup were nearly unwatchable. Today the vast majority of channels on most cable systems are digital, though there are still exceptions. The conversion to digital resulted in a big improvement in transmission quality.

When cable systems introduced HDTV the quality got even better. I can remember flipping back and forth between the HD and SD versions of the same channel on my Comcast system just to see the huge difference.

This is not to say that cable systems have eliminated quality issues. It’s still common on many cable systems to see pixelation, especially during high-action scenes where the background is constantly changing. Not all cable systems are the same, so there are differences in quality from one city to the next. All digital video on a cable system is compressed at the headend and decompressed at the settop box. That process robs a significant amount of quality from a transmission, and one only has to compare any cable movie to the Blu-ray version to realize how much is lost in the translation.

In the on-line world buffered video can be as good as cable system video. But on-line video distributors tend to compress video even more than cable systems do – something they can largely get away with since a lot of on-line video is watched on smaller screens. This means that a side-by-side comparison of SD or HD movies would usually favor the cable system. But Netflix, Amazon and a few others have one advantage today with the spectacular quality of their 4K video – there is nothing comparable on cable networks.

But on-line live-streamed video still has significant issues. I watch sports on-line and the quality is often poor. The major problems with live-streamed video come from delays in the signal getting to the user. Some of that delay is due to latency – either latency in the backbone network between the video source and the ISP or latency in the connection between the ISP and the end-user. Unlike downloading a data file, where your computer will wait until it has collected all of the needed packets, live-streamed video is rendered with whatever pixels have arrived by the needed time. This creates all sorts of interesting issues when watching live sports. For instance, there is pixelation, but it doesn’t look like the pixelation you see on a cable network. Instead, parts of the screen get fuzzy when they aren’t receiving all of their pixels. There are other visible glitches in the video as well. And it’s still not uncommon for the entire picture to freeze for a while, which causes an agonizing gap when you are watching sports since it always seems to happen at a critical moment.
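
Here is a toy sketch – not any real player implementation – of the difference described above between a buffered download that waits for every packet and a live stream that renders whatever has arrived by each frame deadline. All of the timing numbers are arbitrary assumptions chosen only to show the behavior.

```python
# Toy simulation: buffered download vs. live streaming under packet jitter.
import random

random.seed(1)

FRAMES = 10              # frames in the clip
PACKETS_PER_FRAME = 20   # packets needed to fully render one frame
FRAME_INTERVAL = 33.0    # ms between frame deadlines (~30 fps)
LIVE_BUFFER = 80.0       # ms of head start before the first live frame shows

def simulate_arrivals():
    """Return arrival times (ms) for every packet of every frame."""
    arrivals = []
    for frame in range(FRAMES):
        send_time = frame * FRAME_INTERVAL
        # base latency plus jitter; roughly 10% of packets arrive very late
        times = [send_time + random.uniform(20, 60) +
                 (random.random() < 0.1) * random.uniform(50, 200)
                 for _ in range(PACKETS_PER_FRAME)]
        arrivals.append(times)
    return arrivals

arrivals = simulate_arrivals()

# Buffered download: playback waits until the last packet has arrived,
# so every frame is complete but the viewer waits.
start_delay = max(max(times) for times in arrivals)
print(f"Buffered: start delay {start_delay:.0f} ms, every frame complete")

# Live stream: each frame is rendered at its deadline using only the
# packets that have arrived by then; the rest show up as fuzzy areas.
for frame, times in enumerate(arrivals):
    deadline = frame * FRAME_INTERVAL + LIVE_BUFFER
    received = sum(1 for t in times if t <= deadline)
    pct = 100 * received / PACKETS_PER_FRAME
    status = "complete" if pct == 100 else f"{pct:.0f}% of pixels"
    print(f"Live frame {frame}: {status}")
```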

Netflix and Amazon have been working with the Internet backbone providers and the ISPs to fix some of these issues. Latency in getting to the ISPs is shrinking and, at least for the major ISPs, will probably not remain an issue. But the problem that still needs to be resolved is what happens when demand gets too high. We see ISPs bog down when showing a hugely popular stream like the NBA finals, compared to a normal NBA game that might only be watched by a hundred thousand viewers nationwide.

One thing in the cable systems’ favor is that their quality ought to improve a lot over the next few years. The big cable providers will be implementing the new ATSC 3.0 video standard, which is going to result in a significant improvement in picture quality on HD video streams. The FCC approved the new standard earlier this year and we ought to see it implemented in systems starting in 2018. The new standard will allow cable operators to improve the color clarity and contrast of existing HD video. I’ve seen a demo of a lab version of the standard and the difference is pretty dramatic.

One thing we don’t know, of course, is how much picture quality means to the average video user. I know my teenage daughter seems quite happy watching low-quality video made by other teens on Snapchat, YouTube or Facebook Live, and many people, particularly teens, don’t seem to mind watching video on a smartphone. Video quality makes a difference to many people, but time will tell whether improved video quality can stem the tide of cord cutting. It seems that most cord cutters are leaving because of the cost of traditional TV and the hassle of working with the cable companies, and better video might not be a big enough draw to keep them paying the monthly cable bill.