Improving Our Digital Infrastructure, Part 2

Yesterday’s blog discussed the FCC’s recently published roadmap of how we can get broadband to rural America. That blog looked at the positive aspects of the plan, such as setting a goal to bring at least 25/3 Mbps broadband to almost every home in the nation. But there are a few aspects of that plan that I find troubling, and those are what I will examine today.

One of the biggest red flags in the paper is that the FCC suggests that federal funding for fiber be granted using a reverse auction to be awarded ‘at a national / regional level’. This means that the FCC leans towards giving the funding to the largest telcos in the same manner that they did with the CAF II funding. That would mean giant hand-outs to companies like Verizon, AT&T, CenturyLink and Frontier. The idea of a reverse auction is a good one, in that whoever asks for the least amount of support gets the federal funding. But if that funding is only awarded at a regional or national level, then by definition only these giant companies have any opportunity to win it. Smaller companies don’t have any realistic way to scale up to serve gigantic regional service areas.

We already can see the amazing strides made by independent telephone companies, cooperatives and others in bringing broadband to rural America. It would be a travesty to set the rules in such a way as to exclude these companies from a chance at the money (as already happened with CAF II). If you ask any rural customer if they would prefer to be served by a cooperative or AT&T – the answer is always going to favor the smaller local company rather than a national one.

The second aspect of the FCC roadmap that concerns me is that it refers in several places to providing fiber networks that are capable of supporting 5G. I am really hoping that the FCC is not envisioning that these many billions in funding will go toward building rural infrastructure just so the wireless companies can then offer 5G for the last mile instead of fiber-to-the-premise. I’ve probably written half a dozen blogs about 5G technology and I have serious technical doubts about it ever being a viable rural technology. And that means no amount of money can make it work if it’s the wrong technology for rural America. I hope I am wrong and that the FCC hasn’t bought into AT&T and Verizon’s vision of 5G as the ultimate network, but the wording in the FCC’s documents sets off a lot of red flags for me.

Another thing that makes me think that the FCC wants to give the money to the biggest companies is that they support using accelerated depreciation as a way to improve the financial viability of building rural infrastructure. Any smaller company that invests in fiber today already doesn’t pay taxes on that new investment for up to ten years due to the tax savings from existing depreciation rules. The only companies that would really get the full benefit of accelerated depreciation are the giant telcos that could use the tax savings from these new investments to shield profits from other parts of the company.

There is another part of the FCC document that has me scratching my head – where they discuss somehow using this new plan to support Smart Cities concepts. The first half of the FCC document talks about providing funding to build broadband for rural areas – and those are the last places where anybody will ever be thinking about smart city technologies. But perhaps the FCC is suggesting that the federal government should help to promote smart city concepts in addition to funding up to $80 billion for rural broadband. But I would hate to see money needed for rural America siphoned to the cities – not that they don’t need this, but it ought to be addressed in some other manner.

Another concept that puzzles me a bit is that the FCC suggests that tax credits could be used to encourage the repatriation of the trillions of dollars that US corporations are keeping overseas to avoid US taxes. Tax credits might be a great way in general to get that money reinvested back into the US. But I doubt that big companies like Apple and Google will want to invest in rural fiber with that money after they’ve repatriated it. Even with huge government support, rural broadband projects are only going to earn modest infrastructure-level returns, at best.

Finally, this proposal has one glaring omission. The FCC makes the blanket assumption that places with 25/3 Mbps broadband today are set for future broadband. I know that the cable and other networks in small-town America – in rural county seats, for example – are not at all the same as the networks in urban areas. While these smaller towns might have 25/3 broadband today, there is no guarantee they will ever be further upgraded – the return on making such upgrades in tiny markets is miniscule. This program could produce what I call reverse donuts – great broadband on the farms surrounding county seats that are left with slower and older technology. That would be an unusual outcome of a plan like this.

Bottom line for me is that the goals established by the FCC are great. But I think it would be a huge mistake if they end up handing billions only to the big telcos, especially if the end vision is to use 5G to serve rural America instead of fiber. There are aspects of this plan that sound like they were written by AT&T and Verizon. It’s impossible to know if this document represents the old or new FCC regime, but I didn’t hear any outcry on its release and I suppose it is the vision of Chairman Pai and staff.

Improving Our Digital Infrastructure, Part 1

Last week the FCC published a document that is their vision of a roadmap to improve the nation’s digital infrastructure. Today’s blog is going to look at the positive aspects of that roadmap and tomorrow I will look at some of the FCC’s ideas that I find to be troublesome.

I find this to be an interesting document for several reasons. First, it was published on Ajit Pai’s first day as FCC Chairman. It’s obvious that this paper has been under development for a while, but it clearly reflects the new Chairman’s views of the industry.

This paper is not so much a complete broadband plan as it is a roadmap of principles that the FCC supports to get broadband to rural areas. The FCC recognizes that they only have the power today to institute a few of the goals of this plan and that Congress would need to act to implement most of the suggestions in the plan.

The obviously good news about this document is that it clearly lays out the principle that rural America deserves to have real broadband that meets or exceeds the FCC’s 25/3 Mbps definition of broadband. This is a clear break from the FCC’s decision just a few years ago to fund the CAF II program which is spending $19 billion to fund rural broadband that only has to meet a 10/1 Mbps standard. One of my first thoughts in reading this document is that it seems likely that if this new roadmap is implemented the FCC would have to cancel the remainder of the CAF II deployment. It’s really too bad that the FCC didn’t support real bandwidth for rural America before tossing away money on the CAF II plan.

The FCC plan looks at bringing broadband to the 14% of the households in the country that don’t have broadband today capable of delivering 25/3 Mbps. The FCC estimates that it will cost roughly $80 billion to bring broadband to these areas. Interestingly, they estimate that it would take only $40 billion to reach 12 of the 14 percentage points, and that the last little sliver of the country would cost the remaining $40 billion. But the FCC’s goal is to find a way to get broadband to all of these places (except, I’m sure, for the most remote of the remote places).
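To put those estimates in perspective, here’s a quick back-of-the-envelope calculation using only the figures above (the split of the $80 billion and the 14 percentage points come straight from the FCC paper):

```python
# Back-of-envelope look at the FCC's rural broadband cost estimate.
# Figures from the FCC paper: 14% of households lack 25/3 Mbps service,
# $40 billion reaches the first 12 percentage points, and the final
# 2 percentage points cost the remaining $40 billion of the $80 billion total.

first_tranche_pct = 12      # percentage points of households reached
first_tranche_cost = 40e9   # dollars

last_tranche_pct = 2
last_tranche_cost = 40e9

cost_per_point_first = first_tranche_cost / first_tranche_pct
cost_per_point_last = last_tranche_cost / last_tranche_pct

print(f"First 12 points: ${cost_per_point_first / 1e9:.1f}B per percentage point")
print(f"Last 2 points:   ${cost_per_point_last / 1e9:.1f}B per percentage point")
print(f"The last sliver costs {cost_per_point_last / cost_per_point_first:.0f}x more per point")
```

That sixfold jump in cost per percentage point is exactly why the last sliver of the country is the part that no plan reaches easily.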

The paper calls for aggressive federal assistance in funding rural broadband. They recognize that there has not been commercial deployment in these areas because commercial providers can’t justify the investments due to the high cost of deployment. And so they suggest that the government should provide grants, loans and loan guarantees that are aggressive enough to improve the returns for private investment. They suggest that grants could be as high as 80% of the cost of deployment in the most remote places.

The paper suggests that most of the areas will have enough customer revenue to support the properties without further federal support. In looking at some of the business plans I have built for rural counties I think that they are probably right. What sinks most rural business plans is not the ongoing maintenance costs, but rather the heavy burden of debt and a return on equity during the first 10 years of deployment. Rural fiber deployment will look like a better financial opportunity if the government can find a way to provide enough up-front funding support. The FCC does recognize that the most rural markets in the country will require ongoing federal support to be viable. They suggest it will require about $2 billion per year in ongoing support, probably structured much like the Universal Service Fund works today.

The roadmap document also suggests other financial incentives to fiber builders such as faster depreciation, tax credits, and changes to the IRS rules which require today that grant funding be considered as income. That provision stopped a number of companies from accepting the stimulus funding a few years ago and is a definite roadblock to accepting grant funding.

Overall these are great goals. It’s going to require significant fiber in rural areas to meet the stated speed goals. It’s great to see the FCC change direction and suggest that rural America deserves real broadband. I just wish they had adopted this policy a few years ago rather than supporting the CAF II program that is throwing money at propping up rural DSL.

Catching Up On Small Cell Deployment

I remember going to a presentation at a trade show a few years back where there was great enthusiasm for the future of small cell sites for cellular networks. The panel, made up mostly of vendors, was predicting that within five years there would be hundreds of millions of small cells deployed throughout all of the urban areas of the US.

Small cells are supposed to relieve congestion from the larger existing cellular towers. They can be hung anywhere such as on light poles, rooftops, and even in manholes. They have a relatively small coverage area ranging from 30 to 300 feet depending upon the local situation.

But I recently saw that MoffettNathanson estimated that there have only been 30,000 small cells deployed so far. That’s obviously far smaller than the original projections, and it’s an interesting study in the dynamics of the telecom industry to ask why this didn’t go as planned. We’ve seen other examples before of new technologies that didn’t pan out as promised, so it’s a familiar story for those of us who have been following the industry for a while.

There are a number of different issues that have slowed down small cell deployment. One of the key ones is price since it can cost between $35,000 and $65,000 to get a small cell in place. That’s a steep price to pay for a small coverage area unless that area is full of people much of the day.
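Here’s a rough sense of just how steep that price is, using the 30 to 300 foot coverage radii mentioned above and the quoted cost range; treating the coverage area as a flat circle is my own simplification for illustration:

```python
import math

# Rough cost-per-coverage math for a small cell, using the figures above:
# an installed cost of $35,000 - $65,000 and a coverage radius of 30 - 300 feet.
# Treating coverage as a flat circle is a simplifying assumption.

def coverage_sq_ft(radius_ft):
    return math.pi * radius_ft ** 2

best_case = 35_000 / coverage_sq_ft(300)   # cheap site, big coverage
worst_case = 65_000 / coverage_sq_ft(30)   # expensive site, tiny coverage

print(f"Best case:  ${best_case:.2f} per square foot covered")
print(f"Worst case: ${worst_case:.2f} per square foot covered")
```

Even the best case only makes sense where the covered area is busy for much of the day.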

Another problem is that small cells need to be fiber fed and also need to have a source of reliable continuous power. Not surprisingly, that turns out to be a big issue in the crowded urban areas where the small cells make the most sense. It’s not easy, for example, to bring fiber to an existing light pole. And it’s often not even easy to bring reliable power to some of the best-suited cell locations.

The problem that surprised the cellular industry the most is getting permits to place the cell sites. Remember that these sites are deployed in the densest parts of big cities and many of those cities have a lot of rules about running new fiber or power lines in those areas. Some of the cellular companies have cited waits as long as two years for permitting in some locations.

Yet another problem is that the big cellular companies are having a hard time figuring out how to incorporate the new technology into their processes. The whole industry has grown up dealing with big cell towers, and all of the workflows and processes are geared towards working in the tower environment. I can’t tell you how many times I’ve seen big companies have trouble dealing with something new. It was the inability to change the existing workflows, for example, that led Verizon to basically start a whole new company from scratch when they launched FiOS.

And like any new technology, the small cells have not always delivered the expected performance. This has a few companies stepping back to assess if small cells are the right way to go. For instance, AT&T has largely stopped new small cell deployment for now.

The FCC recently took a stab at some new regulations that might make the permitting process easier. And the FCC just released a policy paper that promised to look at further easing the rules for deploying wireless technology and for getting onto poles.

The main reason that I’m following small cells is that the industry is on the cusp of implementing two new technologies that are going to face all of the same issues. It’s clear that 5G is going to need small cells if it is to handle the number of devices in a local area that has been hyped by the cellular companies. And Google, AT&T and others are looking at wireless local loop technologies that are also going to require small fiber-fed devices be spread throughout a service area. My gut feeling is that the same problems that have plagued small cell deployment are going to be a thorn for these new technologies as well – and that might mean it’s going to take a lot longer to deploy them than the industry is touting.

The Beginning of the End for HFC?

We’ve spent the last few years watching the slow death of telephone copper networks. Rural telcos all over the country are rapidly replacing their copper with fiber. AT&T has made it clear that they would like to get out of the copper business and tear down their old copper networks. Verizon has expressed the same but decided to sell a lot of their copper networks rather than be the ones to tear them down. And CenturyLink has started the long process of replacing copper with fiber and passed a million homes with fiber in urban areas in 2016.

Very oddly, the dying copper technology got a boost when the FCC decided to award money to the big rural copper owners like Frontier, CenturyLink and Windstream. These companies are now using CAF II money to try to squeeze one more generation of life out of clearly old and obsolete copper. Without that CAF II money we’d be seeing a lot more copper replacement.

I’ve been in the telco industry long enough to remember significant new telco copper construction. While a lot of the copper network is old and dates back to the 50s and 60s, there was still some new copper construction as recently as a decade ago, with major new construction before that. But nobody is building new telco copper networks these days, which is probably the best way to define that the technology is dead – although it’s going to take decades for the copper on poles to die.

This set me to thinking about the hybrid fiber-coaxial (HFC) networks operated by the cable companies. Most of these networks were built in the 60s and 70s when cable companies sprang up in urban areas across the country. There are rural HFC networks stretching back into the 50s. It struck me that nobody I know of is building new HFC networks. Sure, some cable companies are still using HFC technology to reach a new subdivision, but nobody would invest in HFC for a major new build. All of the big cable companies have quietly switched to fiber technology when they build any sizable new subdivision.

If telco copper networks started their decline when companies stopped building new copper networks, then we have probably now reached that same turning point with HFC. Nobody is building new HFC networks. What’s hanging on poles today is going to last for a while, but HFC networks will eventually take the same path into decline as copper networks.

There will be a lot of work and money poured into keeping HFC networks alive. Cable companies everywhere are looking at upgrades to DOCSIS 3.1 as a way to get more speeds out of the technology – much in the same way that DSL prolonged copper networks. The big cable companies, in particular, don’t want to spend the capital dollars needed to replace HFC with fiber – Wall Street will punish any cable company that tries to do so.

Cable networks have a few characteristics that give them a better life than telephone copper. Having the one giant wire in an HFC network is superior to having large numbers of tiny wires in a copper network which go bad one-by-one over time.

But cable networks also have one big downside compared to copper networks – they leak interference into the world and are harder to maintain. The HFC technology uses radio waves inside the coaxial cable as the method to transmit signal. Unfortunately, these radio waves can leak out into the outside world at any place where there is a break in the cable. And there are huge numbers of breaks in an HFC network – one at every place where a tap is placed to bring a drop to a customer. Each of the taps and other splices in a cable network is a source of potential frequency leakage. Cable companies spend a lot every year cleaning up the most egregious leaks – and as networks get older they leak more.

Certainly HFC networks are going to be around for a long time to come. But we will slowly start seeing them replaced with fiber. Altice is the first cable company to say they will be replacing their HFC network with fiber over the next few years. I really don’t expect the larger cable companies to follow suit and in future years we will be deriding the networks used by Comcast and Charter in the same way we do old copper networks today. But I think that somewhere in the last year or two we saw the peak of HFC, and from that point forward the technology is beginning the slow slide into obsolescence.

Checking in on Internet2

I recently checked in with Internet2 to see what they are up to these days. For those of you not familiar with Internet2, it’s a high-speed data network operated by and for the benefit of many colleges and universities.

Internet2 came about due to the fact that universities were unable to buy the commercial bandwidth they needed. In the early 90s universities were among the largest data users in the country. Universities often collaborate on science research and they wanted to be able to transmit huge data files for research on things like particle physics and medical imaging. There were also supercomputers at some campuses that other schools wanted to use.

And so the universities started looking for solutions to get fast data pipes between campuses. The first attempt at this was funded by the National Science Foundation in 1995 and was called vBNS. In 1997 the effort was picked up by EDUCOM, a consortium that was organized by the non-profit University Corporation for Advanced Internet Development. This corporation changed its name to Internet2 and worked with Qwest to build and assemble a 10 Gbps network in 1998. You have to put that speed into perspective. Today, amazingly, there are a handful of residential customers in the country with 10 Gbps connections. But in 1998 that speed was cutting edge. Internet2 subsequently began working with Level3 and upgraded the network to 100 Gbps in 2007.

The organization has grown over the years. The original project was funded by 34 universities. The network has since expanded and today comprises 252 universities, 82 corporations that collaborate in research, 68 large government affiliate members such as federal agencies, 41 regional and state education networks and 65 other research and educational networking partners around the world.

Internet2 has also reached out through these many connections into communities all over the country. The network (through the affiliated regional partners) now connects to over 60,000 anchor institutions that include primary and secondary schools, community colleges and universities, public libraries, museums and health care organizations.

The organization has also taken on other roles. In the early years the corporation was focused on assembling the network needed to connect to institutions. The network has been assembled through long-term IRU agreements for access to fiber. The second phase of the roll-out was to form the many alliances with state and regional government networks to extend the reach of the Internet2 bandwidth.

But now the organization is focusing on other ways to benefit members. For example, it has been negotiating cheap pricing for cloud services since many of the member institutions are moving functions into the cloud. The Internet2 staff also has begun negotiating bulk pricing for other telecom services like VoIP, commonly used software, etc.

Internet2 is probably the biggest example I know of what I call a service organization. This has always been a popular model in rural America and there are numerous service organizations that operate or assist rural electric companies or rural cooperatives of all types. There are also a number of statewide fiber networks owned by independent telephone companies that are smaller clones of Internet2. The one thing I’ve always seen is that the roles of service organizations always grow over time as members find more and more things that they can do more affordably as a group. So I would expect the Internet2 crew to be asked to tackle even more tasks in the future as new technologies emerge that are needed by its university members.

There is one area where Internet2 has not yet extended its cheap bandwidth – rural broadband networks. One of the biggest hurdles that rural America still faces is getting access to affordable bandwidth. Often, the only way to buy bandwidth in a rural area is at premium prices from the incumbent telco or cable company. But the Internet2 network already extends to many of these rural towns and counties. I routinely find when I visit a rural community that the only cheap bandwidth in town is at the library or at city hall. But this ‘government’ bandwidth is never made available to serve the residents of these communities.

There would be a huge social benefit if Internet2 and its affiliates would allow rural ISPs to become anchor institutions on the extensive Internet2 network. The folks at Internet2 tell me that this is something they have considered, and I hope they give it more thought. They have an opportunity to help rural America in the same way they have benefitted our schools and universities – and rural America sure could use the help.

The Resurgence of Rabbit Ears

There is perhaps no better way to understand the cord cutting phenomenon than by looking at the booming sales of the home TV antennas known as ‘rabbit ears’ used to receive local television off the airwaves. A study released by Parks Associates shows that 15% of households now use rabbit ears, and that is a pretty amazing statistic. That is up from 8% of households as recently as 2013. And I recall an earlier time when this had fallen below 5%.

For the longest time the TV-watching public was counted in three groups – those who had cable TV (including satellite), those that used rabbit ears to watch local TV only, and those with no TV. We now have a fourth category – those that only watch OTT programming such as Netflix.

I was once in the category of not watching TV at all. I remember twenty years ago I went to Circuit City (now gone) to consider buying a set of rabbit ears and the clerks there weren’t even sure if the store carried them. With some asking around they found that they had a few units of one brand that had been gathering dust.

But today there is a resurgence in rabbit ears and there are easily a dozen major brands. And there are new rabbit ear options coming on the market all of the time. For example, Sling TV just launched AirTV, a $99 box that integrates Sling TV, Netflix and high-quality rabbit ears together with a voice-activated remote control that makes it easy to cut the cord. This looks to be one of the better voice-activation systems around and lets you search programming options by the names of shows, actors’ names or genres of programming.

Since most people have had cable TV for a long time, many have no idea what they can receive off air for free. The FCC has an interesting map that shows you the expected reception in your area. In my case the map shows that I can get a strong signal from every major network including CW and PBS along with signals from MyTV, Univision and a few independent local stations.

The Parks study also looks at other industry statistics. A few of the most interesting ones include:

  • Penetration of pay-TV was down to 81% in 2016 and has fallen every year since 2014. Parks cites the normal reasons for the decline including the growth of OTT programming, the increasing cost of a cable TV subscription and growing consumer awareness that there are viable alternatives to cable TV.
  • Satisfaction with pay-TV keeps dropping and only one-third of households now say that they are very satisfied with their pay-TV service.
  • OTT viewing continues to rise and 63% of US households now subscribe to at least one OTT offering like Netflix while 31% of households subscribe to more than one.
  • In 2016 12% of households downgraded their pay-TV service (meaning dropped it or went to a less expensive option). This was double the percentage (6%) who upgraded their pay-TV service in 2016.
  • Very few cord nevers (those who have never had cable TV) are deciding to buy pay-TV, with only 2% of them doing so in 2016. This is the statistic that scares the cable companies because cord nevers include new Millennial households. This generation is apparently not interested in being saddled with a pay-TV subscription. In past generations the percentage of new homes that bought pay-TV closely matched the overall penetration of the market – buying TV was something you automatically did when you moved to a new place.

These statistics show how much choice the OTT phenomenon has brought to the marketplace. Ten years ago no industry experts were predicting a resurgence of rabbit ears. In fact, rabbit ears were associated with other obsolete technologies like buggy whips and were used as the butt of jokes to make fun of those who didn’t like the modern world. But this is no longer true, and new rabbit ear homes are perhaps some of the most tech-savvy households – ones that know they can craft an entertainment platform without sending a big check to a cable company.

 

Continued Cable Rate Increases

Every year about this time we see the big cable companies increase rates. Rather than list all of the changes at all of the big companies, I’m going to look at the rate increases announced by Mediacom. I’m not particularly singling them out, but the increases they are implementing this year are typical, and I think Mediacom reflects the trends we are now seeing around the industry.

First, Mediacom increased the rate on its basic internet product while bumping the speeds higher. They are increasing the price from $49.99 to $54.99 and increasing speeds from 15 Mbps download to 60 Mbps download. Both of those changes are trends I see across the industry.

Mediacom claims the increase in data prices is due to the increased speeds. Anybody who understands the ISP industry knows that is a fairly lame excuse, but the speed increase gives them cover to make the claim. The cost of providing bandwidth (except for the tiniest and most remote ISPs) is only a few dollars per month per customer. Mediacom will certainly see a one-time burst in the use of data because of the speed increases since they will have unleashed homes that were restricted by the slower 15 Mbps limit. But after this one-time burst they will see usage return to the former growth curve. I have a client that did a similar upgrade last year and they saw about a 25% immediate increase in data usage as customers moved to the faster speeds. But that one-time increase doesn’t cost the ISP very much money.
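As a rough sanity check on that claim, here’s a small illustration; the $3 per month bandwidth cost is just a stand-in for the ‘few dollars per month’ figure above and is an assumption, not a Mediacom number:

```python
# Illustrative only: why a one-time usage burst barely moves ISP costs.
# The $3/month bandwidth cost stands in for the "few dollars per month
# per customer" cited above; it is an assumption, not a Mediacom figure.

price_before = 49.99
price_after = 54.99
bandwidth_cost_per_month = 3.00   # assumed cost of providing bandwidth
usage_increase = 0.25             # one-time ~25% jump after the speed upgrade

added_cost = bandwidth_cost_per_month * usage_increase
added_revenue = price_after - price_before

print(f"Added monthly cost per customer:    ${added_cost:.2f}")
print(f"Added monthly revenue per customer: ${added_revenue:.2f}")
```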

But Mediacom’s increase reflects the new trend in the industry of raising data prices. Verizon started this a few years ago with a small increase in FiOS and almost all large ISPs have now done the same thing. But unlike cable rate increases that are due to programming cost increases, ISPs have a much harder time defending data price increases. Yet the big ISPs are under constant pressure from Wall Street to increase the bottom line – and since they are all losing cable customers they have little recourse but to raise data rates. I expect this will become a routine annual increase like we’ve always seen in cable rates – but the public is going to catch on after a while that the rate increases go straight to profits and are not due to any underlying cost of providing data.

The other trend Mediacom is matching is the one to increase data speeds. Just last week, for example, Verizon FiOS raised speeds across the board with its fastest product now at 750 Mbps. Comcast and Charter are on a similar path, increasing speeds in some markets with plans to increase speeds everywhere. I think the cable companies have figured out that increasing speeds doesn’t cost them much and it keeps customers happy while fending off possible challenges from fiber competitors.

Mediacom also increased cable TV rates and announced an increase of $3.95 in its expanded basic package. They claim that programming costs have increased during the year by $5.50 to $6.50 (they have different line-ups in different markets). Those are not untypical numbers. But they claimed that “We are not passing along that entire increased expense to customers.” And that is not true.

In addition to the basic rate increase the company increased two other fees. They increased the ‘Local Broadcast Channel’ fee by $1.61 per month and raised the regional sports channel surcharge by $0.24 per month. So altogether they raised the various components of cable rates by $5.80, which likely covers the increased programming costs.
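Adding up the announced pieces makes the math plain:

```python
# Summing Mediacom's announced cable-side increases.
expanded_basic_increase = 3.95
broadcast_fee_increase = 1.61
sports_surcharge_increase = 0.24

total_cable_increase = (expanded_basic_increase
                        + broadcast_fee_increase
                        + sports_surcharge_increase)
print(f"Total cable rate increase: ${total_cable_increase:.2f} per month")

# Compare against the claimed programming cost increase of $5.50 - $6.50.
print(5.50 <= total_cable_increase <= 6.50)
```

The $5.80 lands squarely inside the claimed $5.50 to $6.50 programming increase, which is why the claim about not passing along the entire expense doesn’t hold up.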

This trend of disguising prices using fees is also now industrywide. Cable companies of all sizes have moved part of their cable rates into these ancillary fees, making it look like cable is more affordable than it is. The companies advertise the base ‘cable’ rates without mentioning the real cost of buying cable. These fees are confusing to customers, who often think that they are taxes of some sort. In Mediacom’s case the Local Broadcast Channel fee and the regional sports channel surcharge now total an eye-popping $11.83 per month.

And of course, not all of these rate increases affect all customers immediately. There are customers with various specials who might not see some or all of these increases until their special expires. But a normal customer paying a month-to-month bill is going to get a total increase of $6.85 per month. That looks to be a fairly normal increase when looking around the industry this year. Each company is making different choices on the rates to raise. In addition to the choices Mediacom made, some are increasing set-top box and cable modem rates to get the increases they want. Since a significant percentage of customers buy both cable TV and broadband, I’m guessing that customers don’t much care which rates go up; they just care about the total increase in their bill.

A Year of Changes

I can’t recall a time when there were so many rumors of gigantic changes in the telecom industry swirling around at the same time. If even half of what is being rumored comes to pass this might be one of the most momentous years in the history of telecom. Consider the following:

Massive Remake of the FCC.  Ajit Pai has been named as the interim head of the FCC, but it’s been said that the president is already referring to him as the Chairman. We know that Pai was against almost every initiative of the Wheeler FCC and there are expectations that things like net neutrality and the new privacy rules will be reversed or greatly modified.

There are also strong rumors in the industry that the new administration is going to follow the advice of the transition telecom team of Jeff Eisenach, Roslyn Layton and Mark Jamison. That team has proposed the following:

  • A reapportionment of ‘duplicative’ functions at the FCC. Functions like fostering competition and consumer protection, for example, would be moved to the Federal Trade Commission.
  • A remake of telecom rules to remove ‘silos.’ For as long as I can remember we’ve had separate rules for telcos, cable companies, wireless companies and programmers. That probably made sense when these were separate industries, but today we see all of these business lines about to converge within the same corporation like Comcast or AT&T. The transition team says it’s time to change the rules to reflect the reality of technology and the marketplace.

At this point I’ve not seen any specific proposals on what those streamlined rules might be. And Congress will have to take an active role in any changes since the current FCC responsibilities are the results of several major telecom and cable acts.

Verizon Looking to Buy a Cable Company. It’s been reported that Lowell McAdam, the CEO of Verizon, has told friends that the company will be looking for a cable acquisition to boost demand for its wireless data. McAdam also talked to analysts in December and described how Charter might be a natural fit with Verizon. There is also speculation on Wall Street that Comcast could be the target for Verizon.

Mergers of this size are unprecedented in the industry. Charter has over 20 million residential data customers and is second behind Comcast’s 23 million data customers. And both companies now have a significant portfolio of business customers.

I remember a decade ago when AT&T started buying back some of the RBOCs that had splintered off during divestiture back in 1984. We all joked that they were slowly putting Ma Bell back together. But I don’t think anybody ever contemplated that the biggest telcos would ever merge with the cable companies. That would remove the last pretense that there is any competition for broadband in urban areas.

More Merger Mania. At one point it looked like the new administration would be against the AT&T and Time Warner merger. But Wall Street now seems to be convinced the merger will happen. The merger will likely come with the typical list of conditions, but we know from past experience that such conditions are only given lip service. AT&T has already taken a strong position that the merger doesn’t need FCC approval. That would mean that most of the government analysis would come from the Justice Department. Just like with the rumored Verizon acquisitions, this merger would create a giant company that operates in all of the FCC-controlled silos. We don’t really have an effective way today to regulate such giant companies.

Verizon might need to hurry if it wants to buy a giant cable company since there is a rumor that Comcast, Charter and Cox plan to go together and buy T-Mobile. That makes a lot more sense than for those companies to launch a wireless company using the Verizon or AT&T platform. Such an arbitrage arrangement would always allow the wireless companies to dictate the terms of using their networks.

The Declining Search Engine?

There is a subtle battle going on for control of the web. The web as we have come to know it is built upon the search engine. Those who’ve been on the web long enough remember search engines like Archie, Excite, Aliweb, Infoseek, AltaVista and Ask Jeeves. Today Google dominates the search market along with others including Bing, Yahoo and DuckDuckGo. Search engines operate by the simple premise of ‘crawling’ through publicly available web spaces and categorizing web pages by topics that can be searched.
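For anyone who has never looked under the hood, here’s a bare-bones sketch of that crawl-and-index idea. It’s a toy, not how any production search engine actually works (no ranking, no robots.txt handling, no politeness delays), and the seed URL is just a placeholder:

```python
# Toy illustration of the crawl-and-index idea behind search engines.
import re
from collections import defaultdict
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects visible text and outgoing links from one HTML page."""
    def __init__(self):
        super().__init__()
        self.text = []
        self.links = []

    def handle_data(self, data):
        self.text.append(data)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=5):
    """Breadth-first crawl that builds an inverted index: word -> set of URLs."""
    index = defaultdict(set)
    queue, seen = [seed_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        except Exception:
            continue
        parser = PageParser()
        parser.feed(html)
        for word in re.findall(r"[a-z]+", " ".join(parser.text).lower()):
            index[word].add(url)
        queue.extend(urljoin(url, link) for link in parser.links)
    return index


if __name__ == "__main__":
    idx = crawl("https://example.com")       # placeholder seed URL
    print(sorted(idx.get("example", set())))   # pages containing the word 'example'
```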

But we might have reached the point in the life of the web where the search engine will decline in value. This is because search engines rely on public content and an ever-increasing amount of our collective information is now being created for and stored in the dark web. This term refers to networks which use the public internet but which require specific software to access.

The best examples of the dark web are social media sites like Facebook, Twitter and LinkedIn. There are huge amounts of content now created for and stored inside of these platforms. Facebook is the best example of this. There are now many businesses that no longer have a web site but which instead have a Facebook page. There are many organizations that do the same and that communicate with members only through a Facebook group. Every social media site has something similar. For instance, there are now thousands of articles that are written for and published within LinkedIn.

But the dark web includes a lot more than just social media. Corporations and trade associations now routinely keep information hidden from the general public by requiring password access to read whatever they have published. You can understand the motivation behind this – a trade association might have more luck recruiting members if membership includes access to unique content.

But corporations also hide a lot of content that used to be public. For example, cable companies like Comcast require a customer to enter a valid address before showing them current products and prices. Those prices used to be openly published, but the act of asking for an address now ‘hides’ this information from search engines. Comcast certainly wants this information hidden since they don’t offer the same prices everywhere.

While the information behind corporate and trade association web sites is already unavailable to search engines, for now the information behind most social media sites is still searchable. But there is nothing that requires it to stay that way. Google and Facebook are now engaged in a fierce battle to win web advertising and there is nothing to stop Facebook from flicking a switch and hiding all of its content from Google search. To do so would instantly devalue Google since businesses listed only within Facebook would disappear from the search engine results.

It almost seems inevitable that this day will come, and probably not long from now. Both Google and Facebook face pressure from stockholders to continue to grow and both businesses are fueled largely by advertising revenue. It’s hard to think that at some point Facebook won’t deliberately try to gain an edge in this battle.

But the consequence of the dark web for all of us is an ever-increasing lack of access to information. I remember spending evenings in the early days of the web just browsing for interesting content. It seemed like every college and high school pushed huge amounts of interesting content onto the web – because in those early days that’s what the web was all about. All of the content from textbooks and homework assignments was on the open web for anybody to read. But there is no longer any major motivation to push information to the web. And many schools and universities are now behind a dark web wall as well.

The web has shifted massively towards entertainment and content is no longer king. Most of the early content-heavy web sites have died since they’ve either been pulled down or are no longer maintained. And so every day more content is removed from the web, and every day search engines become a little less valuable to the human race.

We may already be in a world where there is more useful data on the dark web than on the open one. Facebook and a few others could push the search engines onto the road to decline. History has already shown us that search engines can come and go. Excite was one of the first web companies to sell for billions, but the company that bought it, @Home, went bankrupt. There is nothing to say that the Google search engine or any other is something that we can always rely on. And in fact, they may just fade away someday due to irrelevance.

2017 Technology Trends

I usually take a look once a year at the technology trends that will be affecting the coming year. There have been so many other topics of interest lately that I didn’t quite get around to this by the end of last year. But here are the trends that I think will be the most noticeable and influential in 2017:

The Hackers are Winning. Possibly the biggest news all year will be continued security breaches that show that, for now, the hackers are winning. The traditional ways of securing data behind firewalls are clearly not effective, and firms from the biggest with the most sophisticated security to the simplest small businesses are getting hacked – and sometimes the simplest methods of hacking (such as phishing for passwords) are still effective.

These things run in cycles and there will be new solutions tried to stop hacking. The most interesting trend I see is to get away from storing data in huge databases (which is what hackers are looking for) and instead distributing that data in such a way that there is nothing worth stealing even after a hacker gets inside the firewall.
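One simple version of that idea is to split each sensitive value into random shares kept in separate stores, so that no single store holds anything worth stealing. Here’s a minimal sketch of that approach using XOR-based secret splitting; it’s just one illustrative technique, not a specific product or a complete answer to the problem:

```python
# Minimal sketch of splitting a secret so no single data store holds anything useful.
# XOR-based two-way secret splitting: either share alone is indistinguishable
# from random bytes; both shares are needed to reconstruct the value.
import secrets


def split(value: bytes) -> tuple[bytes, bytes]:
    pad = secrets.token_bytes(len(value))                 # random share for store A
    masked = bytes(v ^ p for v, p in zip(value, pad))     # masked share for store B
    return pad, masked


def combine(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))


if __name__ == "__main__":
    ssn = b"123-45-6789"                    # example sensitive value
    store_a, store_b = split(ssn)           # kept in two separately secured systems
    assert combine(store_a, store_b) == ssn
    print("Shares held separately:", store_a.hex(), "|", store_b.hex())
```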

We Will Start Talking to Our Devices. This has already begun, but this is the year when a lot of us will make the change and start routinely talking to our computers and smart devices. My home has started to embrace this and we have different devices using Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa. My daughter has made the full transition and now talks to text instead of typing on the screen, but us oldsters are catching up fast.

Machine Learning Breakthroughs will Accelerate. We saw some amazing breakthroughs with machine learning in 2016. A computer beat the world Go champion. Google Translate can now accurately translate between a number of languages. Just this last week a computer was taught to play poker and was playing at championship level within a day. It’s now clear that computers can master complex tasks.

The numerous breakthroughs this year will come as a result of having the AI platforms at Google, IBM and others available for anybody to use. Companies will harness this capability to use AI to tackle hundreds of new complex tasks this year and the average person will begin to encounter AI platforms in their daily life.

Software Instead of Hardware. We have clearly entered another age of software. For several decades hardware was king and companies were constantly updating computers, routers, switches and other electronics to get faster processing speeds and more capability. The big players in the tech industry were companies like Cisco that made the boxes.

But now companies are using generic hardware in the cloud and are looking for new solutions through better software rather than through sheer computing power.

Finally, a Start for Telepresence. We’ve had a few unsuccessful shots at telepresence in our past. It started a long time ago with the AT&T video phone. But then we tried using expensive video conference equipment and it was generally too expensive and cumbersome to be widely used. For a while there was a shot at using Skype for teleconferencing, but the quality of the connections often left a lot to be desired.

I think this year we will see some new commercial vendors offering a more affordable and easier to use teleconferencing platform that is in the cloud and that will be aimed at business users. I know I will be glad not to have to get on a plane for a short meeting somewhere.

IoT Technology Will Start Being in Everything. But for most of us, at least for now, it won’t change our lives much. I’m really having a hard time thinking I want a smart refrigerator, stove, washing machine, mattress, or blender. But those are all coming, like it or not.

There will be More Press on Hype than on Reality. Even though there will be amazing new things happening, we will still see more press about technologies that are not here yet than about those that are. So expect mountains of articles on 5G, self-driving cars and virtual reality. But you will see fewer articles on the real achievements, such as how a company reduced paperwork 50% by using AI or how the average business person saved a few trips due to telepresence.