Broadband and Medicine

I think it’s been at least fifteen years since I first began hearing that one of the major benefits of broadband will be improved health care. Yet, except for a few places that are doing telemedicine well, none of this has yet come to pass for the average person. But I now think we are at the cusp of finally seeing medical applications that will need broadband. Following are some areas where we ought to soon see real applications:

Letting the Elderly Stay in Their Homes Longer. This is the holy grail of future medicine products because surveys have shown that a huge majority of Americans want to stay in their homes as long as possible, and to die in their homes when it’s time. No single solution can solve this problem; it will take a whole suite of technologies and solutions working together – and the good news is that there are now more than a hundred companies looking for ways to make this work.

All solutions for the elderly begin with smart monitoring systems. This means video cameras and sensors of all sorts that look for problems. Medical monitors will track vital signs. Smart sensors will track an elderly person and alert somebody if that person doesn’t move for a while. Reminder systems will make sure medications are taken on time. Virtual reality will help homebound elderly keep in touch with caregivers and have an active social life from home. Robots can help with physical tasks. The key to a good product is one that ties all of these things together into a package that people can afford (or that is at least less costly than the alternatives). My guess is that we are only a few years away from these packages finally being a reality.
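
To make the monitoring idea concrete, here is a minimal sketch of the kind of inactivity alert such a package would run. All of the details – the check interval, the two-hour threshold, and the notify_caregiver stub – are illustrative assumptions, not any vendor’s actual product logic.

```python
import time

INACTIVITY_LIMIT_SECONDS = 2 * 60 * 60   # assumed two-hour threshold

def notify_caregiver(message: str) -> None:
    # Stand-in for a real alerting channel (text, phone call, monitoring center).
    print(f"ALERT: {message}")

def monitor(motion_sensor, check_every: int = 60) -> None:
    """Raise an alert if no motion has been detected for too long."""
    last_motion = time.time()
    while True:
        if motion_sensor():                      # True when movement is seen
            last_motion = time.time()
        elif time.time() - last_motion > INACTIVITY_LIMIT_SECONDS:
            notify_caregiver("No movement detected for over two hours.")
            last_motion = time.time()            # avoid repeating the same alert
        time.sleep(check_every)
```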

Medical Diagnosis with Artificial Intelligence. IBM’s Watson has already demonstrated that it is better than most doctors and nurses at diagnosing medical conditions, with the added benefit that Watson often catches rare diagnoses that doctors tend not to consider. There are already a number of companies working on integrating this into clinics, but this is also going to be taken online so that patients can be screened before even coming to see a doctor. IBM isn’t the only possible solution; companies like Google and Microsoft are now selling time on their AI platforms.

Virtual Reality and Telemedicine. One of the biggest drawbacks to telemedicine today is that a doctor can’t get as good a look at a patient as they can in a live visit. But with big bandwidth and virtual reality technology, doctors will soon be able to see patients in 3D and in close-up detail, which is going to make telemedicine a lot more accurate and usable. Combining this technology with some sort of medical monitor to supply vital signs could allow for easy treatment of most routine problems. But this is going to require big bandwidth at homes as well as a big data pipe between the remote community and the doctors.

Nanobots. A lot of future treatment of diseases is going to involve nanobots in the bloodstream. These will be tiny devices that deliver medicine specifically to the areas of the body that need it, that are engineered to attack specific viruses or germs, or that closely monitor ongoing health issues. There are researchers who believe that we will carry nanobots with us at all times – to fend off cancer, to treat diseases like the common cold before we have any symptoms, to rejuvenate cells, and to act as an early warning system for anything unusual. There are already nanobot treatments for cancer being tested. We clearly will need to monitor nanobots, and that means a reliable broadband connection and specific kinds of sensors.

Getting Access to Conduit

There is an interesting case at the California Public Utilities Commission where Webpass is fighting with AT&T over access to conduit. You may have seen that Webpass was recently bought by Google Fiber, and I would think this case will be carried forward by Google.

The right for competitive providers to get access to conduit comes from the Telecommunications Act of 1996. In that Act, Congress directed that competitive telecom providers must be provided access to poles, ducts, conduits, and rights-of-way by utilities. A utility is defined as any company, except for electric cooperatives and municipalities, that owns any of those facilities that are used in whole or in part for communications by wire. Under this definition telcos, cable companies, commercial electric companies, gas companies, and others are required by law to make spare conduit available to others.

If a utility allows even one pole or piece of conduit to be used for communications, including for its own internal purposes, then the whole system must be made available to competitors at fair prices and conditions. About half of the states have passed specific rules governing those conditions while states without specific rules revert to the FCC rules.

Webpass tried to get access to AT&T conduits in California and ran into a number of roadblocks. There appear to be a few situations where AT&T has provided conduit to Webpass, but AT&T denied the majority of the requests for access.

This is not unusual. Over the years I have had several clients try to get access to AT&T and Verizon conduit and none of them were successful. AT&T, Verizon, and the other large telcos generally have concocted internal policies that make it nearly impossible to get access to conduit. When a competitor faces that kind of intransigence their only alternative is to take the conduit owner to court or arbitration – and small carriers generally don’t have the resources for this kind of protracted legal fight.

But even fighting the telcos is no guarantee of success because the FCC rules provide AT&T with several reasons to deny access. A utility can deny access on the basis of safety, reliability, or operational concerns. So even when a conduit owner that has invoked one of these reasons is ordered to provide access, it can simply invoke one of the other exceptions and begin the whole fight again. It takes a determined competitor to fight through such a wall of denial.

Trying to get conduit reminds me of the battles many of my clients fought in trying to get access to dark fiber fifteen years ago. I remember that AT&T and Verizon kept changing the rules of the dark fiber request process so often that a competitor had a difficult time even formulating a valid request for dark fiber. Even when Commissions ordered the telcos to comply with dark fiber requests, the telcos usually found another reason to deny the requests.

This is a shame because getting access to conduits might be one of the best ways possible to promote real competition. AT&T and Verizon both claim to have many hundreds of thousands of miles of fiber, much of it in conduit. I am sure there are many cases where older conduit is full. But newer conduits contain multiple empty tubes, and one would have to think that there is a huge inventory of empty conduit in the telco networks. The same is true for the cable companies and the large electric companies, and I can’t recall any small carrier that has ever gotten access to any of this conduit. I think some of the large carriers like Level3 or XO probably have gotten some access to conduit, but I would imagine even they had to fight very hard to get it.

I remember talking to a colleague the day that we first read the Telecommunications Act of 1996 that ordered the telcos to make conduit available to competitors. We understood immediately that the telcos would adopt a strategy of denying such access – and they have steadfastly said no to conduit requests over the years. I am glad to see Webpass renewing this old fight and it will be interesting to see if they can succeed where others have failed.

The End of Moore’s Law

I’ve been meaning to write this blog for a while. It is now commonly acknowledged that we are nearing the end of Moore’s law. Moore’s law is named after Gordon Moore, an engineer who later co-founded Intel. In 1965, Moore made the observation that the number of transistors that could be etched onto a chip would double every two years. He originally thought this would last for a decade or so, but the microchip industry has fulfilled his prediction for over 50 years now.

In 1965 a single transistor cost about $8 in today’s dollars and now, after so many years of doubling, we can put billions of transistors onto a chip, at a tiny fraction of a cent each. It was the belief that chips could continue to improve that helped to launch Silicon Valley, and that enabled the huge array of technological changes that have been brought about by cheap computer chips.
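
The arithmetic behind that claim is easy to check with the back-of-the-envelope sketch below. It just compounds the doublings; the 2016 endpoint is an assumption for illustration, and the $8 starting cost is the figure quoted above.

```python
# Back-of-the-envelope check on Moore's law as described above:
# transistor counts doubling roughly every two years since 1965.

start_year, end_year = 1965, 2016   # end year is an assumption for illustration
doubling_period_years = 2
cost_per_transistor_1965 = 8.00     # dollars, the figure cited above

doublings = (end_year - start_year) / doubling_period_years
growth_factor = 2 ** doublings

print(f"Doublings since {start_year}: {doublings:.1f}")
print(f"Growth in transistor count: roughly {growth_factor:,.0f}x")
print(f"Implied cost per transistor: ${cost_per_transistor_1965 / growth_factor:.9f}")
```

Twenty-five or so doublings works out to a factor of tens of millions, which is how an $8 transistor becomes a tiny fraction of a cent.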

The companies that make chips have thrived by creating a new generation of chips every few years that represented a significant leap forward in computing power. I think every adult understands the real life consequences of these changes – we’ve all been through the cycle of having to upgrade computers every few years, and more recently of having to upgrade cellphones. Each subsequent generation of PC or smartphone was expected to be considerably faster and more powerful.

But we are starting to reach the end of Moore’s law, mostly driven by limits of physics and the size of atoms. It now looks like there will be better chips perhaps every three years. And within a decade or so Moore’s law will probably come to an end. There may be faster and better computers developed after that point – but improvements will have to come from somewhere other than cramming more transistors into a smaller space.

There are researchers looking to improve computers in other ways – through better software or through chip designs that can be more efficient with the same number of transistors. For instance, IBM and others have been working on chips built from stacked layers of circuitry – essentially a 3D chip. And there has been a lot of research into using light instead of electricity to speed up the computing process.

We are already starting to see the result of the slowdown of Moore’s law. The PC and tablet industries are suffering because people are hanging onto those devices a lot longer than they used to. Apple and Samsung are both struggling due to a drastic reduction in the sale of premium smartphones – because new phones are no longer noticeably better than the old ones.

Faster chips also fueled a lot of other technologies, including many in the telecom world. Faster chips have brought us better and faster servers, routers, and switches. Better chips have led to improved generations of fiber optic gear, voice switches, cable TV headends, settop boxes – basically every kind of telecom electronics. No doubt these technologies will keep improving, but soon the improvements won’t be from faster and more powerful processors. The improvements will have to come from elsewhere.

Faster and more powerful chips have enabled the start of whole new industries – smart cars, drones, robots, and virtual reality. But those new industries will not get the same boost during their fledgling years that past electronics-based industries enjoyed. And that has a lot of technology futurists concerned. Nobody is predicting the end of innovation and new industries. But anything new that comes along will not get the boost that we’ve enjoyed for decades from the knowledge that a new technology would improve almost automatically with more powerful processors.


The Anniversary of Fiber Optics

I recently saw an article noting that this month marks the fiftieth anniversary of the 1966 scientific paper by Charles Kao that kicked off the field of fiber optic communications. That paper eventually won him the Nobel Prize in Physics in 2009. He was assisted by George Hockham, a British engineer who was awarded the Rank Prize for Opto-electronics in 1978.

We are so surrounded by fiber optic technology today that it’s easy to forget what a relatively new technology this is. We’ve gone from a theoretical paper to a world covered with fiber optic lines in only fifty years.

As is usual with most modern inventions, Kao and Hockham were not the only ones looking for a way to use lasers for communications. Bell Labs had considered using glass fiber but abandoned the idea due to the huge attenuation they saw in glass – meaning the laser light signal scattered and faded quickly and wouldn’t travel very far. Bell Labs was instead looking at shooting lasers through hollow metal tubes using focused lenses.

The big breakthrough came when Kao and Hockham showed that the attenuation in glass fiber was caused by impurities rather than by the glass itself, and that it could be reduced to less than 20 decibels per kilometer. At that level of attenuation a light signal could travel far enough through the fiber to be useful for communications.
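
To put that 20 dB/km threshold in perspective, here is a quick calculation of how much light survives at that loss rate (the 1 mW launch power is just an assumed figure for illustration; modern single-mode fiber is down around 0.2 dB/km).

```python
# How much light survives 20 dB/km of attenuation.
# The 1 mW launch power is an assumption used only for illustration.

attenuation_db_per_km = 20.0
launch_power_mw = 1.0

for km in (0.5, 1, 2, 5):
    loss_db = attenuation_db_per_km * km
    remaining_mw = launch_power_mw * 10 ** (-loss_db / 10)
    print(f"{km:>4} km: {loss_db:5.1f} dB of loss, {remaining_mw:.2e} mW remaining")
```

At 20 dB/km only about 1% of the light is left after a single kilometer, which is why Kao and Hockham treated that figure as the threshold of practicality.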

It took a decade for the idea to be put to practical use and Corning Glass Works (now Corning Inc.) found ways to lower attenuation even more; they laid the first fiber optic cable in Torino, Italy in 1977.

We didn’t see any widespread use of fiber optics in the U.S. until the early 1980s, when AT&T and a few other companies like the budding MCI began installing fiber as an alternative to copper for long-haul networks.

We’ve come a very long way since the first generation fiber installations. The glass was expensive to manufacture, and so the early fiber cables generally did not contain very many strands of glass. It was not unusual to see 6 and 8 strand fibers being installed.

By today’s standards, the fiber produced from the 1980s into the early 1990s was dreadful stuff. Early fiber cables degraded over time, mostly due to microscopic cracks introduced into the glass during manufacturing and installation. These cracks grew over time and eventually caused the cables to become cloudy and unusable. Early splicing technologies were also a problem, and each splice introduced a significant amount of signal loss into the fiber run. I doubt that there is much, if any, functional fiber remaining from those early days.

But Corning and other companies have continually improved the quality of fiber optic cable, and today’s fiber is light years ahead of the early cables. Splicing technology has also improved, and modern splices introduce very little loss into the transmission path. In fact, there is no good estimate today of how long a properly installed fiber cable might last in the field. It’s possible that fiber installed today might still be functional 75 to 100 years from now. The major issue with the life of fiber today is no longer failure of the glass itself, but rather the damage done to fibers over time by fiber cuts and storm damage.

The speeds achieved in modern fiber optics are incredible. The newly commissioned undersea fiber that Google and others built between Japan and the west coast of the US can pass 60 terabits per second of data. Laser technology has probably improved even faster than fiber glass manufacturing. We’ve gotten to where fiber optic cable is taken for granted as something that is reliable and relatively easy to install and use. We certainly would be having a very different discussion about broadband today had fiber optic cables not improved so quickly over the last several decades.

Decommissioning Copper Lines

The FCC just released new rules in WC Docket 13-3 having to do with the decommissioning of copper lines. These rules apply to all regulated LECs, not just to the large RBOCs. The order also declared that the large telcos are no longer considered dominant carriers.

These rules are needed because AT&T and Verizon have been pestering the Commission for five years to let them tear down copper lines. What has always surprised me about this order is that it was included in the docket looking at the transition of the PSTN from TDM technology to Ethernet. Decommissioning copper has nothing to do with that topic, since copper lines to customers would function the same as today even with an all-IP network between carriers. But the two big telcos flooded the docket with end-user network issues until the FCC finally caved and included the topic.

The order establishes rules that carriers must follow if they want to automatically decommission copper. A carrier must file a plan with the FCC that guarantees that:

  • Network performance, reliability, and coverage are substantially unchanged for customers.
  • Access to 911 and access for people with disabilities must still both meet current rules and standards.
  • There must be guaranteed compatibility with an FCC list of legacy services that includes such things as fire alarms, fax machines, medical monitors, and other devices that might not work on an IP network.

If a carrier can meet all of these requirements then they can file plans for each proposed copper retirement with the FCC. The company then needs to go through a specific notification process with customers.

While the FCC was not quite as explicit with a rule, it also expects that any service replacing copper remain affordable for customers.

If a telco can’t meet any one of the many requirements, then they have to file with the FCC and go through a formal review process to see if the retirement will be approved. The FCC is making it clear that there will be no guaranteed timeline for the manual process.

The main regulatory impact of the rules is that now all telcos have to go through a formal process before tearing down copper. There have been, in the past, many examples of telcos taking down copper with no notification to customers or to regulators. Small telcos that have been installing fiber to customers must take notice of these rules since they now apply to them as well. These rules also mean that a small telco can’t force a customer onto a fiber connection until it has gone through the FCC process.

There is still a lot of concern in rural areas that copper landlines will be taken down with only cellular service offered as the alternative. That may still happen under this process, but it’s likely that those sorts of situations will require the more detailed FCC review process and won’t be allowed automatically.

The dominant carrier issue is interesting. The FCC notes that in some markets traditional copper landlines have dropped nearly to single digit penetration rates. By ending the dominant carrier requirement for the large telcos, the FCC has lowered the regulatory burden on the large companies. For issues like 214 compliance they are now considered the same as smaller telcos. Any FCC rules that were different for dominant versus non-dominant carriers now default to the non-dominant rules. But this ruling does not end any rules that were determined by the difference between price cap and rate-of-return carriers. Those rules remain in place.

A Last Gasp Technology for Copper?

Genesis Technical Systems of Canada has announced an improvement to an existing technology that might breathe some life into rural copper networks. The technology is called DSL rings. It is not entirely new – I can recall seeing it discussed fifteen years ago – but the company has added a twist that improves on the concept.

DSL rings are essentially shared DSL. Currently deployed DSL technology can bond together two pairs of copper and in real-life networks can deliver as much as 50 Mbps. Under the current DSL architecture, the bonded pairs are dedicated to a single home or business. DSL rings instead allow for the bonding of multiple pairs of copper that are then shared among multiple homes. In that bonding process each added pair contributes a little less new bandwidth, so there is a natural limit on the number of copper pairs that can be bonded.

From the neighborhood device in a pedestal, the ring is created by using one copper pair ‘into’ each home and one copper pair ‘out’. This architecture is looped through all of the homes so that they sit on one continuous copper ring. For example, in a neighborhood where ten homes can currently each get 10 Mbps using standard DSL, this technology might create an 80 Mbps pipe that is shared by all ten homes. At peak times when all of the homes are using a lot of bandwidth this might not be much faster than today, but by pooling all of the bandwidth, customers have access to more of it when the network isn’t busy – a single customer could have access to the whole 80 Mbps pipe. The technology is an improvement on traditional DSL and uses the same bandwidth-sharing concept as fiber PONs and cable TV nodes, where customers in a neighborhood share bandwidth rather than each getting a separate pipe.
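
Here is a small sketch of that sharing arithmetic. The 5% bonding penalty per added pair is an assumption chosen only so the numbers line up with the 10-home example above; it is not a figure from Genesis.

```python
# Rough arithmetic behind a shared DSL ring, per the 10-home example above.
# The 5% per-pair bonding penalty is an illustrative assumption.

per_pair_mbps = 10.0      # what one home gets on standard DSL today
bonding_penalty = 0.05    # each added pair contributes ~5% less (assumed)
homes_on_ring = 10

# Total shared pipe: each successive pair adds slightly less new bandwidth.
pipe_mbps = sum(per_pair_mbps * (1 - bonding_penalty) ** n
                for n in range(homes_on_ring))
print(f"Shared ring capacity: ~{pipe_mbps:.0f} Mbps")

# What any one home sees depends on how many neighbors are active.
for active in (1, 3, 10):
    print(f"{active:>2} active homes -> ~{pipe_mbps / active:.0f} Mbps each")
```

With one active home the full ~80 Mbps is available; with all ten busy, each home is back to roughly what it gets today, which is exactly the busy-hour caveat described above.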

The current DSL ring technology wouldn’t do anything useful for today’s rural DSL, since there is not a lot of benefit in bonding together slow connections that are only at 1 or 2 Mbps. But as CAF II is implemented by the big telcos and as faster DSL is built into the rural areas, this idea might make sense.

Genesis Technical Systems’ new twist is that the DSL ring base unit can also act as a DSL regeneration site, meaning it can not only serve the nearby homes but also send bandwidth on to the next DSL ring, starting a new 2–3 mile delivery circle around the next ring in the chain.

The big drawback to that idea is that the second ring is limited to the amount of bandwidth that can be sent to it over the copper, so it won’t have nearly as much available bandwidth as a DSL ring that is fed by fiber. I see that as the big limiting factor. But this might allow for a network with one or two DSL ring ‘hops’ that can reach further out into the rural area with faster DSL, with each subsequent ring getting significantly less bandwidth.

The ideal configuration would be to feed each DSL ring with fiber. But even without considering the cost of building new fiber the technology is not cheap, in the range of $600 to $800 per home added.

There will be other issues to deal with in rural areas. Most rural copper routes are ‘loaded,’ meaning loading coils have been added to the lines to improve voice quality, and this loading would have to be removed to use DSL. In some areas there might not be enough spare copper pairs to make the ring. These days we assume that most homes have abandoned landlines for cellphones, but in rural areas where cellular coverage is bad there are still pockets where most homes have landlines. However, copper pairs could be freed up by converting analog voice to VoIP.

In looking at the technology, I see the most promising use of it in rural towns, like county seats. Neighborhood rings could be created that would upgrade DSL to compete with most current small town cable modem systems. Where customers might today be buying DSL that has speeds up to 6 Mbps or 12 Mbps they might be able to get speeds up to 50 Mbps or 100 Mbps. But the big caveat on this would be that these rings would slow down during the busiest evening hours similar to older cable TV networks. Still, it would be a major DSL upgrade.

It’s an interesting technology, but at best it’s a last gasp for an old copper network. If this technology is used to move DSLAMs closer to rural homes, those homes are going to get a lot more bandwidth than they get today. It looks like in the ideal situation the technology would let customers burst faster than the FCC’s broadband definition of 25 Mbps. But to some degree this extra speed is illusory – during peak times the DSL would probably be significantly slower. My guess is that if one of the big telcos adopts the technology it will report the burst speeds to the FCC rather than the speeds actually achieved during the busy hours of the day. But customers would quickly figure out the difference.

Some Relief for WiFi?

The FCC is currently considering a proposal by Globalstar to open up a fourth, private WiFi channel. It looks like the vote is going to be close, with Commissioners Rosenworcel and Pai saying they oppose the idea.

Globalstar, based in Covington, Louisiana, is a provider of satellite-based telephone systems, but it has been dwarfed in that part of the industry by the much larger Iridium. Globalstar was awarded a swath of spectrum at the high end of the 2.4 GHz band to use for its satellite phones. The Globalstar spectrum sits next to the part of the 2.4 GHz band used for Bluetooth – but there is so little satellite phone usage that interference has never been an issue.

Globalstar made a proposal to make its spectrum available for WiFi, but with the twist that it wants its slice of spectrum to be private and licensed by the company. This differs from the rest of the WiFi spectrum, which is free and open for anybody to use. Globalstar argues that allowing some large users, such as AT&T, to use its spectrum will take a lot of the pressure off of existing WiFi.

There are places today where WiFi interference is noticeable, and it is likely to get worse. Cisco projects that the amount of data carried by WiFi will triple in the next three years – a growth rate 50% greater than data usage overall. There is expected to be a lot of demand put onto WiFi from the Internet of Things. And the cellular companies have a proposal called LTE-U that would let them dip into the WiFi spectrum for cellular data.
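
Those growth figures compound quickly, as the quick calculation below shows. Treating “a growth rate 50% greater” as meaning the WiFi growth rate is 1.5 times the overall rate is my own reading of the Cisco figure.

```python
# Implied annual growth rates from the projection cited above:
# WiFi traffic tripling over the next three years.

years = 3
growth_multiple = 3.0

wifi_cagr = growth_multiple ** (1 / years) - 1
print(f"Implied WiFi annual growth: {wifi_cagr:.1%}")

# Assumed reading of 'a growth rate 50% greater than data usage overall':
overall_cagr = wifi_cagr / 1.5
print(f"Implied overall traffic growth: {overall_cagr:.1%}")
```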

But as might be imagined, there is a lot of opposition to the Globalstar plan. One of the major objections is that this would be a private use of the spectrum while the rest of WiFi is available to everybody. Globalstar could license the band to a handful of companies and give them an advantage over other WiFi users by handing them a largely empty swath of spectrum. Having a few companies pay for privileged access to Globalstar’s spectrum flies in the face of the whole concept of making WiFi available to everybody.

But the primary concern about the idea is that it will cause interference with existing WiFi. The WiFi radios normally used to send and receive data are not very expensive, and they routinely broadcast signals outside of their narrow WiFi channels. This creates a condition called adjacent channel interference, where WiFi interferes with adjacent bands of spectrum. The FCC has handled this by creating buffers around each WiFi channel that allow for the bleed-over signals.

The Globalstar spectrum sits in one of those adjacent buffer zones, and critics say that heavy use of the Globalstar spectrum would then directly interfere with existing WiFi that already bleeds into the Globalstar band. In general it’s never been a good idea to place two heavily used slices of spectrum next to each other without buffers, and the proposal would jam the Globalstar spectrum up against existing WiFi. On the other side of the Globalstar spectrum is the part of the band used by Bluetooth, and again, use of the spectrum would eliminate any buffer.
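
The sketch below lays out the neighborhood numerically. The WiFi channel centers are the standard 2.4 GHz values; the Globalstar band edges (roughly 2483.5–2495 MHz), the Bluetooth band edges, and the 20 MHz channel width are approximations used only to show how thin the remaining buffer is.

```python
# Approximate 2.4 GHz spectrum neighborhood described above.
# Globalstar and Bluetooth band edges and the 20 MHz channel width are
# rough figures used for illustration only.

wifi_centers_mhz = {ch: 2412 + 5 * (ch - 1) for ch in range(1, 12)}  # US channels 1-11
channel_width_mhz = 20
bluetooth_band = (2402.0, 2480.0)     # approximate
globalstar_band = (2483.5, 2495.0)    # approximate

def channel_edges(center, width=channel_width_mhz):
    return center - width / 2, center + width / 2

for ch in (1, 6, 11):
    lo, hi = channel_edges(wifi_centers_mhz[ch])
    gap = globalstar_band[0] - hi
    print(f"WiFi ch {ch:>2}: {lo:.1f}-{hi:.1f} MHz  (gap to Globalstar: {gap:.1f} MHz)")

print(f"Bluetooth: {bluetooth_band[0]:.1f}-{bluetooth_band[1]:.1f} MHz "
      f"(gap to Globalstar: {globalstar_band[0] - bluetooth_band[1]:.1f} MHz)")
```

Under these approximate figures the top WiFi channel and the top of the Bluetooth band both end only a few megahertz below the Globalstar band, which is the buffer the critics are worried about losing.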

The opponents of the idea have been very vocal. They don’t think the FCC should allow the risk that Globalstar will create a clear channel for a few carriers while interfering with everybody else trying to use WiFi. The industry as a whole says this is a losing idea.

The issue has been in front of the FCC for a few years and looks like it will come to a vote soon. Chairman Wheeler is for the Globalstar plan with two other Commissioners already against it. It will be up to the final two commissioners to decide if this is a go or not.


The County Dilemma

Somebody made a comment to me last week that we need more municipal broadband. I certainly agree with the sentiment, but I’ve recently been working with a lot of rural counties, and what I’ve found is that bringing broadband to rural places is a lot harder than it sounds.

In the last year I have analyzed in detail a number of different rural counties. Engineers took a hard look at the cost of bringing broadband to each of these counties and I also created extensive financial models trying to find a way to pay for the broadband solution.

One unsurprising result of these studies is that it’s exceedingly hard to find and fund a permanent broadband solution in rural places. A few of the counties I studied were in the Midwest, where the soil is deep and soft and where buried fiber is as cheap as, or sometimes even cheaper than, getting onto poles. But even with the lowest possible construction costs it can be hard to justify building rural fiber. And most of the country has higher construction costs than rural Iowa or Minnesota.

I’ve also looked at places where the soil is rocky and it is hard and expensive to bury fiber. But some of these places also have a big mess on the poles, making it a challenge to hang fiber. There are many rural pole networks that consist of short poles that need a lot of work, or even replacement, to add fiber. And as I have covered in several blogs, there are often major practical issues with getting access to poles even where it makes sense to do so.

But the number one issue with building rural fiber is getting financing. As it turns out, many rural counties have an exceedingly hard time contributing much financing towards a broadband network.

Citizens who want fiber often say that local governments ought to just suck it up and issue the bonds needed to build fiber. But that sentiment is naïve. Rural counties generally don’t have the borrowing capacity to fully fund a fiber network. I’ve looked at counties recently where the cost of building just the fiber and electronics (which ignores operating losses and the cost of financing) ranged from $20 million to over $100 million. Numbers that large are beyond the ability of most rural counties to finance, even if they have the political will.
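
A back-of-the-envelope debt service calculation shows the scale of the problem. The 4% interest rate and 25-year term below are assumptions chosen purely for illustration, and the payments ignore the operating losses mentioned above.

```python
# Rough annual debt service on a bond issue sized to build a rural fiber
# network. The 4% rate and 25-year term are illustrative assumptions.

def annual_debt_service(principal, rate=0.04, years=25):
    """Level annual payment on a standard amortizing bond issue."""
    return principal * rate / (1 - (1 + rate) ** -years)

for network_cost in (20_000_000, 50_000_000, 100_000_000):
    payment = annual_debt_service(network_cost)
    print(f"${network_cost/1e6:>5.0f}M network -> ~${payment/1e6:.1f}M per year in debt service")
```

Even the smallest of those networks implies well over a million dollars a year in payments before any operating losses, which is a big number against a small rural county budget.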

Rural counties as a whole don’t have a lot of discretionary money. By that, I mean that the revenues they are able to collect are almost entirely needed for the services they are required to provide by law. Counties have a long list of responsibilities. They generally have to maintain extensive road systems and bridges. They generally have to fund a police force and jail system. They have to provide a healthcare system of some sort. Many of them have to provide water and sewer systems to at least some of their constituents. And they have to take care of the daily issues of removing snow, repairing potholes, and all of those things that local governments do for citizens.

Counties everywhere have similar sources of funding. For instance, they collect property taxes, but some significant portion of those taxes is usually earmarked for specific purposes like the school systems. Counties also generally share in the sales taxes collected anywhere in the county, but in rural counties this is a much smaller revenue source than for more urban places. Counties also typically get a significant amount of their funding from the state or federal government, but these funds are usually earmarked for specific purposes as well. And most rural counties don’t collect a lot of taxes from businesses, which are a significant funding source for cities and towns.

I’ve talked to the bond advisors in many rural counties about the possibility of financing fiber. What I’ve generally found is that even if a county borrows up to its credit limit it can’t raise nearly enough to pay for a broadband network. And so many county governments, as much as they might want to find a broadband solution, are not able to contribute much towards paying for one. In many counties, municipal funding is never going to be possible, meaning that broadband networks will need to be funded in some other way.

CAF II Technology Options

There has been a lot of speculation on what technologies the big telcos are going to use to meet their CAF II obligations. They have a tall task in front of them trying to bring at least 10 Mbps broadband to large swaths of rural America.

I know a lot of the areas they are being asked to serve. The typical rural county has some broadband in the county seat – often from both a cable company and from the telco. Businesses in county seats can usually get as much broadband as they want if they can afford the high prices offered in these communities for real broadband.

But the cable TV networks’ service areas usually stop near the city boundaries. And DSL that originates within the county seat doesn’t carry very far into the rural areas. To make matters worse, much of rural America still has older DSL technologies that can deliver only 6 Mbps or 12 Mbps for short distances. It’s not unusual to have a few other pockets of broadband in the typical rural county – there will often be a few subdivisions or other small towns that have DSL and perhaps even cable TV.

However, the vast majority of the physical area in most rural counties is served only by long copper telephone lines, which are usually too far from a DSL hub to get any meaningful DSL. Other than those few subdivisions that have DSL hubs, there is probably little if any fiber running to rural areas. There might be long-haul fiber running through the county, but this fiber was not built to serve local customers.

The CAF II companies are facing the goal of bringing broadband to large copper-only areas that have no existing fiber. The options for technologies that can affordably bring broadband to such areas are limited.

One solution is to build a lot of DSL hubs in the rural areas to bring DSL closer to homes. One advantage of a DSL upgrade is that it uses the existing copper wires to deliver the bandwidth. But DSL on copper won’t carry the 10 Mbps speeds required by CAF II very far, particularly on the older, smaller-gauge copper found in rural networks. So the DSL option requires building a lot of fiber and a whole lot of DSL cabinets. That is expensive, particularly since in many rural areas there might be only a few potential subscribers within reach of a given DSL cabinet.
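
The reason so many cabinets are needed is the steep falloff of DSL speed with copper loop length. The breakpoints in the sketch below are ballpark figures for ADSL2+-class service on decent copper, rough assumptions for illustration rather than engineering specifications.

```python
# Illustrative falloff of DSL speed with copper loop length. The numbers
# are rough ballpark assumptions, not engineering specs, and real results
# depend heavily on copper gauge and condition.

def rough_dsl_speed_mbps(loop_feet):
    if loop_feet <= 3_000:
        return 24
    if loop_feet <= 6_000:
        return 15
    if loop_feet <= 9_000:
        return 8
    if loop_feet <= 12_000:
        return 4
    if loop_feet <= 15_000:
        return 2
    return 0.5

for feet in (3_000, 6_000, 9_000, 12_000, 18_000):
    print(f"{feet:>6} ft of copper -> roughly {rough_dsl_speed_mbps(feet)} Mbps")
```

Under assumptions like these, holding 10 Mbps means keeping every home within roughly a mile and a half of a cabinet, which is exactly why a rural DSL upgrade requires so much new fiber.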

The DSL solution also assumes that the telco has maintained the copper network, and we know from experience that there are many rural areas where maintenance has been neglected for decades. Making DSL work on a degraded and compromised network can be a major challenge. We also know from experience that when you try to cram too many DSL signals into a small-gauge copper cable you get crosstalk interference that degrades the speeds.

One alternative to building fiber to the DSL hubs would be to instead deliver the bandwidth using point-to-point microwave radios. Microwave radios have been around a long time and are reliable. But the technology requires towers of some sort – something the telcos generally don’t own today and something that is often scarce in rural areas. Still, there are certainly many places where a microwave shot is going to be cheaper than building new fiber, even after accounting for the cost of building some towers.

I have talked to a number of engineers on the topic and they think that the telcos are going to have to introduce some point-to-multipoint wireless radios into the network to reach the most remote customers. I’ve looked at maps of many of the CAF II areas and in most of these areas there are numerous pockets of the network where there might only be a half dozen farms or homes in a large service area – and there is no cheap wireline option to upgrade such sparsely populated areas.

There is one other option that I know of – the telcos might just ignore the most remote customers. Once the networks have been built and the CAF II money spent, I’m not sure what recourse the FCC has to make the telcos finish the job. We certainly have a long history of telcos that have skirted regulatory requirements or reneged on promises made to regulators. So I suspect that if the telcos reach some ‘reasonable’ percentage of the people that are supposed to get the CAF upgrade, the FCC will put on its blinders and call it a job well done.

Industry Shorts – July 2016

Here are a few topics I’ve been following that don’t merit a full blog.

Mediacom Announces Upgrade Plans. Mediacom has announced plans to invest over $1 billion to upgrade its networks. The main thrust of the upgrades would be to increase speeds up to a gigabit in the 1,500 communities it serves in 22 states.

It will be interesting to see how they do this. There are many markets where they won’t have to do much more than upgrade to DOCSIS 3.1 and introduce new cable modems for high-bandwidth customers. But a lot of their rural markets will require forklift upgrades involving new headends as well as a revamped coaxial plant. In the worst cases they’d have to replace coaxial cables; in others they would have to replace power taps and line amplifiers.

The company also announced it would open public WiFi hotspots in many of its markets. However, their current WiFi program is pretty weak by industry standards and only gives existing broadband subscribers access to 30 free WiFi minutes per month.

Dish Cuts Back on Ad-Skipping. Dish Network has agreed to largely disable the feature in its DVRs that lets customers skip ads automatically. This had become such a sticking point in negotiations for content that Dish finally agreed to cut back on the very popular feature. Dish reached agreements with Disney and CBS to disable the feature in order to get new programming for Dish’s Sling TV OTT offering.

Google Launches Undersea Cable. Google and Japanese telecoms have built a new undersea cable joining Portland, Seattle, Los Angeles, and San Francisco to two POPs in Japan. The cable can carry 60 terabits of data per second and is now the fastest undersea fiber. Google is also planning to complete a fiber route between Florida and Brazil by the end of the year. Facebook and Microsoft are working together on an undersea connection between Virginia Beach and Bilbao, Spain. With the explosive growth of Internet traffic worldwide this is probably just the beginning of the effort to create the needed connectivity between continents.

It’s interesting to see that some of the big traffic generators on the web are willing to spend money on fiber, and one has to suppose this will save them money in the long term by avoiding transport charges on other fiber routes. It’s probably also not a bad time to own a fiber-laying ship.

UN Declares Broadband Access a Universal Human Right. The United Nations recently passed a series of resolutions that makes online access to the Internet a basic human right. Among the key extracts in the resolutions are:

  • That people have the same rights online as offline, “in particular, freedom of expression, which is applicable regardless of frontiers and through any media of one’s choice.”
  • That human rights violations enacted against people due to making their views known online are “condemned unequivocally,” and states are held accountable for any such violations.
  • Any measures to “intentionally prevent or disrupt access” to the internet are also “condemned unequivocally,” and all states should “refrain from and cease such measures.”

While it’s easy to argue that much of what the UN does has no teeth, it has been the forum since its creation for recognizing human rights.

Netflix Users Would Hate Ads. In a survey with somewhat mixed results, it’s clear that Netflix users have strong feelings about introducing advertising into the popular ad-free service. In the survey, given by All Flicks, 75% of Netflix users said they would dump the service if it started carrying ads.

In a somewhat contradictory finding, the poll indicated that most Netflix users would pay a premium price to avoid ads if that were an option. Nearly 60% of Netflix users said they would pay $1 per month more to avoid ads, with many others saying they would pay even more.