The Fight Over Wireless Pole Attachments

All around the country there are fights going on between pole owners, governments, and wireless carriers over pole attachments and related issues for small cell deployment. Small cells are the first new technology driving demand for non-traditional attachments, and they will soon be followed by a proliferation of companies wanting to hang devices for millimeter wave radios and wireless local loops. The fights cover a wide range of issues:

Safety. Most current pole rules were created to keep it safe for technicians to work on poles, particularly during bad weather. Some of the devices that carriers now want to hang on poles are not small. Some are the size of dorm refrigerators or even a bit larger. And these devices are connected to live electric wires. Adding such devices to poles can make it significantly harder for a technician trying to restore power during a rain or snow storm. Just maneuvering around such devices can be a major safety concern even in good weather.

New Poles / Taller Poles. There are reports of wireless carriers asking to install new poles as tall as 120 feet in city rights-of-way. For network deployments that include wireless backhaul it’s vital that each small cell or other device has a clear line-of-sight to other devices in the network – and being higher in the air makes those connections possible.

In most towns the poles are no taller than 60 feet and often shorter. Taller poles create a whole new set of problems. They might mean a whole new level of tree trimming or even eliminating taller trees – and many communities take great pride in their trees. And these new poles will need power, meaning stringing more wires in the air, which can detract from the aesthetics of a residential neighborhood and create more issues with downed power lines and more trees to keep trimmed.

This also raises the issue of the long-term impact of such new poles. Many cities have moved other utilities underground or have multi-year programs to migrate existing utilities underground. These new wireless-only poles also require a power feed, and at least some of them require a fiber feed. Can a carrier require a wireless pole/tower in a neighborhood where everything else is already underground? Can they insist that their poles be left standing during future conversions of neighborhoods to underground utilities?

There is also the issue of sharing such new poles. Cities fear that they will be swamped with requests for new poles from companies wanting to deploy wireless technologies. It’s not hard to picture an NFL city having a dozen different companies wanting to deploy wireless devices – and easy to picture this resulting in chaos, a proliferation of new poles on the same streets, and numerous new electric lines to connect all of the new devices.

Right to Say No. Cities largely want the right to decide what goes in their rights-of-way. This has often manifested as requirements that anybody who wants access to rights-of-way get some sort of franchise. It also has meant the development of local ordinances that define the whole process of using rights-of-way, from the permitting process through installation techniques. But the carriers are currently lobbying at the state level and at the FCC for uniform rules that would apply everywhere. If the FCC or a state passes blanket rules, many cities are likely to challenge them in court.

Fees for Attachments. The carriers are also lobbying heavily to define the fee structure for these new kinds of attachments. Compensation has always been an issue, and my guess is that at some point the FCC will step in here in the same manner they did in the past with other pole attachments.

General Irony. I find it ironic that AT&T is leading the battle to get good terms for attaching wireless devices. AT&T has been the primary entity fighting hard to keep Google off AT&T poles, and now it wants the right to force its way onto poles owned by others. But if we have ever learned any lesson in the regulatory world, it’s that big companies have no problem arguing both sides of the same issue when it suits their purposes.

Finally, Unlimited Cellular Data

In a virtual blink of an eye all of the cellular companies are now offering unlimited data. This is a watershed event for the cellular industry and probably marks the start of the slide of cellular data into a commodity, much as happened in the past with cellular voice and texting.

Up until now, US cellular data has been the most expensive broadband in the world. The carriers have been selling a gigabyte of download for $10. There are numerous ISPs that will let people download a terabyte of data for between $60 and $120 per month, which makes cellular data roughly 80 to 170 times more expensive than landline data.
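For anybody who wants to check that multiple, here is the back-of-the-envelope arithmetic using the price points above (a quick sketch that treats a terabyte as 1,000 gigabytes):

```python
# Back-of-the-envelope check of the cellular vs. landline price gap.
cellular_per_gb = 10.0                    # dollars per gigabyte at metered cellular rates
cellular_per_tb = cellular_per_gb * 1000  # -> $10,000 to download a terabyte

for landline_per_tb in (60.0, 120.0):     # dollars for a 1 TB landline allowance
    ratio = cellular_per_tb / landline_per_tb
    print(f"${landline_per_tb:.0f}/TB landline -> cellular is {ratio:.0f}x more expensive")
# $60/TB  -> ~167x more expensive
# $120/TB -> ~83x more expensive
```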

It’s really impossible to blame this gigantic pricing difference on anything other than greed. Looking back five years, the cellular companies claimed the high prices were due mostly to a desire to protect their cell sites from being swamped with data usage. Perhaps back when cell sites used traditional TDM backhaul (mostly DS3s) there was some truth to this. But today most cell sites have fiber Ethernet backhaul of gigabit or greater capacity.

This change seemed inevitable. The cellular companies have all started offering zero-rated plans where they offer some content (often their own) on an unlimited basis while still counting other content against their stingy data caps. That stark contrast pointed out the hypocrisy of their pricing. And while the current FCC is backing away from enforcing these kinds of net neutrality issues – the price contrast is so large that it might have brought eventual scrutiny from Congress during any Telecom Act re-write.

Possibly the biggest impact of this change is that it’s going to change how people use cellphones and other mobile devices like tablets and laptops. Cellphone data speeds in the US are not the best in the world, but they are good enough in most places to be able to watch a single video stream. It’s not much of a stretch of the imagination to foresee wide usage of apps that will use cellphones to capture and transfer video images to televisions. And that could make cellular data an economic substitute for landline broadband.

There is already a lot of talk about younger people preferring cellular data to landline data – mostly due to the mobility aspect. Until now that has largely meant that cell phone users had to stay close to WiFi most of the time in order to avoid large cellular bills. But unlimited data plans will free users to go anywhere there is a strong enough cell signal to get the connection speeds they need. Over time this could lead to an erosion of landline broadband connections, as households with unlimited cellular data find it to be good enough.

The upside to this, though, is that poorer households might finally get better access to the Internet. Until now, unless a user was able to sit at home behind a landline WiFi signal, cellular data has been too expensive to use in urban areas for things like homework. Since most people now see a cellphone as mandatory to daily life, poorer households will probably be able to get by with only a cellular data plan. This might be the last nail in the coffin for urban DSL.

This change will really make a difference in rural America. I have heard from many rural households that use their cellphone data plan for their household broadband and it’s not unusual to hear of families with schoolkids spending $500 or more per month for totally inadequate cellular broadband. These households are going to be relieved to be able to buy a $100 unlimited plan instead.

Of course, in a lot of rural America there is not the same kind of cellular service that those in cities take for granted. There are very few rural places that have more than one major cellphone carrier with decent signal. And there are a lot of rural households that live too far from a cell tower to get decent cellular speeds. But unlimited plans will probably be a good band-aid to cover for the lack of affordable broadband for millions of rural homes. It won’t be too many years when this won’t be enough speed, but for today cellular broadband is a whole lot better than no broadband.

We’ll have to wait a bit to see if these plans really are unlimited, and what it means if they aren’t. Obviously the plans will be a lot less useful if they somehow preclude tethering. But however they are priced, we are probably not going backwards to the day when your $60 cell plan includes 2 gigabytes of download with every additional gigabyte costing an additional $10. Teens ten years from now will think anybody who remembers being careful how we used our cellphones is an ancient dinosaur!

Lidar

There has been a mountain of articles about self-driving cars, but little discussion about how they see the world around them. The ability of computers to understand images is still in its infancy – a few years ago there was a lot of talk about how Google taught an AI program to recognize cats in videos.

But obviously a self-driving car has to do a lot better than just ‘seeing’ around it – it needs to paint a 3D picture of everything around it in order to navigate correctly and to avoid problems. It turns out that the primary tool used by self-driving cars is called “Lidar.” Lidar stands for ‘light detection and ranging’ and fits neatly between sonar and radar.

Lidar works by sending out light beams and measuring how long it takes for reflected signals to return, much the same way that a bat sees the world using sonar. Sonar would be fairly useless in a self-driving car since sound waves get distorted in air and only paint an accurate picture for perhaps a dozen feet from the transmitter. That’s great for a bat catching a moth, but not useful for seeing oncoming traffic.
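To make the ranging idea concrete, here is a minimal sketch of the time-of-flight math (the echo time below is made up for illustration):

```python
# Time-of-flight ranging: distance is half the round trip, because the
# light pulse travels out to the object and back.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters to whatever reflected the pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# An echo that returns after ~200 nanoseconds puts the object about 30 meters away.
print(distance_from_echo(200e-9))  # ~29.98
```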

And the radio waves used in radar won’t really work well for self-driving cars. Radar works great for seeing objects far away, like metallic airplanes. But the radio waves pass through many objects (like people) meaning that radar doesn’t create a total picture of the world around it. And radar has problems creating an accurate picture of anything closer than 100 feet.

And that’s where lidar comes in. A lidar device works much like a big radar dish at an airport. It rotates and sends out light signals (actually infrared light signals) and then collects and analyzes the returning echoes to create a picture of the distances to objects around it. Lidar only became practical with modern computer chips which allow the transmitter to ‘rotate’ hundreds of times a second and which possess enough computing power to make sense of the echoed light waves.
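As a rough sketch of how one rotation’s echoes become a picture of distances, here is how a single horizontal sweep might be turned into a 2D slice of a point cloud (the bearings and echo times are hypothetical, and real units process enormously more returns per rotation):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def sweep_to_points(echoes):
    """Convert (bearing_degrees, round_trip_seconds) pairs into (x, y) points.

    Each echo time becomes a one-way range via time-of-flight, then the
    range is projected onto the sweep plane using the pulse's bearing.
    """
    points = []
    for bearing_deg, round_trip in echoes:
        rng = SPEED_OF_LIGHT * round_trip / 2
        theta = math.radians(bearing_deg)
        points.append((rng * math.cos(theta), rng * math.sin(theta)))
    return points

# Three made-up echoes from a single rotation of the transmitter.
for point in sweep_to_points([(0.0, 100e-9), (90.0, 200e-9), (180.0, 66e-9)]):
    print(point)
```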

And so a self-driving car doesn’t ‘see’ at all. The cars do not rely on standard cameras that try to make sense of the reflected ambient light around the car. The first prototypes of driverless cars tried to do this and could not process or make sense of images fast enough. Instead, a self-driving car sends out laser light at a specific frequency and then calculates the distance the light travels in every direction to create a picture of the world.

If you want to understand more about what this looks like, consider this Radiohead music video. Most of the images in the video were created with lidar. Don’t pay too much attention to the opening headshots because those are somewhat distorted for artistic effect. But the later images of streets show you the detail of a lidar image. Unlike the normal images our eyes see, a lidar image is massively more detailed in that the distance to everything in the picture is known. Our eyeballs basically see in 2D and we use the images from two eyes to simulate 3D. But a lidar image is fully 3D and gets full perspective from one transmitter.

Lidar does have limitations. It can be ‘blinded’ by heavy snow and rain. It could be jammed by somebody transmitting a bright signal using the same light frequencies. And so smart cars don’t rely 100% on lidar, but also use traditional cameras and ultrasonic sonar to complement the lidar images.

Lidar is finding other uses. It’s being used, for example, in helicopters to search for things on the ground. A lidar system can spot a fleeing criminal or a lost child in the woods far more easily than older technologies or human eyeballs. Lidar can also create amazingly detailed images of anything. Archeologists are using it to create permanent images of dig sites during various stages of excavation before objects are removed. It’s not hard to imagine that within a few years many traditional surveying techniques will be obsolete and lidar will be able to locate and plot everything on a building lot, for example, down to the millimeter.

Putting the Lifeline Program on Hold

Earlier this month the FCC under new Chairman Ajit Pai reversed the prior FCC approval of nine Lifeline providers who had been granted the ability to provide either wireline or wireless Lifeline broadband service. The Lifeline program grants a subsidy of $9.25 per month for low-income customers.

These were the first nine companies that had filed for the new Lifeline Broadband Provider designation, which extends the subsidy to broadband connections. For 32 years the Lifeline program has provided this same subsidy for telephone service, but last year the program was also extended to data services – with the caveat that a given household is only eligible for one monthly subsidy.

The nine providers are Spot On, Boomerang Wireless, KonaTel, FreedomPop, AR Designs, Kajeet, Liberty, Northland Cable, and Wabash Independent Networks. Four of the providers had obtained their new Lifeline status on December 1, with the others being granted in January. Boomerang Wireless had already started to serve Lifeline-eligible customers, and the FCC ordered them to notify their customers and to cancel all Lifeline subsidies within 60 days of the new order.

The stated reason for the reversals was that the FCC wanted to “promote program integrity by providing the Bureau with additional time to consider measures that might be necessary to prevent further waste, fraud, and abuse in the Lifeline program.” None of these companies has been accused of fraud; they were simply the first nine companies to be granted the status of Lifeline Broadband Provider with the ability to sell a subsidized data product.

The fraud issue is an interesting one because the FCC had already overhauled the Lifeline processes to protect against fraud. For years carriers were allowed to self-certify that customers met at least one of several qualifications that made them eligible for Lifeline. But the FCC eliminated self-certification by publishing a national list of eligible customers – a list provided and updated by the federal agencies that oversee the qualifying programs.

The FCC had also done compliance audits over the last several years looking for Lifeline fraud and didn’t find much of it. The new FCC order cited a $30 million settlement from Total Call Mobile, which had been found to be seeking reimbursement for duplicate and ineligible customers. But the vast majority of Lifeline providers were found to have few or no issues.

Customers may have other options because the 800 carriers that already provide a Lifeline voice subsidy are now also allowed to provide a data subsidy. But nobody knows how many of these existing providers plan to offer subsidized data, and in fact over 80 Lifeline-eligible carriers recently asked to be excused from the program. This includes most of the biggest carriers in the country including AT&T, Verizon, CenturyLink, Charter, Cox, Frontier, Fairpoint, Windstream and Cincinnati Bell. There were also a lot of wireless carriers asking to be excused from the program.

It’s possible that politics has something to do with this order. The FCC under past Chairman Wheeler had reset the Lifeline program’s annual budget to $2.25 billion a year, indexed to inflation. There are Republicans in Congress who have called for the program to be capped instead at $1.75 billion annually. Stopping these new providers is one way to stop the program from growing. One would think that the withdrawal of the biggest carriers from the program will also greatly shrink the fund.

The most interesting thing about this order to me is that it seems to conflict with statements made by new Chairman Pai. On his first day as Chairman he addressed FCC employees and told them that one of his top goals was to bring broadband to all Americans. But this reversal of Lifeline status came just three days later and seems contrary to that goal.

It’s certainly possible that after more internal review these companies might still be granted Lifeline status. But this might instead be an indicator that the new Chairman wants to curb the Lifeline program, or maybe even eliminate it. I guess we are going to have to wait a while to see what this all means, including what becomes of the Chairman’s statements about expanding broadband to all.

The Limitations of Cellular Data

It’s hard these days to find anybody who is satisfied with the quality of data received over cellphones. A research report published by Apteligent late last year showed that the US placed 10th in the world in overall cellular network performance, measured by the combination of reliability and speed. We all know that cellphone data is sometimes adequate but can suddenly deteriorate to where you can’t load simple web pages. There are a number of factors baked into the cellular architecture that contribute to data performance. Following are a few of the key factors:

Data Power Drop-off. Cellular networks, by design, assume a massive drop-off of data speeds with distance. I don’t think most people understand how drastic the power curve is. Cellular companies show us bars to indicate the strength of our connections – but these bars are not telling us the true story. The cellular architecture has a 100:1 data rate ratio from the cell tower to the edge of the delivery area (generally a few miles). As an example, this means that if a cell site is designed to deliver 10 Mbps at the cell tower, it will deliver only 1 Mbps at the mid-point of the cell tower’s range and only 0.1 Mbps at the edge.
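One simple curve consistent with those numbers is an exponential drop-off, sketched below (the 10 Mbps starting rate and 3-mile radius are just the example figures from the paragraph above, not a model of any particular network):

```python
# Idealized drop-off matching the paragraph's numbers: the data rate falls by
# 10x at half the cell radius and 100x at the edge, i.e.
#   rate(d) = rate_at_tower / 100 ** (d / radius)
def data_rate(rate_at_tower_mbps: float, distance_miles: float, radius_miles: float) -> float:
    return rate_at_tower_mbps / 100 ** (distance_miles / radius_miles)

for d in (0.0, 1.5, 3.0):  # tower, mid-point, and edge of a 3-mile cell
    print(f"{d:.1f} miles: {data_rate(10.0, d, 3.0):.1f} Mbps")
# 0.0 miles: 10.0 Mbps, 1.5 miles: 1.0 Mbps, 3.0 miles: 0.1 Mbps
```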

Shape of the Cellular Footprint. It’s easy to think that there are nice concentric circles of cellphone signal propagating around cell towers. But nothing could be farther from the truth. If you go around any cell site and measure and plot the strength of signals, you will find that the footprint of a cell tower looks a lot more like an amoeba, with the signal in some directions traveling a relatively short distance while in others it might travel much farther. If these footprints were static then engineers could design around the vagaries of a given cell site. But the footprint can change quite dramatically according to temperature, humidity and even the number of users concentrated in one portion of the footprint. This is why the delivery of broadcast wireless services is always going to be more art than science, because the delivery footprint is constantly shifting, in many cases dramatically.

Proliferation of Antennas. Modern cellular networks have improved performance by significantly increasing the number of transmitting antennas on a cell tower (and also more receiving antennas in cell phones). This use of MIMO (multiple-input, multiple-output) has produced a significant improvement for customers who are able to get a simultaneous signal from more than one transmitter. But there are two consequences of MIMO that actually decrease performance for some users. First, MIMO largely benefits those closest to the cell tower, which means there are fewer quality connections available for those farther away. Second, MIMO works best using cellular channels that are not adjacent. During times of heavy cellular usage this improves the signal strength in the MIMO channels but decreases the strength of the other channels, again reducing quality for customers who grab the weaker channels.

Impaired Handoffs. Mobility is enabled in a cellular network when a customer is handed off from one cell site to the next while traveling. MIMO and other techniques that increase the signal to a given customer also make it more difficult for that customer to be handed to the next cell site. Handoffs were relatively error-free when customers received a single-channel signal from one transmitter, but now the quality of handoffs from one cell site to another can vary dramatically, resulting in more disconnects or drastic swings in the strength of connections.

Small-Cell Issues. All of the above issues will be compounded by the introduction of small cells into the cellular network. In today’s cellular architecture a customer can only be handled by one cell site at a time. Cellular networks don’t automatically assign the strongest connection to a customer, but rather the nearest available one. While small cells will increase the opportunity to get a signal in a crowded environment, they also increase the chance of getting a poor connection, or of running into handoff issues for mobile customers.

2D Signal Propagation. Cell tower antennas are largely aimed to transmit close to the ground and do not waste signals by sending signals upwards in a 3D pattern. Anybody who has traveled to a big city and received poor signal on an upper floor of a tall hotel is familiar with this issue. The cellular signals are focused towards street level and not towards anybody higher. That’s not to say that you can’t get a cellular connection at the top of a highrise, or even in an airplane, but the vast majority of the connections (and the strongest connections) are aimed downward.

Crisis Propagation. Cell towers are arranged as an interconnected mesh. When something drastic happens to a given cell tower, such as losing power or being swamped with calls during an emergency, this not only shuts down the tower with a problem, but the problem cascades to nearby towers, often taking them out of service as well. This is similar to a rolling blackout in an electric grid. Carriers have been working on load balancing techniques to try to tamp down this problem, but it’s still relatively easy for a given cell tower to get overwhelmed and start a neighborhood and even regional cascade.
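The cascade dynamic is easy to see in a toy model. The sketch below is purely illustrative (the loads, the capacity, and even the linear arrangement of towers are invented for the example), but it shows how one failed site can topple its neighbors:

```python
# Toy cascade model: when a tower fails, its load shifts to its neighbors;
# any neighbor pushed past capacity fails as well, and the process repeats.
def simulate_cascade(loads, capacity, first_failure):
    failed = {first_failure}
    frontier = [first_failure]
    while frontier:
        tower = frontier.pop()
        neighbors = [i for i in (tower - 1, tower + 1)
                     if 0 <= i < len(loads) and i not in failed]
        if not neighbors:
            continue
        share = loads[tower] / len(neighbors)  # displaced traffic splits evenly
        for n in neighbors:
            loads[n] += share
            if loads[n] > capacity:
                failed.add(n)
                frontier.append(n)
    return sorted(failed)

# Five towers each carrying 70 units of traffic against a capacity of 100;
# tower 2 fails and the displaced traffic overwhelms the entire row.
print(simulate_cascade([70, 70, 70, 70, 70], capacity=100, first_failure=2))
# -> [0, 1, 2, 3, 4]
```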

These issues all outline how complicated it is to design a great cellphone network. They are worsened by the fact that US cell sites were largely placed years ago to accommodate voice traffic and thus are not situated to provide optimum data coverage. But even a brand new cellular network designed to optimize data traffic would run into many of these same issues. It’s nearly impossible to design a cellular network that can handle all of the issues encountered in the real world. This makes me glad I’m not a cellular engineer.

Amazon as an ISP?

I mentioned in a blog last week that there is a rumor that Amazon is considering becoming an ISP. This information came from The Information, which says it got this from somebody inside Amazon management.

It’s an intriguing idea. Amazon has shown throughout its history that it loves to own its supply chain. If you recall, Amazon started out as a web reseller of books. But over time the company has built what must certainly be the largest and most efficient brick-and-mortar fulfillment infrastructure in the world.

And the company hasn’t stopped there. It has been building a fleet of semi-trailers to haul its inventory, bypassing UPS and the Post Office. The company uses third-party tractors today, but its goal is to build out the fleet in anticipation of self-driving trucks within the coming decade. It is also experimenting with drones, wheeled robots and other ways to bypass local delivery services.

The company has done the same with its successful data center business. It has built massive data centers and assembled a dark-fiber network to connect them together and to connect to major customers. And it is that fiber network that could form the backbone of an ISP network.

You have to think that Amazon learned a lesson from Google Fiber’s foray into FTTP, and so it seems unlikely that they would leap into a massive infrastructure build in that same mold. The article says Amazon might consider using the open access networks in Europe as a way to avoid building fiber. But they don’t have to go all the way to Europe to try this. For example, just across the mountains from Seattle are a number of Public Utility Districts (county-wide municipal electric companies) that have built open access fiber networks passing over 100,000 homes – an easy way for Amazon to test the ISP idea.

And around the country are a number of other open access networks. All of the municipal networks in states like Colorado, Utah and Virginia are required by law to be open access. We have the example of Huntsville, AL, which built an FTTP network for Google that will become open access after a few years. There are numerous communities around the country that would gladly build fiber networks if they were guaranteed to get companies like Amazon and Google as major ISP tenants. It’s been my experience that almost no city wants to be an ISP unless it has no other option – but there are many that want fiber badly and would welcome Amazon with open arms.

I would think that Amazon will also keep an eye on developments with wireless last mile. There might come a time when the company can leap into the ISP business at a reasonable cost per customer – at least in selected markets.

Amazon would be an interesting ISP. It was just a few years ago that it was clear that an ISP needed a traditional cable TV product to be successful. Google tried to launch without cable TV in Kansas City and hit a brick wall in selling to residential customers. But the tide is turning and I’m not sure that TV is mandatory any longer.

Amazon already has an impressive content platform with Amazon Prime and they have said that they are going to spend billions to create their own content, following the lead of Netflix. It’s also becoming clear that customers are becoming willing to accept an abbreviated line-up of popular cable channels like what’s being sold by Sling TV and other OTT providers. Amazon could be competitive with an abbreviated cable line-up made up of local programming, popular cable channels and its own content.

But Amazon has some advantages that other ISPs don’t have. For now Amazon is leading the pack in the intelligent personal assistant market with its Amazon Echo. I’ve had an Echo for about six months and I can already tell that it is improving. The company is working towards introducing cloud-based AI to the platform, and within a few years the Alexa assistant could become the kind of true computer assistant that science fiction has envisioned for decades.

My gut tells me that bundles which focus on smart computer services like Alexa will soon be more popular than the traditional triple-play bundles from Comcast and AT&T. Amazon has one huge advantage as a start-up ISP in that customers like using them – something they have fostered by delivering packages regularly on time to a huge percentage of households in the country. They are at the opposite end of the customer service scale from Comcast and the other big ISPs.

I have no idea if this rumor is true. But the idea is so intriguing that I hope Amazon is considering it. One of the major complaints about broadband in this country is the lack of competition and choice. Companies like Amazon can bring fresh competitive bundles that break away from the traditional triple play and that can redefine the ISP of the future.

Update: This rumor persisted, and in February 2017 I posted a follow-up: https://potsandpansbyccg.com/2017/02/21/amazon-as-an-isp-2/

The Transition to IP Telephony

AT&T reported to the FCC on the progress of its transition of customers from a traditional TDM network to an all-IP network. AT&T had undertaken two trials of such a conversion, in Carbon Hill, AL and Delray Beach, FL.

These were voluntary trials. AT&T had advertised widely and asked customers to move to the new IP-based services. In Carbon Hill 36% of residents and 28% of businesses voluntarily moved to the new service. In Delray Beach the numbers were similar with 38% and 25% converting. AT&T reported there were no reports of degraded service, including the transition of business customers to IP-based Centrex and similar services.

Since the trials were announced AT&T has also grandfathered Centrex and TV1-Analog Video service, meaning they will take no new orders for the services. The company also asked the FCC’s permission to discontinue 13 legacy services that are obsolete. This includes products that most people have never heard of, like 4-wire and voice-grade telemetry and various alarm bridging services. The company also has asked permission to discontinue six operator services including collect calling, person-to-person calling, billed to third party, busy line verification, busy line interrupt and international directory assistance.

These trials need to be put into perspective. From a technical perspective there is no reason to think that transitioning these services from TDM to IP-based technology wouldn’t work, because much of the rest of the telephony world made that transition years ago. Cable companies like Comcast, and anybody operating an all-fiber network, have been offering IP-based telephone products for many years. AT&T’s offerings include many products that are strictly copper-based, such as the legacy products it wants to discontinue.

And that leads to the whole purpose behind these trials. AT&T wants to move customers off old copper networks to either a landline or wireless IP-based solution. Since the company’s goal is to tear down copper, the vast majority of such transitions will be to the company’s cellular network. A minuscule percentage of AT&T’s customers are on fiber – particularly residential customers, since the company has launched very little FTTP in that market.

The trials are largely the result of what happened to Verizon on Fire Island a few years ago after Hurricane Sandy. There Verizon didn’t replace destroyed copper but moved people to a cellular-based service. But unlike these trials, which were meticulously slow and careful, it seems that in many of the Fire Island cases Verizon did not offer equivalent services to what they had offered before the hurricane. Apparently things like burglar alarms, medical monitoring devices, and other services didn’t work on the new wireless connections.

The FCC has already granted these big telcos the ability to tear down copper as long as they follow customer notification processes. My guess is that after these trials are blessed by the FCC that the companies will begin ripping down rural copper all over the country.

I expect that many customers are going to be unhappy when they lose their copper. Anybody who has traveled in rural areas understands that cellular coverage is often spotty, or even non-existent. Customers are worried about being cut off from telephony services inside their homes. It’s a legitimate concern for somebody with poor cellular service and with little or no broadband options, like we see in millions of rural homes and businesses.

But the time is coming soon when these transitions will not be voluntary like they were in these two communities. The big telcos will issue the legally required notices, and then they will proceed to shut off and tear down the copper. In doing so they will have undone the FCC’s original goal, set by the Communications Act of 1934, which was to make telephone service available everywhere. There are now going to be homes and communities that are cut off from a workable alternative for making reliable voice calls.

I honestly never thought I’d see this happen. But I guess it was the pretty obvious end game after it became clear decades ago that the big telcos were not going to properly maintain their rural copper networks. We aren’t too far from the day when copper telephone networks join the list of other technologies that outlived their usefulness and are a thing of the past – at least for the giant telcos. There are still other companies like Frontier and Windstream that are fighting to extend the life of their copper, but we’ll have to see what the future holds for them and their customers.

Who Will Win the Telecom Battle?

Now that Google has pulled back from expanding Google Fiber, it’s easy to see that the cable companies and telcos think they have won the broadband war. But if you look a little closer, this might not really be the case.

Tech companies like Google, Facebook and Amazon are still focused on making sure that people have enough bandwidth to take advantage of the many products these giant companies offer or plan to offer in the future. And all three companies are growing in importance as content providers.

Consider first the strength of these companies as content providers. Google owns YouTube which is becoming the most important video destination for the younger generation – and those kids are growing up. We’ve seen young millennial households largely reject traditional cable TV offerings. While Amazon Prime is not nearly as big as Netflix it is a strong second and is continuing to grow. Amazon is also reported to be pouring big money into producing original content for its platform. Facebook is on a trajectory to become the preferred source of news and information. And their Facebook Live is also quickly becoming a huge content platform.

But content isn’t everything. Consider that these companies have amassed enormous private fiber networks. Google doesn’t talk about its network, but way back in 2013 it was reported that Google had assembled 100,000 miles of dark fiber. Amazon and Facebook have also built large private networks, though we don’t know their size. We know that Google and Facebook have partnered to build a massive undersea fiber to China and are looking at other undersea fiber routes. Amazon has built a huge network to support its cloud services business. It would not be surprising if these companies together have already amassed a larger fiber network than the telcos and cable companies. If they are not bigger yet, they are on a trajectory to get there soon. With these networks the tech companies could hurt the big ISPs where it hurts most – by taking a huge bite out of their special access and transport businesses.

These companies are also not done with the ISP business. Google Fiber has pulled back from expanding FTTH networks for now, but it acquired Webpass and is looking to expand as an ISP using wireless last mile. And we saw in Huntsville that Google is not afraid to use somebody else’s fiber network – something we have never seen any of the telcos or cable companies consider. It would not be surprising to see Google make deals with other private networks to expand its ISP business without spending the upfront capital. But perhaps Google’s biggest foray into providing data services is Google Fi, its cellular service that routes traffic over WiFi first and the cellular network second. It’s been rumored that Google is looking for partnerships to expand WiFi access in many markets. And it’s been reported that Amazon is strongly considering becoming an ISP. I’ve not heard any details about how they might do this, but the company has shown the ability to succeed at everything it’s tackled – so it’s an intriguing possibility.

It’s a gigantic task to take on companies like AT&T and Comcast head on. I think Google Fiber learned this the hard way. But at the end of the day content is still king. As these companies continue to grow in influence as content providers they present a real challenge to traditional programmers. But they are also a growing threat to the big ISPs. If these tech companies decide that their best strategy is to deliver their content directly to subscribers, they have a big enough market position to pull along a huge number of customers. It’s clear that consumers like these tech companies far more than they like the big ISPs, and in the end the animus customers have built up toward the big ISPs might be those ISPs’ undoing.

This kind of industry shift won’t happen overnight. But it’s already quietly going on behind the scenes. We may not be as far as you might imagine from the day when these companies provide more content than the traditional programmers and also carry more bandwidth on their own networks than the big ISPs. From my perspective that looks a lot like winning the battle.

Time for a New Telecom Act, Part 2

Yesterday’s blog postulated that we would see a new telecom act this year from Congress. That blog looked at what was accomplished by the last Telecommunications Act of 1996. Today I’m looking ahead at the issues that a new Act needs to address.

Last week we learned more about how the process will probably work. A new telecom act would likely be spearheaded by the House Energy and Commerce Subcommittee on Communications and Technology. Last week Rep. Marsha Blackburn, who chairs that subcommittee, told the press that she favored giving the new FCC a shot at fixing the things under its purview before the House would tackle a new Act. The FCC doesn’t have the authority to make many of the needed changes in telecom regulation, but it does have considerable power. In any case, this probably means a new act is at least a year away.

Here are some of the things that I think the FCC and Congress need to address to modernize telecom:

Need for More Spectrum. It’s becoming clear that a lot of big ISPs are thinking of deploying 5G and various other millimeter wave technologies. The FCC needs to continue to open up more spectrum for broadband. There is still a lot of spectrum reserved for government use, and there need to be more attempts to share frequencies when possible. There also needs to be a fresh look at how frequency is used. Historically, many frequency bands had narrow channels aimed at accommodating voice traffic or a single channel of television. From an engineering perspective we can get a lot more out of spectrum if we can make wider channels in the bands that are already in use, as the sketch after this paragraph shows.
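One way to see why channel width matters so much: under the Shannon limit, an ideal channel’s capacity grows linearly with its bandwidth but only logarithmically with signal power, so widening channels is the more effective lever. A minimal illustration (the bandwidth and SNR figures here are made up for the example):

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_linear: float) -> float:
    """Ideal capacity C = B * log2(1 + SNR); with B in MHz the result is in Mbps."""
    return bandwidth_mhz * math.log2(1 + snr_linear)

# Doubling a 10 MHz channel doubles capacity; doubling signal power barely helps.
print(shannon_capacity_mbps(10, 100))  # ~66.6 Mbps
print(shannon_capacity_mbps(20, 100))  # ~133.2 Mbps (2x bandwidth -> 2x capacity)
print(shannon_capacity_mbps(10, 200))  # ~76.5 Mbps (2x power -> only ~15% more)
```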

Tackling Cybersecurity. 2016 was a year when security breaches led the industry news weekly. There is no easy fix for security issues, but there are big steps that can be taken. For example, we are flooding the world with IoT devices that are easily hacked and which can now be used to launch coordinated denial of service attacks. With Congressional backing the FCC could create standards to make IoT devices more secure. The government will never make us free from hacking, but there are a lot of sensible standards and fixes needed for IoT devices.

Expanding Access to Fast Broadband. As somebody who works regularly in rural America I know that lack of broadband there is now one of the biggest problems identified by rural households. We need to find ways to get good broadband to more places, and we have to do this smartly by building infrastructure that will last for decades. We’ve already seen how not to do this with the CAF II program that is being used to expand DSL and LTE wireless – two technologies that are already inadequate today.

Unless fiber is built everywhere, this is going to be an ongoing major issue. For example, if we fix broadband for those that have none but ignore the bigger swathe of the country that has only marginally acceptable broadband today, we will be back in a decade looking at how to fix broadband in those places.

We also need rules that unleash anybody willing to spend money on fiber. I see numerous rural counties and towns that are ready to spring for bond issues to get fiber. We need rules that allow anybody willing to invest in fiber to do so – be that local governments, electric cooperatives, rural telcos or anybody else.

Infrastructure Issues. There are still a lot of infrastructure roadblocks to deploying fiber. We have never done a good job of fulfilling the mandate from the 1996 Act to provide access to poles and conduit. And we are now looking at deploying a fiber-fed wireless network that is going to mean bringing both fiber and power to buildings, rooftops, poles and other infrastructure. We need to find a way to get this done without also trampling over the legitimate concerns of local jurisdictions. For example, the FCC can’t just demand that cities allow free and quick fiber construction if that means digging up newly paved streets or overburdening poles – we need to find rules that work. And we need to do a much better job of this than we have done so far.

Programming. It’s now clear that online video content is a competitive alternative to traditional cable TV. We need rules that unleash cable companies and anybody else to sell programming that people really want to buy. That means stepping away from the current rigid cable rules that mandate giant channel lineups. Companies need to be free to create the programming bundles that customers actually want. This might mean allowing a la carte programming. And there must be rules that require content providers to sell to everybody in an unbiased manner.

I don’t know how many of these big issues the current FCC is going to be willing to tackle. It seems like a lot of their agenda for the first six months will be to undo things ordered by the previous FCC. While I understand the desire to mold the FCC to the political persuasion of whatever party is in power, most of the issues on my list above are not partisan. They are just things that we all need to solve if we are to have a telecom infrastructure that serves us all well.

Time for a New Telecom Act, Part 1

Nothing is ever certain in the regulatory world, but it looks like there is a good chance that we will see a new telecom act this year. There are certainly parts of the old Telecommunications Act of 1996 that need to be refreshed and there are a lot of new topics like broadband, OTT and the IoT that need to be addressed by Congress. Today’s blog is going to review the old telecom act and tomorrow I will address the changes that I hope are included in any new act.

It’s hard to believe but the Telecommunications Act of 1996 was enacted 21 years ago. From a technological perspective that was almost the dark ages. 1996 was the year that AOL launched its unlimited dial-up product for $19.95 per month (before then subscribers paid by the minute). This drew millions of people to the Internet and convinced them to pay a monthly fee for access. DSL and cable modems were still in the lab and dial-up access ruled the world.

The main thrust of the 1996 Act was to create more competition with telephone service. Ma Bell had been broken up in 1984 which had resulted in long distance competition. Long distance rates dropped steadily over the years after divestiture. Congress decided that it was time to also create competition for dial tone. They recognized that the roadblock to competition was that the big telcos owned the vast majority of the copper lines going to homes and businesses and that nobody was likely to build a second telecom network.

So the Act implemented new rules to promote competition. Some of the changes mandated by the new Act were:

  • Creating a new regulatory category for telephone competitors that was labeled CLEC (Competitive Local Exchange Carrier).
  • Requiring the big telcos to ‘unbundle’ their copper network. This meant that they had to provide access to their copper plant to CLECs. To accomplish this the FCC mandated that CLECs had the right to interconnect to the big telco networks and to collocate in their central offices when necessary.
  • Mandating that the big telcos offer up telecom services for resale. They basically had to sell bulk services to competitors who could then sell them to customers.
  • Requiring that anybody who wanted to build a new network be given access to poles and conduits and be allowed to connect to the telco network at any reasonable place of their choosing.

The Act was immediately successful and unleashed a flurry of competitive activity. Giant new CLECs were formed that collocated in telco offices and gained access to copper loops. The most popular product was the unbundled T1, which allowed new competitors to sell data and telephone services to businesses over one connection. There were also giant companies formed to tackle resale. I recall that one of my clients in those days, Talk America, got over one million residential customers by reselling local phone service along with cheap long distance. Many consulting firms were formed to help the new competitive companies, including my own company, CCG Consulting.

The Act also brought about many other changes, some of the most significant being:

  • The regional Bell companies were allowed to get into the long distance business and compete against AT&T.
  • The Act granted the FCC the right of preemption to allow it to override conflicting state rules.
  • The Act created intercarrier compensation for paying for the exchange of traffic between telcos and CLECs.
  • The Act also shook up the Universal Service Fund and made compensation more directly cost-based.
  • The Act also tackled a number of other regulatory issues such as preempting telecom services from franchise fees, establishing rules to define obscene programming, and enabling the over-the-air transmission of digital TV signals.

In many ways the 1996 Act was a big success. Prices for telecom services plummeted in subsequent years. But over time the effective lobbying of the large telcos reversed some aspects of the Act, like resale and the unbundling of dark fiber. The Act also did not foresee the explosion of cellphones and of landline broadband, and those industries have never gotten the same level of regulatory scrutiny that applies to telephone service. There are still CLECs today making a living by providing DSL over telephone copper. But the increasing need for faster broadband speeds is starting to make that technology irrelevant, and it’s definitely time to consider a new Act to deal with today’s issues.