The Resurgence of Wireless Mesh?

I’ve had several calls recently from clients asking about wireless mesh networks. Those who have been in the industry for a while probably remember the mesh network craze of the late 1990s. At that time, large cities all over the country considered building WiFi mesh networks to try to bring broadband to everybody in their cities. Many cities deployed pilot systems, but in the end the technology never panned out. It had the squirrely problems often associated with wireless technology and never delivered the bandwidth the manufacturers promised.

Apparently, the technology is back. I went to the web for a quick investigation, and sure enough there are carrier-class outdoor mesh radios available from a number of manufacturers. In case you aren’t familiar with the concept, a mesh network is a network composed of multiple radios, each of which connects to several other radios. Most mesh networks are dynamically linked, meaning that the radios work autonomously to find the most efficient routing path for traffic within the mesh. Cisco, which has been manufacturing mesh network gear for many years, illustrates the concept with a diagram in which each radio interconnects with its neighboring radios.

The biggest flaw in the technology two decades ago was that mesh networks didn’t scale well, for two reasons. First, as a rule of thumb, a wireless link loses half of its bandwidth with every hop to another radio. Mesh networks with too many hops don’t deliver much bandwidth to the most remote nodes in the network.
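
To make the hop-loss math concrete, here’s a minimal Python sketch (the topology and numbers are invented for illustration). It counts hops from the gateway with a breadth-first search and applies the halving rule of thumb at each hop:

```python
from collections import deque

def bandwidth_by_node(links, gateway, feed_mbps):
    """BFS hop counts from the gateway; halve bandwidth per hop (rule of thumb)."""
    hops = {gateway: 0}
    queue = deque([gateway])
    while queue:
        node = queue.popleft()
        for neighbor in links[node]:
            if neighbor not in hops:
                hops[neighbor] = hops[node] + 1
                queue.append(neighbor)
    return {node: feed_mbps / 2 ** h for node, h in hops.items()}

# A small invented mesh: the gateway feeds A and B; C hangs two hops out, D three.
mesh = {
    "gw": ["A", "B"],
    "A": ["gw", "B", "C"],
    "B": ["gw", "A", "C"],
    "C": ["A", "B", "D"],
    "D": ["C"],
}
print(bandwidth_by_node(mesh, "gw", feed_mbps=1000))
# {'gw': 1000.0, 'A': 500.0, 'B': 500.0, 'C': 250.0, 'D': 125.0}
```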

Large mesh networks also developed an unexpected problem. One of the characteristics of a mesh network is that the radios constantly coordinate with each other. If a given node is temporarily overloaded with a big bandwidth demand from an end user, the network dynamically routes other traffic around the bottleneck. Unfortunately, it turned out that in large networks the radios spent a majority of the bandwidth communicating with each other, at the expense of the bandwidth left for end users. As mesh networks grew in size, throughput decreased significantly. Technicians determined that this excess internode chatter could be reduced by limiting the number of nodes any radio could communicate with, but in doing so the network was no longer a real mesh.

The other big problem in the 1990s was that the networks were deployed as outdoor radios, meaning that very little bandwidth actually made it into homes. I remember working one day at a client’s office where I could see a nearby mesh radio through a window. As long as I sat where I had a direct line of sight to the radio I could use the WiFi, but if I moved to another part of the room the signal completely died. Broadcasting WiFi with outside radios is an inefficient way to provide bandwidth indoors.

Those inherent problems are still an issue today. There is no way to eliminate the bandwidth lost with each hop. However, the difference between today and the 1990s is that we can feed a mesh network with gigabits of broadband instead of a few T1s. To some degree, that means we can overpower the system so that at least some bandwidth makes it to the furthest nodes in the network.

One of the other weaknesses of a mesh network is that most networks use WiFi spectrum. Practically every wired home uses WiFi today to move bandwidth around the house. Superimposing a mesh WiFi network on a neighborhood means adding a lot more high-power WiFi sources that interfere with every other WiFi device. Anybody who has ever tried to maintain a WiFi signal in a crowded business hotel understands the issues with WiFi interference.

Even with those limitations, I can see some great uses for a mesh network. The vendors are pushing the technology as a way to bring bandwidth more easily to outdoor spaces like parks. There is a brand of outdoor mesh devices being marketed as a way to spread WiFi from a farmhouse to the outbuildings. While nobody seems to be marketing the idea yet, a mesh network might be a good way to spread WiFi signals to fields and pastures to reach the small-bandwidth sensors being used to collect data from fields and herds.

What my clients really wanted to know is whether a mesh network could be used to provide residential broadband. There might be situations where this makes sense. Rather than trying to beam the bandwidth from outside hotspots, each radio could feed a wire into a home. But mesh networks still have the same inherent problems as in the past, and in most cases other solutions can probably deliver faster and more consistent bandwidth. As a consultant I always keep an open mind, but having seen the technology crash and burn once before, I’d want to see it working in practice before buying into the resurgence.

Another Alternative for Local TV

One of the factors jacking up the price of cable TV is the retransmission fees paid to local network stations. Cable companies pay hefty fees to local ABC, CBS, FOX, and NBC affiliates in order to carry the stations on a cable system. A decade ago this right was mostly granted for free, but cable companies now typically pay $12 to $15 per month per customer to carry the local network stations.

Locast has found an interesting way to put local networks on the Internet without paying the local retransmission fees. They have launched their service in New York, Philadelphia, Boston, Washington DC, Baltimore, Chicago, Houston, Dallas, Sioux Falls, Denver, Rapid City, Los Angeles, and San Francisco.

Locast is taking advantage of a loophole in the law that allows ‘broadcast translators’ to receive and transmit a local broadcast TV signal without a copyright license. In the US, a broadcast translator was traditionally a relay station that would receive TV signals from an antenna and then retransmit the signal to an area that couldn’t receive the signal. There are always places in and around cities that are in radio ‘holes’ that can’t get a signal, similar to dead areas for cellular coverage. Further, it’s common for folks living in areas with a lot of high rises to not be able to receive local TV through the air without access to rooftop antennas.

A law passed by Congress in 1976, 17 U.S.C. 111(a)(5), allows for a non-profit organization to make a secondary transmission of a local broadcast signal as long as the non-profit doesn’t receive any ‘direct or indirect commercial advantage’ from the process. Non-profits are allowed to charge a fee that allows them to recover actual and reasonable costs, but no more.

Locast is a non-profit that is operated by the Sports Fans Coalition NY, another non-profit that has been fighting to make sure that New York City residents can see local sports over the air. For example, the organization has fought to modify the blackout rules enforced by many major league sports.

In New York City the organization puts 14 local stations on the Internet, including the major networks. Customers of the service receive the channels along with a traditional channel line-up for the local channels. Locast claims the service is of huge benefit in the city since there are many households who cannot receive signals over the air without access to roof-top antennas.

Locast also paints itself as a benefit to local stations. They geofence each city and only make the internet feeds available to those who can verify they live in the specific metropolitan area. Locast says they are reaching cord cutters and providing a benefit to local stations because they supply feedback on what people are watching. This gives stations verifiable ‘eyeballs’ that can be counted when selling advertising. Stations otherwise see no financial benefit from cord cutters.
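
Locast hasn’t published how its geofence works, but conceptually it only needs to test whether a viewer’s verified location falls within a market boundary. Here’s a minimal sketch of that kind of check, assuming a hypothetical market defined as a center point and a radius:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))

# Hypothetical market definition; real market boundaries would be more precise.
NYC_MARKET = {"lat": 40.7128, "lon": -74.0060, "radius_miles": 35}

def in_market(viewer_lat, viewer_lon, market=NYC_MARKET):
    return haversine_miles(viewer_lat, viewer_lon,
                           market["lat"], market["lon"]) <= market["radius_miles"]

print(in_market(40.6782, -73.9442))   # Brooklyn -> True
print(in_market(42.3601, -71.0589))   # Boston   -> False
```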

Locast solicits donations – with a suggested base amount of $5 per month. Viewers are not required to donate, but reviews say that those who don’t are regularly interrupted by requests for a donation.

It’s an interesting model. A few years ago, Aereo tried to do something similar and was selling low-cost access to local stations through a technology that beamed the signal directly to each viewer. The broadcasters hounded Aereo in court until they finally forced them out of business.

The interesting difference here is the non-profit loophole. It’s a little surprising that Locast hasn’t yet been sued, having started this business in early 2018. They admit on their website that they expect at some point to get sued. But perhaps they won’t get sued if the local stations see the benefit – Locast claims that some stations in Philadelphia are actively working with them since they bring verifiable customers and tracking of views.

The Homework Gap

There are numerous studies and also mountains of anecdotal evidence from teachers that students without home broadband lag in academic performance compared to students with home broadband. This problem has come to be called the homework gap.

In a recent article, the Associated Press estimated that 17% of students – 3 million students – don’t have a computer at home. The estimate is that 18% don’t have broadband at home. That same article referenced a major study performed by the National Center for Education Statistics (NCES), an agency inside of the US Department of Education. That study compared test scores for 8th grade students both with and without a home computer. The results showed:

  • On tests of reading comprehension, students with a computer at home had an average score of 268, compared to 247 for students without a computer.
  • In mathematics, students with a computer at home scored 285, while those without scored 262.
  • In science, students with a computer scored 156, compared to 136 for students without one.
  • In testing competency in information and communication technology, students with a home computer scored 152, compared to 128 for students without one.

The NCES also gathered statistics from around the world and found that in 34 out of 37 countries, students with a home computer outperformed students without one in mathematics.

It’s not easy to gather this data. Student populations change every school year. There are also numerous kids in gray areas. For example, there are many students trying to do homework on smartphones – somewhere between having a home computer and not. There are also temporary broadband solutions such as school systems and libraries that lend hot spots to some students.

I’ve worked with several school systems that provide laptops or tablets to all students and then struggle to deal with rural students who don’t have a home broadband connection. NCES reports that teachers often modify their curriculum so as not to disadvantage students without home broadband – literally dumbing down courses for everybody to account for the fact that some students lack it.

The homework gap is a problem because other studies have shown that lagging behind in school carries over into adult life. For example, students who lag in school drop out at a much higher rate and enroll in college at a much lower rate. Other studies have shown that students who don’t finish high school or enroll in college earn significantly less over a lifetime than students who graduate and/or get at least some college.

When rural communities come to me looking for a broadband solution, the homework gap is often the number one issue. Parents often make extraordinary efforts to find access to broadband – such as driving to town nightly to sit outside hotspots. These parents are among the most vocal proponents of broadband. I’ve found that in rural communities the support for getting broadband to students runs deep. My consulting company conducts surveys, and in rural communities we find nearly universal support for finding a local solution to the homework gap.

NCES is worried that the homework gap is growing as more school systems migrate their curriculum online. It’s getting harder for schools to accommodate students without broadband or home computers. As a country we pay a lot of lip service to the topic of finding a rural broadband solution – politicians and regulators seem to talk about it non-stop. But if we don’t find a solution there is one thing we know for sure – those 3 million students are not going to perform as well as everybody else – and that’s a problem that affects all of us. We have to do better.

Continued Lobbying for White Space Spectrum

In May, Microsoft submitted a petition to the FCC calling for specific changes that would improve the performance of white space spectrum used to provide rural broadband. Microsoft has now taken part in eleven white space trials and makes these recommendations based upon the real-life performance of the spectrum. Not included in this filing is Microsoft’s long-standing request for the FCC to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has long favored creating just one channel of unlicensed white space spectrum per market – depending on what’s available.

A number of other parties have subsequently filed comments in support of the Microsoft proposals, including the Wireless Internet Service Providers Association (WISPA), Next Century Cities, New America’s Open Technology Institute, Tribal Digital Village, and the Gigabit Libraries Network. One of the primary entities opposed to earlier Microsoft proposals is the National Association of Broadcasters (NAB), which worries about interference with TV stations from white space broadband. However, the group now says that it can support some of the new Microsoft proposals.

As a reminder, white space spectrum consists of the unused blocks of spectrum located between the frequencies assigned to television stations. Years ago, at the advent of broadcast television, the FCC provided wide buffers between channels to reflect the capability of the transmission technology at the time. Folks my age might remember back to the 1950s, when neighboring TV stations would bleed into each other as ghost signals. As radio technology has improved, the buffers are now larger than needed and larger than the buffers between other blocks of spectrum. White space spectrum uses those wide buffers.

Microsoft has proposed the following:

  • They are asking for higher power limits for transmissions in cases where the spectrum sits two or more channels away from a TV station signal. Higher power means greater transmission distances from a given transmitter.
  • They are asking for a small power increase for white space channels that sit next to an existing TV signal.
  • They are asking for white space transmitters to be placed as high as 500 meters above ground (1,640 feet). In the US there are only 71 existing towers taller than 1,000 feet.
  • Microsoft has shown that white space spectrum has a lot of promise for supporting agricultural IoT sensors. They are asking the FCC to change the white space rules to allow narrowband transmissions for this purpose.
  • Microsoft is asking that the spectrum be allowed to support portable broadband devices used for applications like school buses, agricultural equipment and IoT for tracking livestock.

The last two requests highlight the complexity of FCC spectrum rules. Most people would probably assume that spectrum licenses allow for any possible use of the spectrum. Instead, the FCC specifically defines how spectrum can be used, and rural white space spectrum is currently only allowed for use as a hot spot or for fixed point-to-point data using receiving antennas at a home or business. The FCC has to modify the rules to allow IoT uses for farm sensors, tractors, and cows.

The various parties are asking the FCC to issue a Notice of Proposed Rulemaking to get comments on the Microsoft proposal. That’s when we’ll learn if any other major parties disagree with the Microsoft proposals. We already know that the cellular companies oppose providing multiple white space bands for anything other than cellular data, but these particular proposals are to allow the existing white space spectrum to operate more efficiently.

Will Broadband Go Wireless?

For years it’s been impossible to go to any industry forum without meeting a few folks who predict that residential broadband will go wireless. This buzz has accelerated with the exaggerated claims that fast 5G broadband is right around the corner. I’ve seen even more talk about this due to a recent Pew poll showing that the number of people who only use their cellphones for data has climbed significantly over the last few years – I’m going to discuss that poll in an upcoming blog.

The question I’m asking today is whether it’s possible that most residential broadband usage in the country could go wireless. As I usually do, I looked around the web to try to quantify the current aggregate amount of landline and cellular data being used in the US. It’s a slippery number to pin down for several reasons, not the least being that broadband usage is growing rapidly for both cellphones and landline connections. It looks like landline data usage per household is still doubling about every three years, while cellphone data usage is doubling every two years.

OpenVault recently reported that average monthly household broadband usage grew to 273.5 gigabytes in the first quarter of this year, up from 215.4 gigabytes a year earlier in 2018 – a growth rate of 27%, which, if sustained, almost exactly doubles usage in three years.

There are currently a little more than 127 million households, and the FCC says that around 85% of all households have broadband. Extrapolating that all out means that US landline networks in aggregate carried almost 30 exabytes of broadband for households monthly in the first quarter of this year. (An exabyte is 1 million terabytes, or 1 billion gigabytes).

I’ve seen a few recent statistics that say about 77% of Americans now have a smartphone, up from 67% in 2017. Recent statistics from several sources say that average data usage per smartphone is now over 4 gigabytes per month, with buyers of ‘unlimited’ data plans averaging more than 6 gigabytes per month and others still down closer to 1 gigabyte per month. With a current population of around 329 million, and using an average of 4 gigabytes per month per residential phone, the cellular networks are currently carrying about 1 exabyte of residential broadband per month.
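
Those aggregate figures are easy to sanity-check. The short sketch below reproduces the arithmetic using the estimates quoted above:

```python
GB_PER_EXABYTE = 1_000_000_000  # 1 EB = 1 billion GB

# Landline: ~127M households, ~85% with broadband, 273.5 GB per month each
landline_eb = 127_000_000 * 0.85 * 273.5 / GB_PER_EXABYTE
# Cellular: ~329M people, ~77% with smartphones, ~4 GB per month each
cellular_eb = 329_000_000 * 0.77 * 4 / GB_PER_EXABYTE

print(f"Landline: {landline_eb:.1f} EB/month")  # ~29.5, i.e. almost 30 exabytes
print(f"Cellular: {cellular_eb:.1f} EB/month")  # ~1.0 exabyte
```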

If we extrapolate forward six years, assuming keeping the existing growth rate for each kind of broadband, we can predict that total monthly US residential broadband usage will be something like the table below. Note that these figures exclude business broadband usage.

Monthly Exabytes

Year    Landline    Cellular
2019        30         1.0
2020        38         1.4
2021        48         2.0
2022        61         2.9
2023        78         4.2
2024        99         6.0
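
A few lines of Python reproduce this projection from the growth rates discussed above (27% per year for landline, and roughly 43% per year for cellular, which is about a doubling every two years); any small differences from the table come down to rounding:

```python
landline, cellular = 30.0, 1.0  # 2019 monthly exabytes, from the estimates above
for year in range(2019, 2025):
    print(f"{year}: landline {landline:3.0f} EB  cellular {cellular:.1f} EB")
    landline *= 1.27   # ~27% per year: doubles roughly every three years
    cellular *= 1.43   # ~43% per year: doubles roughly every two years
```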

Today the landline residential broadband networks carry 29 exabytes more data per month than cellular networks. Within six years that difference grows to 93 exabytes. There is no reasonable path forward that will have cellular data usage overtake landline usage in our lifetime.

The next issue to address is the overall capacity of the cellular network. The engineers at the cellular companies are likely cringing at the prospect of having to carry 6 exabytes of cellular data per month in six years – a six-fold increase over today. The cellular companies are going to increase data capacity in three ways – adding small cells, adding more mid-range spectrum, and adding 5G efficiency captured mostly through frequency slicing. It’s going to take all of those upgrades just to keep up with the growth in the above chart.

There are those who say that the cellular companies will handle future growth through millimeter wave spectrum. However, that technology requires a fiber-fed small cell site near every home. We really need to stop referring to millimeter wave spectrum as 5G wireless and instead call it what it is – fiber-to-the-curb. When thought of that way, it’s easy to realize that no carrier is likely to make the investment to deploy that much fiber along every residential street in America. Wireless 5G fiber-to-the-curb is not coming to most neighborhoods. The bottom line is that the world is not going to go wireless, and anybody saying so is engaging in hyperbole, not reality.

Are Broadband Investments Increasing?

The largest ISPs and their lobbying arm USTelecom are still claiming that industry capital spending has improved as a direct result of the end of Title II regulation. In a recent blog they argue that capital spending was up in 2018 due to the end of regulation – something they describe as a “forward-looking regulatory framework”. In reality, the new regulatory regime is zero regulation, since the FCC stripped itself of the ability to change ISP behavior for broadband products and practices.

The big ISPs used this same argument for years leading up to deregulation. They claimed that ISPs held back on investments since they were hesitant to invest in a regulatory-heavy environment. This argument never held water for a few reasons. First, the FCC barely ever regulated broadband companies. Since the advent of DSL and cable modems in the late 1990s, each subsequent FCC has largely been hands-off with the ISP industry.

The one area where the last FCC added some regulation was net neutrality. According to USTelecom that was crippling regulation. In reality, the CEO of every big telco and cable company has publicly stated that they could live with the basic principles of net neutrality. The one area of regulation that has always worried the big ISPs is some kind of price regulation. That’s not really been needed in the past, but all of the big companies look into the future and realize the time will come when they will probably raise broadband rates every year. We are now seeing the beginnings of that trend, which is probably why USTelecom keeps beating this particular dead horse – the ISPs are petrified of rate regulation of any kind.

The argument that the big ISPs held back on investment due to heavy regulation has never had any semblance of reality. The fact is that the big ISPs make investments for the same reasons as any large corporation – to increase revenues, to reduce operating costs, or to protect markets.

As an example, AT&T has been required to build fiber past 12.5 million passings as part of the settlement reached that allowed them to buy DirecTV. AT&T grabbed that mandate with gusto and has been aggressively building fiber for the past several years and selling fiber broadband. Both AT&T and Verizon have also been building fiber to cut transport expense to cell sites – they are building where that transport is too costly, or where they know they want to install small cell sites. The large cable companies all spent capital on DOCSIS 3.1 for the last few years to boost broadband speeds to protect and nurture their growing monopoly of urban broadband. All of these investment decisions were made for strategic business reasons that didn’t consider the difference between light regulation and no regulation. Any big ISP that says they will forego a strategic investment due to regulation would probably see their stock price tumble.

As a numbers guy, I instantly become suspicious of deceptive graphs. Consider the graph included in the latest USTelecom blog, which shows the levels of industry capital investment made between 2014 and 2018. The graph makes the year-to-year swings in investment look big through the graphing trick of starting the bottom of the graph at $66 billion instead of at zero. The fact is that 2018 capital investment is less than 3% higher than the investment made in 2014. This is an industry where the aggregate level of annual investment varies by only a few percent per year – the argument that the ISPs have been unleashed by the end of Title II regulation is laughable, and the numbers don’t show it.

There are always stories that can explain the annual fluctuations in industry spending. Here are just a few things that made a significant impact on aggregate spending in the past few years:

  • Sprint had a cash crunch a few years ago and drastically cut capital spending. One of the primary reasons for the higher 2018 spending is that Sprint spent almost $2 billion more in 2018 than the year before as they try to catch up on neglected projects.
  • AT&T spent $2 billion in 2018 for FirstNet, the nationwide public safety network. But AT&T is not spending their own money – that project is being funded by the federal government and ought to be removed from these charts.
  • Another $3 billion of AT&T’s spending in 2018 was to beef up the 4G network in Mexico. I’m not sure how including that spending in the numbers has any relevance to US regulation.
  • AT&T has been on a tear building fiber for the past four years – but they announced last month that the big construction push is over and they will see lower capital spending in future years. AT&T has the largest capital budget in the industry and spent 30% of the industry-wide $75 billion in 2018 – how will USTelecom paint the picture next year after a sizable decrease in AT&T spending?

The fact that USTelecom keeps harping on this talking point means they must fear some return to regulation. We are seeing Congress seriously considering new consumer privacy rules that would restrict the ability of ISPs to monetize customer data. We know it’s likely that if the Democrats take back the White House and the Senate that net neutrality and the regulation of broadband will be reinstated. For now, the big ISPs have clearly and completely won the regulatory battle and broadband is as close to deregulated as any industry can be. Sticking with this false narrative can only mean that the big ISPs think their win is temporary.

Why 5G Won’t Be Here Tomorrow

I just saw another article yesterday, written in a major-city newspaper, telling the public that 5G is coming in 2020. I hate to see reporters accept the nonsense being peddled by the carriers without digging a little deeper to find the truth. At some point in the near future, the public will finally realize that the 5G talk has mostly been hype.

I don’t mean to always sound like a 5G critic, because over time 5G will vastly improve the cellular experience. However, many of the improvements suggested by the cellular companies – like gigabit cellular service – may never happen. More immediately, there won’t be any major improvements to cellular networks from 5G for at least 3 to 5 years. The carriers have the country and politicians fully convinced that 5G is right around the corner – but it’s not.

There was a recent article written by Sue Marek in FierceWireless that is a great example of why 5G is not going to be here tomorrow. Titled Network Slicing is a Security Nightmare for Operators, Marek explains how complicated it’s going to be to implement network slicing – perhaps the most important new aspect of 5G cellular service.

Network slicing is the ability of the cellular network to size the transmission path to exactly meet a customer’s bandwidth needs. Network slicing is one of the ways that will enable a cell site to communicate with many more customers at the same time. Today, every customer gets the same-sized data channel, meaning a lot of bandwidth is wasted when customers use less than a full channel.
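
To see why slicing matters for capacity, here’s a toy model (all numbers are invented) contrasting fixed same-sized channels with slices sized to actual demand on a single cell:

```python
def customers_served_fixed(capacity_mbps, demands, channel_mbps=10):
    """Everyone gets the same channel size, whether they need it or not."""
    slots = int(capacity_mbps // channel_mbps)
    return min(slots, len(demands))

def customers_served_sliced(capacity_mbps, demands):
    """Each customer gets a slice sized to its actual demand."""
    served, used = 0, 0.0
    for demand in sorted(demands):
        if used + demand > capacity_mbps:
            break
        used += demand
        served += 1
    return served

# Hypothetical cell: 100 Mbps of capacity, mostly small IoT-style demands.
demands = [0.1] * 50 + [2.0] * 10 + [25.0] * 2
print(customers_served_fixed(100, demands))   # 10 customers (10 Mbps each)
print(customers_served_sliced(100, demands))  # 62 customers
```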

Marek points out the difficult technical challenge for providing security for every slice of bandwidth. She says that getting this right is going to take two to three years. Until network slicing is viable there really is nothing that can be called 5G. The important takeaway from her article is how difficult it is to implement new technology. 5G is a drastic change from 4G in many ways. There are thirteen major changes in the 5G specification compared to 4G and implementing each of them will be a technical challenge.

What is annoying about the 5G marketing hype is that we’ve always known it would take up to a decade to fully implement 5G, just as it did for 4G. The cellular companies can’t seem to stop themselves from overhyping new technology, but the 5G hype is many times worse than the 4G hype a decade ago. This seems mostly due to the fact that the cellular carriers decided to use the 5G hype as a way to cram through regulatory changes they’ve wanted for a long time. That forced them to really crank up the 5G rhetoric.

5G will take the same path used by all other electronic technologies – there is a tried-and-true method of introducing upgrades. New breakthroughs start in a lab. They then go to a ‘breadboard’ stage where working models are developed. Once the breadboards have been thoroughly tested they go into prototype chips, which are then retested to make sure the performance survived the conversion to silicon. Finally, the chip design is approved and the new breakthrough goes into production. At the very fastest this process might be done in 12 to 18 months, although it can take as long as three years. Introducing changes in the cellular world is doubly complicated because the same changes also have to make it into cellphone handsets.

The likely progression we’ll see for 5G is that some new aspect of the 5G specification will make it annually into chipsets. As that happens, only the newest phones will be able to use the upgrades, while earlier versions of 5G phones won’t recognize the new breakthroughs. The idea that the handset manufacturers are introducing 5G handsets in 2020 is laughable because practically none of the important 5G upgrades are yet in chip production. Those handsets will be 5G in name only (and still priced ridiculously high).

Marek is pointing out the complexity of getting 5G security right. There are dozens of other equally difficult technical challenges needed to fully realize 5G, and there are scientists in labs working on all of them. The labs will plow through all of this over time, and long after the hype is far in the past, we’ll get 5G phones that implement most of the 5G specification. It’s worth noting that there may never be a phone that meets the entire specification – because the specification for a new technology is a wish list. Some parts of it may never practically work in the field.

Is One Touch Make-Ready Really Faster?

The new federal rules for one-touch make-ready (OTMR) finally went into effect on May 21, after having been passed by the FCC last November. For those not familiar with the term, make-ready refers to any work that must be done to a pole to make it ready for a new wire, like a fiber cable. There are national safety standards that define the distances required between different kinds of wires as well as the clearance required between wires and the ground – and often existing poles can’t accommodate a new wire that meets all of the needed spacing. The make-ready needed to get onto an existing pole often involves rearranging existing wires to create the needed clearance, or in drastic cases replacing an old pole with a taller one.

The new OTMR rules apply only in the thirty states that follow FCC pole attachment rules. The FCC has strongly encouraged other states to implement something similar, but they are not mandated to do so. The new rules also don’t change the fact that poles owned by electric cooperatives and municipalities are exempt from federal pole attachment rules.

The new rules speed up the process of getting onto most poles – but as I’ve dug into the new rules, I’m not sure they are really going to drastically cut the timeline needed to build fiber on poles.

The most significant change in the rules is a new classification of poles as requiring either simple or complex make-ready. The order defines how to make this classification. In practice, the new attacher will propose the classification, although it can be overturned by the pole owner.

There are streamlined new rules and timelines for completing the make-ready on simple poles. If the pole owner is unwilling to commit to fixing simple poles in the needed time frame, then the new attacher is allowed to make the changes after properly notifying the pole owner. The new attacher is free to rearrange any existing wires as needed, again after having properly notified all of the parties. These new rules eliminate situations where a pole owner refuses to cooperate with a new attacher, as happened in a few cities where AT&T fought Google Fiber. Something to consider is that the rules require using a make-ready contractor that has been pre-approved by the pole owner – but there are ways around this in some circumstances.

This sounds like a huge improvement in the pole attachment process because new fiber builders now have a guaranteed process for getting onto poles with simple make-ready. In most places, the majority of poles ought to be classified as simple. This isn’t true everywhere and we’ve seen cities where the majority of poles are crowded and might be classified as complex.

The problem that remains is complex poles. Those are poles where the make-ready could damage existing wires or where the old pole must be replaced. The make-ready process for complex poles has always been slow. The new rules tighten the time frames a little, but getting onto a complex pole can still take a long time.

For complex poles the process still allows the existing wire owners to work sequentially. This coordination has to be scheduled by the pole owner. The process could still take six months even if done perfectly. What’s troubling is that I still don’t see any easy resolution for when the pole owner or the existing attachers drag their feet on complex poles. Other than slightly improved timelines, the work on complex poles looks to be as dreadful as it is today.

What does this mean for aerial construction? Consider a long run of 30 poles where 2 of the poles require complex make-ready. The new attacher can get the make-ready done on the 28 simple poles more quickly than in the past – those poles might be ready to hang new fiber within 60 days. But fiber still can’t be hung on this route until all 30 poles are ready, as the sketch below illustrates.
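
Here’s that arithmetic as a tiny sketch, using assumed timelines of 60 days for a simple pole and 180 days for a complex one; the route is only ready when its slowest pole is ready:

```python
# Assumed make-ready timelines, in days (illustrative, not from the FCC order).
SIMPLE_DAYS, COMPLEX_DAYS = 60, 180

def route_ready_days(poles):
    """Fiber can't be hung until every pole on the run is ready."""
    return max(COMPLEX_DAYS if p == "complex" else SIMPLE_DAYS for p in poles)

route = ["simple"] * 28 + ["complex"] * 2
print(route_ready_days(route))            # 180 days: two complex poles gate the run
print(route_ready_days(["simple"] * 30))  # 60 days: an all-simple run
```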

A new fiber builder still faces the same bad choices as today. They can wait six months or more for the complex make-ready to be completed. If the complex work bogs down, the new attacher faces the prospect of going to the state regulatory commission for help – something that can add another six months. The only other alternative is to bury fiber around the complex poles – something that can add a lot of cost, especially in rocky soil.

The one-touch make-ready rules would be awesome if networks were comprised mostly of simple poles. A fiber overbuilder could have fiber on poles within a few months of starting a project. However, the reality is that there are many poles in the world that won’t be classified as simple. Many urban poles are too short to add another wire and have to be replaced with taller poles. Poles at busy intersections can already hold a maze of wires. Some poles today are going to carry other impediments like small cell sites that are going to make it harder to add fiber.

We’re going to have to see these new rules in practice before we can conclude that one-touch make-ready provides a major benefit. The FCC’s motives for OTMR are good and they are trying to favor easier fiber construction. We’re just going to have to wait to see if the new rules make any actual difference with the overall timeline for aerial construction.

Video Camera Ethics

I have a number of clients that now offer security products, many of which include video cameras that can be placed at the front door or elsewhere at a customer location. There is a lot of discussion nationwide about the ethics involved in providing video cameras. Today’s blog covers topics you should consider if you offer, or plan to offer, video cameras.

ISP Access to Video. If your company provides customer video cameras, there are numerous concerns if your employees are able to access and watch customer video feeds. Your company could face large legal liabilities if it ever came to light that any of your employees were watching customer videos. It’s incredibly tempting for employees to spy on their exes or watch their neighbors, and your ISP would be financially liable, and possibly even criminally liable, if you enable violations of customer privacy.

Most, but not all, customers are going to want you to record video. They will want to look back at past events such as a burglary, or might just want to glance through home activity every evening. But customers are also going to want privacy, so that they are the only ones who can watch the video, and you’ll have to come up with some method that assures that privacy. This is not as easy as it sounds, because typically any archive that is available to customers is probably also accessible by your employees.

If you can develop a system that guarantees the desired privacy you will have a marketing advantage while also reducing your liabilities.
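
One possible approach, sketched below with Python’s cryptography package, is to encrypt each recording with a key generated and held only on the customer’s device, so that anything your employees can reach is ciphertext. This is a concept sketch under those assumptions, not a production design, and the function names are illustrative:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key is generated on the customer's device and never stored by the ISP.
customer_key = Fernet.generate_key()

def archive_clip(raw_video: bytes, key: bytes) -> bytes:
    """Encrypt before the clip touches ISP storage; employees see only ciphertext."""
    return Fernet(key).encrypt(raw_video)

def view_clip(stored: bytes, key: bytes) -> bytes:
    """Only someone holding the customer's key can decrypt the archive."""
    return Fernet(key).decrypt(stored)

clip = b"front-door motion clip"
stored = archive_clip(clip, customer_key)
assert view_clip(stored, customer_key) == clip
```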

Spying on Neighbors. One of the most discussed topics in the home security industry is the ability of cameras to inadvertently watch activity at neighbors’ homes. For example, a front door camera can usually be placed so that it only sees people who approach the front door, but it can alternatively be angled to see everything in front of the house, including the neighbors across the street.

Setting cameras to see the whole street raises a number of ethical issues. First, you’re inviting customers to watch their neighbors if you provide a wider view of the front of the home. You’re also creating a video recording of events that happen beyond the boundary of the customer’s premises. It’s not hard to imagine capturing every passing car and every pedestrian in front of a home. There have been incidents in the news of homeowners accusing innocent passersby of bad behavior simply because video captured them walking past the home often.

Law Enforcement. Many law-enforcement issues sit in a gray area. There are a few specific laws that give law enforcement the ability to subpoena telephone call records or to wiretap phone calls or internet connections. Few such laws have yet been updated to cover video camera recordings.

For example, is an ISP obligated to turn over video from indoor cameras to law enforcement, particularly if the customer doesn’t approve it? There is probably some precedent to allow law enforcement to look at past recordings with a subpoena, but it’s a legal gray area to talk about giving live access to indoor cameras to law enforcement. To what degree would an ISP be violating customer privacy if they grant law enforcement access and there is no clear law authorizing video camera access?

There are also local police departments with programs where homeowners give law enforcement the passwords to allow them to view live feeds from outdoor and front door cameras. This essentially gives law enforcement the ability to watch the street or watch a neighbor without a subpoena.

I’m sure that over time some of these issues will be clarified through legislation or regulatory rulings. But for now, there are a lot of gray areas. If you are going to offer a video camera service, you might want to determine your policies up front rather than waiting for the inevitable issues to confront you.

AT&T and Verizon Fiber

If you look at the annual reports or listen to the quarterly investor calls, you’d think that AT&T and Verizon’s entire future depends upon 5G. As I’ve written in several blogs, there doesn’t seem to be an immediate financial business case for 5G and the big carriers are going to have to figure out how to monetize 5G – something that’s going to take years. Meanwhile, both companies have been expanding their fiber footprints and aggressively adding fiber-based broadband customers.

According to the Leichtman Research Group, AT&T added only 34,000 net broadband customers in the first quarter of this year – not an impressive number considering they have 15.7 million broadband customers. But the underlying story is more compelling. On the 1Q investor call, the company said it added 297,000 fiber customers during the first quarter; the smaller net number reflects the ongoing decline of DSL customers. The overall financial impact was a net gain of 8% in broadband revenues.

AT&T is starting to understand the dynamics of being a multimedia company in addition to being a wireless carrier and an ISP. According to John Stephens, the AT&T CFO, the company experiences little churn when they are able to sell fiber-based Internet, a video product and cellular service to a customer.

The company views its fiber business as a key part of its growth strategy. AT&T now passes over 20 million homes and businesses with fiber and is aggressively pushing fiber broadband. The company has also undergone an internal consolidation so that all fiber assets are available to every business unit. The company has been expanding its fiber footprint significantly for the last few years, but recently announced they are at the end of major fiber expansion. However, the company will continue to take advantage of the new fiber being built for the nationwide FirstNet network for first responders. In past years the company would have kept FirstNet fiber in its own silo and not gotten the full value out of the investment.

Verizon has a similar story. The company undertook an internal project they call One Fiber where every fiber asset of the company is made available to all Verizon business units. There were over a dozen Verizon business units with separate fiber networks in silos.

Verizon is currently taking advantage of the One Fiber plan for expanding its small cell site strategy. The company knows that small cell sites are vital for maintaining a quality cellular network and they are also still weighing how heavily to invest in 5G wireless loops that deliver wireless broadband in residential neighborhoods.

Verizon has also been quietly expanding its FiOS fiber footprint. The company has gotten regulatory approval to abandon the copper business in over 100 exchanges in the northeast where it operates FiOS. In those exchanges the company will no longer connect customers to copper service, and it says it will eventually tear down the copper and become fully fiber-based. That strategy means filling in neighborhoods that were bypassed when the FiOS network was first built more than a decade ago.

Verizon is leading the pack in new fiber construction. They say they are building over 1,000 route miles of fiber every month. This alone is having a big impact on the industry, as everybody else is having a harder time locating fiber construction crews.

Verizon’s wireline revenues were down 4% in the first quarter of this year compared to 2018. The company expects to start benefiting from the aggressive fiber construction program and turn that trend around over the next few years. One of the most promising opportunities is to start driving revenues in markets where the company owns fiber but has never fully monetized it.

The main competitors for all of this fiber construction by both companies are the big cable companies. The big telcos have been losing broadband customers for years as cable broadband has clobbered DSL. The two telcos are counting on their fiber products to compete fiercely with cable broadband, and they hope to start recapturing lost market share. As an outsider I’ve wondered for years why they didn’t do this sooner, and the easy answer is that both companies sank most of their capital investment into wireless. Now they see that 5G wireless needs fiber, and both companies have decided to capitalize on the new fiber by also selling landline broadband. It’s going to be an interesting battle to watch, since both telcos still face the loss of huge numbers of DSL customers – but they are counting on fiber to position them well for the decades to come.