Capping the Universal Service Fund

FCC Chairman Ajit Pai recently suggested capping the size of the total Universal Service Fund at $11.4 billion annually, adjusted going forward for inflation. The chairman has taken a lot of flak on this proposal from advocates of rural broadband. Readers of this blog know that I have been a big critic of this FCC on a whole host of issues. However, this idea doesn’t give me much heartburn.

Critics of the idea are claiming that this proves that the FCC isn’t serious about fixing the rural broadband problem. I totally agree with that sentiment; this current FCC has done very little to fix rural broadband. In fact, they’ve gone out of their way to hide the magnitude of the rural problem by fiddling with broadband statistics and by hiding behind the faulty carrier data that comes out of the FCC’s broadband mapping effort. My personal guess is that there are millions more homes without broadband than are being counted by the FCC.

With that said, the Universal Service Fund shouldn’t be the sole funding source for fixing rural broadband. The fund was never intended for that. It was created originally to promote the expansion of rural telephone service. Over time it became the mechanism that helped rural telcos survive as other subsidy sources, like access charges, were reduced. Only in recent years was it repositioned to fund rural broadband.

Although I’m a big proponent for better rural broadband, I am not bothered by capping the Universal Service Fund. First, the biggest components of that fund have been capped for years. The monies available for the rural high cost program, the schools and library fund and for rural healthcare have already been capped. Second, the proposed cap is a little larger than what’s being spent today, and what has been spent historically. This doesn’t look to be a move by the FCC to take away funding from any existing program.

Consumers today fund the Universal Service Fund through fees levied against landline telephones and cellphones. Opponents of capping the fund apparently would like to see the FCC hike those fees to help close the rural broadband gap. As a taxpayer I’m personally not nuts about the idea of letting federal agencies like the FCC print money by raising taxes that we all pay. For the FCC to make any meaningful dent in the rural broadband issue they’d probably have to triple or quadruple the USF fees.

I don’t think there is a chance in hell that Congress would ever let the FCC do that – and not just this Congress, but any Congress. Opponents of Pai’s plan might not recall that past FCCs have had this same deliberation and decided that they didn’t have the authority to unilaterally increase the size of the USF fund.

If we want the federal government to help fix the rural broadband problem, unfortunately the only realistic solution is for Congress to appropriate real money to the effort. This particular Congress is clearly in the pocket of the big telcos, as evidenced by the mere $600 million awarded for rural broadband in last year’s budget reconciliation process. The use of those funds was crippled by language inserted by the big telcos to make it hard to use the money to compete against them.

And that’s the real issue with federal funding. We all decry that we have a huge rural broadband crisis, but what we really have is a big telco crisis. Every rural area that has crappy broadband is served by one of the big telcos. The big telcos stopped making investments to modernize rural networks decades ago. And yet they still have the political clout to block federal money from being used to compete against their outdated and dying networks.

The FCC does have an upcoming opportunity for funding a new broadband program from the Universal Service Fund. After 2020, nearly $2 billion annually will be freed up in the fund at the end of the original CAF II program. If this FCC is at all serious about rural broadband, it should start talking this year about what to do with those funds. This is a chance for Chairman Pai to put his (USF) money where his mouth is.

Verizon to Retire Copper

Verizon is asking the FCC for permission to retire copper networks throughout its service territory in New York, Massachusetts, Maryland, Virginia, Rhode Island and Pennsylvania. In recent months the company has asked to kill copper in hundreds of exchanges in those states. These range from urban exchanges in New York City to exchanges scattered all over the outer suburbs of Washington DC and Baltimore. Some of these filings can be found at this site.

The filings ask to retire the copper wires. Verizon will no longer support copper in these exchanges and will stop doing any maintenance on copper. The company intends to move people who are still served by copper over to fiber and is not waiting for the FCC notice period to make such conversions. Verizon is also retiring the older DMS telephone switches, purchased years ago from the long-defunct Northern Telecom. Telephone service will be moved to the more modern soft switches that Verizon uses for fiber customers.

The FCC process requires Verizon to notify the public about plans to retire copper and if no objections are filed in a given exchange the retirement takes place 90 days after the FCC’s release of the public notice to retire. Verizon has been announcing copper retirements since February 2017 and was forced to respond to intervention in some locations, but eventually refiled most retirement notices a second time.

Interestingly, much of the FiOS fiber network was built by overlashing fiber onto the copper wires, so the copper wires on poles are likely to remain in place for a long time to come.

From a technical perspective, these changes were inevitable. Verizon is the only big telco to widely build fiber plant in residential neighborhoods, and it makes no sense to ask them to maintain two technologies in neighborhoods with fiber.

I have to wonder what took them so long to get around to retiring the copper. Perhaps we have that answer in language that is in each FCC request where Verizon says it “has deployed or plans to deploy fiber-to-the-premises in these areas”. When Verizon first deployed FiOS they deployed it in a helter-skelter manner, mostly sticking to neighborhoods which had the lowest deployment cost, usually where they could overlash on aerial copper. At the time they bypassed places where other utilities were buried unless the neighborhood already had empty conduit in place. Perhaps Verizon has quietly added fiber to fill in these gaps or is now prepared to finally do so.

That is the one area of concern raised by these notices. What happens to customers who still only have a copper alternative? If they have a maintenance issue will Verizon refuse to fix it? While Verizon says they are prepared to deploy fiber everywhere, what happens to customers until the fiber is in front of their home or business? What happens to their telephone service if their voice switch is suddenly turned off?

I have to hope that Verizon has considered these situations and that they won’t let customers go dead. While many of the affected exchanges are mostly urban, many of them include rural areas that are not covered by a cable company competitor, so if customers lose Verizon service, they could find themselves with no communications alternative. Is Verizon really going to build FiOS fiber in all of the rural areas around the cities they serve?

AT&T is also working towards eliminating copper and offers fixed cellular as the alternative to copper in rural places. Is that being considered by Verizon but not mentioned in these filings?

I also wonder what happens to new customers. Will Verizon build a fiber drop to a customer who only wants to buy a single telephone line? Will Verizon build fiber to new houses, particularly those in rural areas? In many states the level of telephone regulation has been reduced or eliminated and I have to wonder if Verizon still sees themselves as the carrier of last resort that is required to provide telephone service upon request.

Verizon probably has an answer to all of these questions, but the FCC request to retire copper doesn’t force the company to get specific. All of the questions I’ve asked wouldn’t exist if Verizon built fiber everywhere in an exchange before exiting the copper business. As somebody who has seen the big telcos fail to meet promises many times, I’d be nervous if I was a Verizon customer still served by copper and had to rely on Verizon’s assurance that they have ‘plans’ to bring fiber.

FCC Looking at Rural Spectrum Rules

The FCC released a Notice of Proposed Rulemaking on March 15, in WT Docket No. 19-38. This NPRM asks if there are changes to spectrum rules that might make spectrum more easily available for small carriers and in rural markets.

This NPRM was required by the MOBILE NOW Act that was included in the Ray Baum’s Act that reauthorized the FCC. That Act required the FCC to ask the following questions:

  • Should the FCC establish a new program, or modify existing programs, to make it easier to partition, disaggregate, or lease spectrum in rural areas and to improve spectrum access for small carriers?
  • Should the FCC allow ‘reaggregation’ of spectrum that has been partitioned or disaggregated on the secondary market, up to the size of the original market area?
  • Would relaxing performance requirements for partitioned or disaggregated licenses make it easier for small carriers to use rural spectrum?
  • Are there any procedural changes that would make it easier to transfer spectrum to small carriers?
  • Are there incentives the FCC can provide to encourage spectrum license holders to lease or sell spectrum to small carriers that will serve rural areas?

If the FCC is serious about helping to solve the rural broadband divide they need to take a hard look at the suggestions various parties will make in this docket. The docket notes that there have been over 1,000 assignments of spectrum over the last decade, but most of these have been from speculators (who buy spectrum with the goal to sell and not use) assigning spectrum to the larger carriers. There are not many examples where the big spectrum holders have peeled off portions of their spectrum for rural use.

Today most spectrum is being used in urban areas but not deployed in the surrounding rural areas. It’s hard to fault the cellular companies for this practice. The low customer density in rural areas doesn’t require cellular carriers to deploy the same mix of spectrum needed to satisfy urban cellular bandwidth needs.

This unused spectrum could be used to provide spectacular fixed wireless broadband – something that is not really a significant part of the business plan of cellular companies. With newer techniques for combining multiple frequencies to serve a single customer, the availability of more swaths of spectrum could be used to significantly increase rural broadband speeds.

There are also regulatory reasons for the pool of unused rural spectrum. The cellular carriers have always lobbied hard to have spectrum auctioned to cover huge geographic footprints. It’s a lot easier for the carriers and the FCC to not bother with auctioning off rural coverage areas separately. The FCC’s coverage rules are also so lax that a spectrum license holder can satisfy deployment requirements by deploying spectrum in the urban areas while ignoring the rural parts of the license areas. The FCC has also been extremely lax in enforcing deployment requirements, and license holders in some cases have gone a decade without deploying spectrum and without any fear of losing the license.

The big cellular companies have opposed making it easier to deploy frequency in rural areas. They have some legitimate concerns about interference, but there are technical solutions to guard against interference. The big companies mostly don’t want to deal with smaller users of the spectrum. I would expect them to file comments in this docket that say that the existing system is adequate. Today’s rules already allow for leasing or partitioning of spectrum and the big companies don’t want new rules that might force them to work with rural providers.

Probably the most interesting question in the docket is the one asking if there are incentives that would drive the big license holders to work with smaller providers. I can think of several solutions, but the easiest one is what I call ‘use it or lose it’. The FCC ought to change the rules to be able to reclaim licensed spectrum that isn’t being used. The rules should not allow the deployment of spectrum in a city to tie up the use of that same spectrum for a huge surrounding rural area.

While the MOBILE NOW Act required the issuance of this NPRM within a year, it doesn’t require the FCC to act on any of the suggestions made by respondents to the NPRM. I would strongly encourage anybody interested in using rural spectrum to contact their members of Congress and ask them to push the FCC to take this NPRM seriously. Over the last two years it’s hard to point to any actions of this FCC that favor rural broadband over the interests of the large carriers. The big wireless companies don’t want the hassle of dealing with smaller providers – but making spectrum available to smaller providers is the right thing to do. Spectrum ought to benefit all parts of the country, not just the urban areas.

What Are Small Cells?

By far the most confusing industry term that is widely used today is ‘small cell’. I see at least a couple of different articles every day talking about some aspect of small cell deployment. What becomes quickly clear after reading a few such articles is that the small cell terminology is being used to describe a number of different technologies.

A lot of the blame for this confusion comes from the CTIA, the industry group that represents the large cellular carriers. As part of lobbying the FCC last year to get the ruling that allows the carriers to deploy devices in the public rights-of-way, the CTIA constantly characterized small cell devices as being about the size of pizza boxes. In reality, the devices range from the size of a pizza box up to the size of a dorm refrigerator.

There are a number of different kinds of deployments all being referred to as small cells. The term small cell brings to mind the idea of devices hung on poles that perform the same functions as the big cellular towers. Fully functional pole-mounted cellular sites are not small devices. The FCC set a limit for a pole-mounted small cell to be no larger than 28 cubic feet, and a cell tower replacement device will use most of that allotted space. Additionally, a full cell tower replacement device generally requires a sizable box of electronics and power supply that sits on the ground – often in cabinets the size of the traditional corner mailbox.

These cell-tower replacements are the devices that nobody wants in front of their house. They are large and can be an eyesore. The cabinets on the ground can block the sidewalk – although lately the carriers have been getting smarter and are putting the electronics in an underground vault. These are the big ‘small cell’ devices that are causing safety concerns for line technicians from other utilities that have to worry about working around the devices to fix storm damage.

Then there are the devices that actually are the size of pizza boxes. While they are being called small cells just like the giant boxes, I would better classify these smaller devices as cellular repeaters. These smaller devices re-originate cellular signals to boost coverage in cellular dead spots. I happen to live in a hilly city and I would love to see more of these devices. Cellular coverage here varies widely block by block according to line-of-sight to the big cellular towers. Cellular carriers can boost coverage in a neighborhood by placing one of these devices within sight of a large tower and then beaming from there to cover the dead spots.

If you look at the industry vendor web sites they claim shipments of millions of small cell sites last year. It turns out that 95% of these ‘small cell’ devices are indoor cellular boosters. Landlords deploy these in office buildings, apartment buildings and other places where cellular coverage is poor. Perhaps the best terminology to describe these devices is a cellular offload device that relieves traffic on cell sites. The indoor units use cellular frequencies to communicate with cellphones but then dump cellular data and voice traffic onto the broadband connection of the landlord. It turns out in urban downtowns that 90% plus of cellular usage happens indoors, and these devices help to meet urban cellular demand without the hassle of trying to communicate through the walls of larger buildings.

The next use of the term small cell is for the devices that Verizon recently used to test wireless broadband in a few test markets. These devices have nothing to do with cellular traffic and would best be described as wireless broadband loops. Verizon is using millimeter wave spectrum to beam broadband connections for a thousand feet or so from the pole-mounted devices.

The general public doesn’t understand the wide array of different wireless devices that are being deployed. The truly cellular devices, for now, are all 4G devices that are being used by the cellular carriers to meet the rapidly-growing demand for cellular data. The industry term for this is densification and the carriers are deploying full cell-tower substitute devices or neighborhood repeaters to try to relieve the pressure on the big cellular towers. These purely-cellular devices will eventually handle 5G when it is rolled out over the next decade.

The real confusion I see is that most people now equate ‘small cell’ with fast data. I’ve talked to several cities recently who thought that requests for small cell attachments mean they are going to get gigabit broadband. Instead, almost every request for a small cell site today is for the purpose of beefing up the 4G networks. These extra devices aren’t going to increase 4G data speeds, aren’t bringing 5G and are definitely not intended to beam broadband into people’s homes. These small cells are being deployed to divvy up the cellular traffic to relieve overloaded cellular networks.

OneWeb Launches Broadband Satellites

Earlier this month OneWeb launched six test satellites, the first of an eventual fleet intended to provide broadband. The six satellites went up aboard a Soyuz launch vehicle from the Guiana Space Center in Kourou, French Guiana.

OneWeb was started by Greg Wyler of Virginia in 2012, originally under the name of WorldVu. Since then the company has picked up heavy-hitter investors like Virgin, Airbus, SoftBank and Qualcomm. The company’s plan is to launch an initial constellation of 650 satellites that will blanket the earth, with ultimate deployment of 1,980 satellites. The plans are to deploy thirty of the sixty-five pound satellites with each launch. That means twenty-two successful launches are needed to deploy the first round.
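The launch arithmetic is easy to check; a quick sketch using the constellation and per-launch figures above:

```python
import math

SATELLITES_INITIAL = 650    # planned initial constellation
SATELLITES_PER_LAUNCH = 30  # satellites deployed with each launch

launches_needed = math.ceil(SATELLITES_INITIAL / SATELLITES_PER_LAUNCH)
print(launches_needed)  # 22 successful launches for the first round
```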

Due to the low-earth orbit of the satellites, at about 745 miles above the earth, the OneWeb satellites will avoid the huge latency inherent in current satellite broadband from providers like HughesNet, whose satellites orbit at 22,000 miles above the earth. The OneWeb specifications filed with the FCC talk about latency in the same 25-30 millisecond range as cable TV networks. But where a few high-orbit satellites can see the whole earth, a big fleet of low-orbit satellites is needed just to be able to see everywhere.
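The latency difference falls straight out of the physics of the orbits. A rough sketch of the minimum round-trip propagation delay, ignoring processing, queuing, and terrestrial backhaul (all of which add to these floors):

```python
C_MILES_PER_SEC = 186_282  # speed of light in vacuum, miles per second

def min_round_trip_ms(orbit_miles: float) -> float:
    """Minimum propagation delay for a request/response that traverses
    the ground-to-satellite hop four times (up and down, each way)."""
    return 4 * orbit_miles / C_MILES_PER_SEC * 1000

print(round(min_round_trip_ms(745)))     # low-earth orbit, like OneWeb
print(round(min_round_trip_ms(22_000)))  # high orbit, like HughesNet
```

The low-earth floor of roughly 16 milliseconds leaves room for the 25-30 millisecond overall latency OneWeb claims, while the high-orbit propagation floor alone approaches half a second.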

The company is already behind schedule. It had originally promised coverage across Alaska by the end of 2019 and is now talking about customer demos sometime in 2020, with live broadband service in 2021. The timeline matters for a satellite company because the spectrum license from the FCC requires that they launch 50% of their satellites within six years and all of them within nine years. Right now, both OneWeb and Elon Musk’s SpaceX have fallen seriously behind the needed deployment timeline.

The company’s original goal was to bring low-latency satellite broadband to everybody in Alaska. While they are still talking about bringing broadband to those who don’t have it today, their new business plan is to sell directly to airlines and cruise ship lines and to sell wholesale to ISPs who will then market to the end user.

It will be interesting to see what kinds of speeds will really be delivered. The company talks today about a maximum speed of 500 Mbps. But I compare that number to the claim that 5G cellphones can work at 600 Mbps, as demonstrated last year by Sprint – it’s possible only in a perfect lab setting. The best analog to a satellite network is a wireless transmitter on a tower in a point-to-multipoint network. That transmitter is capable of making a relatively small number of big-bandwidth connections or many more low-bandwidth connections. The economic sweet spot will likely be to offer many connections at 50 – 100 Mbps rather than fewer connections at a higher speed.
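To see why fewer fast connections trade off against many slower ones, here is a toy capacity split. The per-beam capacity figure is my assumption for illustration, not a OneWeb specification:

```python
BEAM_CAPACITY_MBPS = 5_000  # assumed usable capacity per satellite beam

def max_concurrent_users(plan_mbps: int) -> int:
    """How many customers a beam can serve at full plan speed,
    assuming no oversubscription (the worst case for the operator)."""
    return BEAM_CAPACITY_MBPS // plan_mbps

for plan in (500, 100, 50):
    print(f"{plan} Mbps plan: {max_concurrent_users(plan)} concurrent users")
```

Real networks oversubscribe because customers don't all peak at once, but the ratio between tiers holds regardless: a 50 Mbps plan serves ten times the customers of a 500 Mbps plan on the same beam.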

It’s an interesting business model. The upfront cost of manufacturing and launching the satellites is high. It’s likely that a few launches will go awry and destroy satellites. But other than replacing satellites that go bad over time, the maintenance costs are low. The real issue will be the bandwidth that can be delivered. Speeds of 50 – 100 Mbps will be welcomed in the rural US for those with no better option. But like with all low-bandwidth technologies – adequate broadband that feels okay today will feel a lot slower in a decade as household bandwidth demand continues to grow. The best long-term market for the satellite providers will be those places on the planet that are not likely to have a landline alternative – which is why they first targeted rural Alaska.

Assuming that the low-earth satellites deliver as promised, they will become part of the broadband landscape in a few years. It’s going to be interesting to see how they play in the rural US and around the world.

Ideas for Better Broadband Mapping

The FCC is soliciting ideas on better ways to map broadband coverage. Everybody agrees that the current broadband maps are dreadful and misrepresent broadband availability. The current maps are created from data that the FCC collects from ISPs on the 477 form where each ISP lists broadband coverage by census block. One of the many problems with the current mapping process (I won’t list them all) is that census blocks can cover a large geographic area in rural America, and reporting at the census block level tends to blur together different circumstances where some folks have broadband and others have none.

There have been two interesting proposals so far. Several parties have suggested that the FCC gather broadband speed availability by address. That sounds like the ultimate database, but there are numerous reasons why this is not practical.

The other recommendation is a 3-stage process recommended by NCTA. First, data would be collected by polygon shapefiles. I’m not entirely sure what that means, but I assume it means using smaller geographic footprints than census blocks. Collecting the same data as today using a smaller footprint ought to be more accurate. Second, and the best idea I’ve heard suggested, is to allow people to challenge the data in the mapping database. I’ve been suggesting that for several years. Third, NCTA wants to focus on pinpointing unserved areas. I’m not sure what that means, but perhaps it means creating shapefiles to match the different availability of speeds.
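Whatever its size, a polygon shapefile boils down to a list of boundary vertices, and "is this address covered" becomes a point-in-polygon test. A minimal sketch using the standard ray-casting algorithm, with a hypothetical rectangular coverage footprint:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is point (x, y) inside the polygon,
    given as a list of (x, y) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical coverage footprint as (longitude, latitude) vertices
coverage = [(-77.10, 38.80), (-77.00, 38.80), (-77.00, 38.90), (-77.10, 38.90)]
print(point_in_polygon(-77.05, 38.85, coverage))  # address inside the footprint
print(point_in_polygon(-76.50, 38.85, coverage))  # address outside it
```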

These ideas might provide better broadband maps than we have today, but I’m guessing they will still have big problems. The biggest issue with trying to map broadband speeds is that many of the broadband technologies in use vary widely in actual performance in the field.

  • Consider DSL. We’ve always known that DSL performance decreases with distance from a DSL base station. However, DSL performance is not as simple as that. DSL also varies for other reasons, like the gauge of the copper serving a customer or the quality of the copper. Next-door neighbors can have a significantly different DSL experience if they have different gauge wires in their copper drops, or if the wires at one of the homes have degraded over time. DSL also differs by technology. A telco might operate different DSL technologies out of the same central office and see different performance from ADSL versus VDSL. There really is no way for a telco to predict the DSL speed available at a home without installing it and testing the actual speed achieved.
  • Fixed wireless and fixed cellular broadband have similar issues. Just like DSL, the strength of a signal from a wireless transmitter decreases over distance. However, distance isn’t the only issue and things like foliage affect a wireless signal. Neighbors might have a very different fixed wireless experience if one has a maple tree and the other has a pine tree in the front yard. To really make it difficult to define the speed, the speeds on wireless systems are affected to some degree by precipitation, humidity and temperature. Anybody who’s ever lived with fixed wireless broadband understands this variability. WISPs these days also use multiple spectrum blocks, and so the speed delivered at any given time is a function of the particular mix of spectrum being used.
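The variability described above is why per-address speed reporting is so hard. A purely illustrative sketch (the loop-length tiers and the degradation factor are invented numbers, not engineering data) of how two neighbors at the same distance can see very different DSL speeds:

```python
# Illustrative only: real DSL performance depends on the DSLAM, the
# copper condition, and the technology, not on a simple lookup table.
SPEED_BY_LOOP_LENGTH = [  # (max loop length in feet, rough downstream Mbps)
    (1_000, 80),
    (3_000, 40),
    (5_000, 15),
    (8_000, 5),
]

def estimated_speed(loop_feet: int, degraded_copper: bool = False) -> float:
    for max_feet, mbps in SPEED_BY_LOOP_LENGTH:
        if loop_feet <= max_feet:
            # degraded or thinner-gauge copper can cut the rate sharply
            return mbps * (0.5 if degraded_copper else 1.0)
    return 0.0  # too far from the DSLAM for service

# Two next-door neighbors, same distance, very different experience
print(estimated_speed(3_000))                        # healthy copper drop
print(estimated_speed(3_000, degraded_copper=True))  # degraded copper drop
```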

Regardless of the technology being used, one of the biggest issues affecting broadband speeds is the customer home. Customers (or ISPs) might be using outdated and obsolete WiFi routers or modems (like Charter did for many years in upstate New York). DSL speeds are just as affected by the condition of the inside copper wiring as the outdoor wiring. The edge broadband devices can also be an issue – when Google Fiber first offered gigabit fiber in Kansas City almost nobody owned a computer capable of handling that much speed.

Any way we try to define broadband speeds – even by individual home – is going to still be inaccurate. Trying to map broadband speeds is a perfect example of trying to fit a round peg in a square hole. It’s obvious that we can do a better job of this than we are doing today. I pity a fixed wireless ISP if they are somehow required to report broadband speeds by address, or even by a small polygon. They only know the speed at a given address after going to the roof of a home and measuring it.

The more fundamental issue here is that we want to use the maps for two different policy purposes. One goal is to be able to count the number of households that have broadband available. The improved mapping ideas will improve this counting function – within all of the limitations of the technologies I described above.

But mapping is a dreadful tool when we use it to start drawing lines on a map defining which households can get grant money to improve their broadband. At that point the mapping is no longer a theoretical exercise and a poorly drawn line will block homes from getting better broadband. None of the mapping ideas will really fix this problem and we need to stop using maps when awarding grants. It’s so much easier to decide that faster technology is better than slower technology. For example, grant money ought to be available for anybody that wants to replace DSL on copper with fiber. I don’t need a map to know that is a good idea. The grant process can use other ways to prioritize areas with low customer density without relying on crappy broadband maps.

We need to use maps only for what they are good for – to get an idea of what is available in a given area. Mapping is never going to be accurate enough to use to decide which customers can or cannot get better broadband.

Regulatory Sleight of Hand

I was looking through a list of ideas for blogs and noticed that I had never written about the FCC’s odd decision to reclassify commercial mobile broadband as private mobile broadband service in WC Docket No. 17-108 – The Restoring Internet Freedom order that was used to kill net neutrality and to eliminate Title II regulation of broadband. There was so much industry stir about those larger topics that the reclassification of the regulatory nature of mobile broadband went largely unnoticed at the time by the press.

The reclassification was extraordinary in the history of FCC regulation because it drastically changed the definition of one of the major industries regulated by the agency. In 1993 Congress enacted amendments to Section 332 of the Communications Act to clarify the regulation of the rapidly burgeoning cellular industry.

At that time there were about 16 million cellular subscribers that used the public switched telephone network (PSTN) and another two million private cell phones that used private networks, primarily for corporate dispatch. Congress made a distinction between the public and private use of cellular technology and coined the term CMRS (Commercial Mobile Radio Service) to define the public service we still use today for making telephone calls on cell phones. That congressional act defined CMRS service as having three characteristics: a) the service is for profit, b) it’s available to the entire public, and c) it is interconnected to the PSTN. Private mobile service was defined as any cellular service that fails to meet any one of the three tests.

The current FCC took the extraordinary step of declaring that cellular broadband is private cellular service. The FCC reached this conclusion using what I would call a regulatory sleight-of-hand. Mobile broadband is obviously still for profit and also available to the public, and so the FCC tackled the third test and said that mobile broadband is part of the Internet and not part of the public telephone network. It’s an odd distinction because the path of a telephone call and a data connection from a cellphone is usually identical. A cellphone first delivers the traffic for both services to a nearby cellular tower (or more recently to pole-mounted small cell sites). The traffic for both services is transported from the cell tower using ethernet transport that the industry calls trunking. At some point in the network, likely a switching hub, the voice and data traffic are split and the voice calls continue inside the PSTN while data traffic is peeled off to the Internet. There is no doubt that the user end of every cellular call or cellular data connection uses the network components that are part of the PSTN.

Why did the FCC go through these mental gymnastics? This FCC had two primary goals in this particular order. First, they wanted to kill the net neutrality rules established by the prior FCC in 2015. Second, they wanted to do this in such a way as to make it extremely difficult for a future FCC to reverse the decision. They ended up with a strategy of declaring that broadband is not a Title II service. Title II refers to the set of rules established by the Communications Act of 1934 that was intended as the framework for regulating common carriers. Until the 2017 FCC order, most of the services we think of as telecommunications – landline telephone, cellular telephone, and broadband – were all considered common carrier services. The current FCC strategy was to reclassify landline and mobile broadband as a Title I information service and essentially wash their hands of regulating broadband at all.

Since net neutrality rules applied to both landline and mobile data services, the FCC needed to first decree that mobile data was not a public and commercial service before they could remove it from Title II regulation.

The FCC’s actions defy logic and it’s clear that mobile data still meets the definition of a CMRS service. It was an interesting tactic by the FCC and probably the only way they could have removed mobile broadband from Title II regulation. However, they also set themselves up for some interesting possibilities from the court review of the FCC order. For example, a court might rule that mobile broadband is a CMRS service and drag it back under Title II regulation while at the same time upholding the FCC’s reclassification of landline broadband.

Why does this matter? Regulatory definitions matter because the regulatory process relies on an accumulated body of FCC orders and court cases that define the actual nature of regulating a given service. Congress generally defines regulation at a high level, and later FCC decisions and court cases flesh out the issues that are disputed. When something gets reclassified in this extreme a manner, most of the relevant case law and precedent goes out the window. That means we start over with a clean slate, and much that was adjudicated in the past will likely have to be adjudicated again, now based upon the new classification. I can’t think of any other time in our industry when regulators decided to arbitrarily redefine the basic nature of a major industry product. We are on new regulatory ground, and that means uncertainty, which is never good for the industry.

Streamlining Regulations

Jonathan Spalter of USTelecom wrote a recent blog calling on Congress to update regulations for the telecom industry. USTelecom is a lobbying arm representing the largest telcos, though surprisingly it still has a few small telco members. I found the tone of the blog interesting: somebody who didn’t know our industry would read it and think that the big telcos are suffering under crushing regulation.

Nothing could be further from the truth. We currently have an FCC that seems to be completely in the pocket of the big ISPs. The current FCC walked in the door with the immediate goal of killing net neutrality, and in the process decided to completely deregulate the broadband industry. The American public hasn’t really grasped yet that ISPs are now free to raise broadband prices endlessly and to engage in network practices that benefit the carriers instead of customers. Deregulation of broadband has to be the biggest regulatory giveaway in the history of the country.

Spalter goes on to praise the FCC for its recent order on poles, which set extremely low rates for wireless pole connections and lets wireless carriers place devices anywhere in the public rights-of-way. He says that order brought “fairness” to the pole attachment process, when in fact the order was massively unbalanced in favor of the cellular companies and squashes any local input or authority over rights-of-way, something that has always been a local prerogative. It’s ironic to see USTelecom praising fairness in pole attachments when its members have been vehemently trying to stop Google Fiber and others from gaining access to utility poles.

To be fair, Spalter isn’t completely wrong, and there are regulations that are out of date. Our last major telecom legislation was passed in 1996, at a time when dial-up Internet access was spreading across the country. The FCC regulatory process relies on rules set by Congress, and since Congress hasn’t acted since 1996, Spalter accuses it of “a reckless abdication of government responsibility.”

I find it amusing that the number one regulation USTelecom dislikes is the requirement that the big telcos make their copper wires available to other carriers. That requirement of the Telecommunications Act of 1996 was probably the most important factor in encouraging other companies to compete against the monopoly telephone companies. In the years immediately after the 1996 Act, competitors ordered millions of wholesale unbundled network elements on the telco copper networks.

There are still competitors using telco copper to provide far better broadband than the telcos are willing to provide, so we need to keep these regulations as long as copper remains hanging on poles. I would also venture a guess that the telcos make more money leasing this copper to competitors than they would make if the competitors went away – the public is walking away from telco DSL in droves.

I find it curious that the telcos keep harping on this issue. In terms of the total telco market, the sale of unbundled elements is a mere blip on the telco books. It’s the equivalent of a whale complaining about a single barnacle on its belly. But the big telcos never miss an opportunity to complain about the issue and have been working hard to eliminate the sale of unbundled copper to competitors since the passage of the 1996 Act. This is not a real issue for the telcos; they just have never gotten over the fact that they lost a regulatory battle in 1996, and they are still throwing a hissy fit over that loss.

The reality is that big telcos are less regulated than ever before. Most states have largely deregulated telephone service. The FCC completely obliterated broadband regulation. While there are still cable TV regulations, the big telcos like AT&T are bypassing those regulations by moving video online. The big telcos have already won the regulatory war.

There are always threats of new regulation, but the big telcos lobby against new rules far in advance in order to weaken them. For example, they are currently supporting a watered-down set of privacy rules that won’t afford much protection for customer data. They have also voiced support for a watered-down set of net neutrality rules that wouldn’t obligate them to change their network practices.

It’s unseemly to see USTelecom railing against regulation after the telcos have already been so successful in shedding most regulation. I guess they want to strike while the iron is hot and hope to goad Congress and the FCC into finishing the job by killing all remaining regulation. The USTelecom blog is the same song and dance the group has been performing since I’ve been in the industry, which boils down to “regulation is bad.” I didn’t buy this story forty years ago and I don’t buy it today.

The Cost of Siting Small Cells

One of the more unusual things ordered by the current FCC was a low cap on the local fees a city can charge to review an application for placing a small cell site. The FCC capped the application fee at $500 for a request covering up to five small cell sites, plus $100 per site after that. The FCC also set a cap of $270 on the annual fee to use the rights-of-way for each small cell site.

Cities have the option to charge more by billing a ‘reasonable approximation’ of actual costs, but a city can expect a legal fight from the wireless carriers over fees that are much higher than the FCC caps.
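For readers who want to see how the caps compound across a larger request, here is a minimal sketch of the fee math using the FCC’s numbers. The function names are my own and this is purely an illustration of the arithmetic, not legal guidance:

```python
def application_fee_cap(num_sites: int) -> int:
    """FCC cap on the one-time application fee for small cell sites:
    $500 covers a request of up to five sites, then $100 for each
    additional site beyond five."""
    if num_sites <= 0:
        return 0
    if num_sites <= 5:
        return 500
    return 500 + 100 * (num_sites - 5)

def annual_row_fee_cap(num_sites: int) -> int:
    """FCC cap on the recurring rights-of-way fee: $270 per site per year."""
    return 270 * max(num_sites, 0)

# A hypothetical 20-site request:
print(application_fee_cap(20))  # one-time application fee cap: 2000
print(annual_row_fee_cap(20))   # annual rights-of-way fee cap: 5400
```

So a carrier peppering a neighborhood with twenty small cells would owe a city at most $2,000 up front and $5,400 per year under the caps, which gives a sense of how modest these numbers are relative to the review and inspection work described below.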

It’s worth looking back at the history of the issue. Wireless carriers complained to the FCC that they were being charged exorbitant fees to put equipment on utility poles in the public rights-of-way, citing examples of having to pay north of $10,000 per small cell site. In most cases the fees have been far smaller than that, but citing the worst examples gave the FCC cover for capping fees.

However, some of the examples of high fees cited by the carriers were for installations that would not be considered a small cell. I’ve seen application requests for hanging devices the size of a refrigerator on poles and for placing a large cabinet on the sidewalk beneath a pole. The FCC acknowledged this in the order and set a size limit on what constitutes a small cell: a device occupying less than 28 cubic feet.

It’s worth noting that much of the FCC’s small cell order is under appeal. The most controversial issues being challenged are the aspects of the order that stripped cities of the ability to set local rules about what can and cannot be hung on poles. The FCC basically said that cellular carriers are free to do what they want anywhere in the public rights-of-way, and cities are arguing that the order violates the long-standing precedent that rights-of-way issues should be decided locally.

Communities all over the country are upset with the idea that they must allow a small cell site any place the carriers want to put one. There are also active citizens’ groups protesting the deployment of millimeter wave cell sites due to public health concerns. A number of prominent radio scientists from around the world have warned of the potential public health consequences of prolonged exposure to millimeter wave spectrum – similar to the spectrum used in airport scanners, but broadcast continuously from poles in front of homes. There is also concern that carriers that hang millimeter wave transmitters will want aggressive tree trimming to maintain lines-of-sight to homes. Finally, there are concerns about a wild proliferation of devices if multiple wireless providers install equipment on the same street.

The cap on local fees has already taken effect, and cities are now obligated to charge the low rates unless they undertake the effort (and the likely legal fight) of setting higher fees. The setting of low fees is the most puzzling aspect of the FCC order. It seems the FCC has accepted the wireless carriers’ claim that high fees would kill the deployment of 5G small cell sites everywhere.

I live in a city that is probably pretty typical, with an application process and inspectors for a huge range of activities: building inspections, restaurant inspections, electrical and gas installation inspections, and inspections of anything that disturbs a city street surface or hangs in the public rights-of-way. The city takes a strong position in ensuring that the public rights-of-way are maintained in a way that provides the best long-term opportunity for their many uses. They don’t let any utility or other entity take steps that make it harder for the next user to gain needed access.

The $100 fee is meant to compensate the city for processing the application, surveying the site of the requested access, and then inspecting that the wireless carrier really did what it promised and didn’t create unsafe conditions or physical hindrances in the right-of-way. It’s hard to imagine that $100 will compensate any city for the effort required.

It will be interesting to see how many cities acquiesce to the low FCC rates instead of fighting to implement fair rates. Cities know that fights with carriers can be costly, and they may not be willing to tackle the issue. But they also need to realize that the wireless carriers could pepper their rights-of-way with devices that are likely to hang in place for decades, and if cities don’t tackle the issue up front they will have no latitude later to rectify small cell sites that were hung incorrectly or unsafely. I’ve attended hundreds of city council meetings and have always been amazed at the huge number of issues local politicians have to deal with. This is just one more item added to that long list, and it will be understandable if many cities simply acquiesce to the low fees.