The Impending Cellular Data Crisis

There is one industry statistic that isn’t getting a lot of press – the fact that cellular data usage is more than doubling every two years. You don’t have to plot that growth rate very many years into the future to realize that existing cellular networks will be inadequate to handle the increased demand in just a few years. What’s even worse for the cellular industry is that this growth rate is a nationwide average. I have many clients who tell me there isn’t nearly that much growth at rural cellular towers – meaning there is likely even faster growth at some urban and suburban towers.
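
The arithmetic behind that worry is simple compounding. Here’s a quick back-of-the-envelope sketch in Python, assuming the doubling-every-two-years premise above (the doubling period is this post’s premise, not carrier data):

```python
import math

DOUBLING_PERIOD_YEARS = 2  # premise: cellular data demand doubles every two years

def demand_multiple(years):
    """How many times today's demand we'd see this many years out."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

def years_to_exhaust(capacity_multiple):
    """How long a capacity fix of a given size lasts before demand catches up."""
    return DOUBLING_PERIOD_YEARS * math.log2(capacity_multiple)

for years in (2, 4, 6, 10):
    print(f"{years} years out: {demand_multiple(years):.0f}x today's demand")

print(f"A 10x capacity upgrade lasts about {years_to_exhaust(10):.1f} years")
```

At this growth rate even a tenfold capacity upgrade buys less than seven years of breathing room – a point worth keeping in mind for everything that follows.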

Much of this growth is a self-inflicted wound by the cellular industry. They’ve raised monthly data allowances and are often bundling in free video with cellular service, thus driving up usage. The public is responding to these changes by using the extra bandwidth made available to them.

There are a few obvious choke points that will be exposed with this kind of growth. Current cellphone technology limits the number of simultaneous connections that can be made from any given tower. As customers watch more video they eat up slots on the cell tower that otherwise could have been used to process numerous short calls and text messages. The other big chokepoint is going to be the broadband backhaul feeding each cell site. When usage grows this fast it’s going to get increasingly expensive to buy leased backbone bandwidth – which explains why Verizon and AT&T are furiously building fiber to cell sites to avoid huge increases in backhaul costs.

5G will fix some, but not all, of these issues. The growth is so explosive that cellular companies need to use every technique possible to make cell towers more efficient. Probably the best fix is to use more spectrum. Adding an additional band of spectrum to a cell site immediately adds capacity. However, this can’t happen overnight. New spectrum is only useful if customers can use it, and it takes a number of years to modify cell sites and cellphones to work on a new band. The need to meet growing demand is the primary reason that the CTIA recently told the FCC that the industry needs an eye-popping 400 MHz of new mid-range spectrum for cellular use. The industry painted that as being needed for 5G, but it’s needed now for 4G LTE.

Another fix for cell sites is to use existing frequency more efficiently. The most promising way to do this is with MIMO antenna arrays – a technology that uses multiple antennas at the cell site and in cellphones to carry several parallel data streams over the same spectrum, creating a larger data pipe. MIMO technology can make it easier to respond to a request from a large bandwidth user – but it doesn’t relieve the overall pressure on a cell tower. If anything, it might do the exact opposite and let cell towers prioritize those who want to watch video over smaller users, who might then be blocked from making voice calls or sending text messages. MIMO is also not an immediate fix and needs to work through the cycle of getting the technology into cellphones.
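
For those who want to see why more antennas mean a bigger pipe without new spectrum, here’s an idealized sketch using the textbook Shannon capacity formula, where an NxN antenna array can carry up to N parallel spatial streams over the same channel. Real links fall well short of this ceiling because of channel correlation and overhead – the numbers are illustrative only:

```python
import math

def peak_capacity_mbps(bandwidth_mhz, snr_db, n_antennas):
    """Idealized MIMO peak: N parallel streams, each at the Shannon limit."""
    snr_linear = 10 ** (snr_db / 10)
    single_stream_mbps = bandwidth_mhz * math.log2(1 + snr_linear)
    return n_antennas * single_stream_mbps

# Same 20 MHz channel and 20 dB SNR throughout - only the antenna count changes.
for n in (1, 2, 4):
    print(f"{n}x{n} MIMO: {peak_capacity_mbps(20, 20, n):.0f} Mbps")
```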

The last strategy is what the industry calls densification, which is adding more cell sites. This is the driving force behind placing small cell sites on poles in areas with big cellular demand. However, densification might create as many problems as it solves. Most of the current frequencies used for cellular service travel a decent distance, and placing cell sites too close together creates a lot of interference and noise between neighboring towers. While adding new cell sites adds local capacity, it also decreases the efficiency of all nearby cell sites using traditional spectrum – the overall improvement from densification is going to be a lot less than might be expected. The worst thing about this is that interference is hard to predict and is very much a local issue. This is the primary reason that the cellular companies are interested in millimeter wave spectrum for cellular – the spectrum travels a short distance and won’t interfere as much between cell sites placed closely together.
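
A deliberately crude model shows the effect. Put a user a fixed 100 meters from their serving site, surround the site with six co-channel neighbors at the inter-site distance, and watch the signal-to-interference ratio collapse as that distance shrinks (this ignores sectorization, shadowing, and power control, and all the numbers are illustrative assumptions):

```python
import math

PATH_LOSS_EXPONENT = 3.5   # a common urban propagation assumption
USER_DISTANCE_M = 100      # the user stays put; only the neighbors move closer

def edge_sir_db(inter_site_distance_m):
    """Signal-to-interference ratio with six equidistant co-channel neighbors."""
    signal = USER_DISTANCE_M ** -PATH_LOSS_EXPONENT
    interference = 6 * inter_site_distance_m ** -PATH_LOSS_EXPONENT
    return 10 * math.log10(signal / interference)

for d in (2000, 1000, 500, 300):
    print(f"inter-site distance {d:>4} m: SIR = {edge_sir_db(d):5.1f} dB")
```

In this toy model each halving of the inter-site distance costs roughly 10 dB of signal quality, which is the intuition behind moving densification to millimeter wave spectrum that dies out before it reaches the neighboring site.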

5G will fix some of these issues. The ability of 5G to slice up bandwidth dynamically (what the standards call network slicing) means that a cell site can provide just enough bandwidth for every user – a tiny slice of spectrum for a text message or IoT signal and a big pipe for a video stream. 5G will vastly expand the number of simultaneous users that can share a single cell site.
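
A cartoon of that idea: instead of handing every connection a fixed-size channel, size each allocation to the request. The resource-block counts and traffic mix below are made up purely for illustration:

```python
TOTAL_BLOCKS = 100  # the cell site's resource budget (illustrative)

# Hypothetical per-connection needs, in resource blocks.
demand_mix = [("IoT signal", 1), ("text message", 1),
              ("voice call", 2), ("video stream", 20)]

def users_served(fixed_size=None):
    """Cycle through the traffic mix until the cell's blocks run out."""
    used = served = i = 0
    while True:
        need = fixed_size if fixed_size else demand_mix[i % len(demand_mix)][1]
        if used + need > TOTAL_BLOCKS:
            return served
        used += need
        served += 1
        i += 1

print("fixed 10-block channels:", users_served(fixed_size=10))  # -> 10 users
print("right-sized allocations:", users_served())               # -> 19 users
```

The gain comes entirely from not wasting a big channel on a tiny message – the sense in which 5G expands the number of simultaneous users a site can carry.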

However, 5G does nothing to reduce the total amount of backhaul bandwidth needed to feed a cell site. That means a 5G cell site will get just as overwhelmed as a 4G site if people demand more bandwidth than the site has to offer.

The cellular industry has a lot of problems to solve over a relatively short period of time. I expect that in the middle of the much-touted 5G roll-out we are going to start seeing some spectacular failures in the cellular networks at peak times. I feel sympathy for cellular engineers because it’s nearly impossible to have a network ready to handle data usage that doubles every two years. Even if engineers figure out strategies to handle five or ten times more usage, in only a few years the usage will catch up to those fixes.

I’ve never believed that cellular broadband can be a substitute for landline broadband. Every time somebody at the FCC or a politician declares that the future is wireless I’ve always rolled my eyes, because anybody that understands networks and the physics of spectrum can easily demonstrate that there are major limitations on the total bandwidth capacity at a given cell site, along with a limit on how densely cell sites can be packed in an area. The cellular networks are only carrying 5% of the total broadband in the country and it’s ludicrous to think that they could be expanded to carry most of it.

New European Copyright Laws

I’ve always kept an eye on European Union regulations because anything that affects big web companies or ISPs in Europe always ends up bleeding over into the US. Recently the EU has been contemplating new rules about online copyrights, and in September the European Parliament took the first step by approving two new sets of copyright rules.

Article 11 is being referred to as a link tax. This legislation would require that anybody that carries headlines or snippets of longer articles online must pay a fee to the creator of the original content. Proponents of Article 11 argue that big companies like Google, Facebook and Twitter are taking financial advantage of content publishers by listing headlines of news articles with no compensation for the content creators. They argue that these snippets are one of the primary reasons that people use social media, browsing articles suggested by their friends. Opponents of the new law argue that it will be extremely complicated for a web service to track the millions of headlines listed by users and that the services will react to this rule by only allowing headline snippets from large publishers. This would effectively shut small or new content creators out of the big platforms – articles would come from only a handful of content sources rather than from tens of thousands of them.

Such a law would certainly squash small content originators like this blog. Many readers find my daily blog articles via short headlines posted on Twitter and LinkedIn every time I release a blog or when one of my readers reposts one. It’s extremely unlikely that the big web platforms would create a relationship with somebody as small as me, and I’d lose my primary way to distribute content on the web. I guess, perhaps, that the WordPress platform where I publish could make arrangements with the big web services – otherwise its value as a publishing platform would be greatly diminished.

This would also affect me as a user. I mostly follow other people in the telecom and rural broadband space by browsing my feeds on Twitter and LinkedIn to see what those folks are finding to be of interest. I skip over the majority of headlines and snippets, but I stop and read news articles I find of interest. The beauty of these platforms is that I automatically select the type of content I get to browse by deciding who I want to follow. If the people I follow on Twitter can’t post small and obscure articles, then I would have no further interest in being on Twitter.

The second law, Article 13, is being referred to as the upload filter law. Article 13 would make a web platform liable for any copyright infringements for content posted by users. This restriction would theoretically not apply to content posted by users as long as they are acting non-commercially.

No one is entirely sure how the big web platforms would react to this law. At one extreme a platform like Facebook or Reddit might block all postings of content, such as video or pictures, for which the user can’t show ownership. This would mean the end of memes and kitten videos and much of the content posted by most Facebook users.

At the other extreme, this might mean that the average person could post such links since they have no commercial benefit from posting a cute cat video. But the law could stop commercial users from posting content that is not their own – a movie reviewer might not be able to include pictures or snippets from a film in a review. I might not be able to post a link to a Washington Post article as CCG Consulting but perhaps I could post it as an individual. While I don’t make a penny from this blog, I might be stopped by web platforms from including links to news articles in my blog.

In January the approval process was halted when 11 countries including Germany, Italy, and the Netherlands said they wouldn’t support the final language in these articles. EU law has an interesting difference from US law in that for many EU directives each country gets to decide, within reason, how it will implement the law.

The genesis of these laws comes from the observation that the big web companies are making huge money from the content created by others and not fairly compensating content creators. We are seeing a huge crisis for content creators – they used to be compensated through web advertising ‘hits’, but these revenues are disappearing quickly. The EU is trying to rebalance the financial equation and make sure that content creators are fairly compensated – which is the entire purpose of copyright laws.

The legislators are finding out how hard it will be to make this work in the online world. Web platforms will always try to work around laws to minimize payments, and their lawyers are going to be cautious, advising the platforms to minimize their exposure to massive class action suits.

But there has to be a balance. Content creators deserve to be paid for creating content. Platforms like Facebook, Twitter, Reddit, Instagram, Tumblr, etc. are popular to a large degree because users of the platforms upload content that they didn’t create – the value of the platform is that users get to share things of interest with their friends.

We haven’t heard the end of these efforts and the parties are still looking for language that the various EU members can accept. If these laws eventually pass they will raise the same questions here because the policies adopted by the big web platforms will probably change to match the European laws.

The Slow Deployment of 5G

Somebody asked me a few days ago why I write so much about 5G. My response is that I am intrigued by the 5G hype. The major players in the industry have been devoting big dollars to promote a technology that is still mostly vaporware. The most interesting thing about 5G is how politicians, regulators and the public have bought into the hype. I’ve never seen anything like it. I can remember other times when the world was abuzz over a new technology, but this was usually a reaction to an actual technology you could buy like the first laptop computers, the first iPhone and the first iPod.

Anybody that understands our industry knows that it takes a number of years to roll out any major new technology, particularly a wireless technology, since wireless behaves differently in the field compared to the lab. We’re only a year past the release of the 5G standards, and it’s unrealistic to think those standards could be translated into operational hardware and software systems in such a short time. You only have to look back at the history of 4G, which started as slowly as 5G and which finally saw the first fully-compliant 4G cell site late last year. It’s going to take just as long until we see a fully functional 5G cell site. What we will see, over time, is the incremental introduction of some of the aspects of 5G as they get translated from lab to the field. That rollout is further complicated for cellular use by the timeline needed to get 5G-ready handsets into peoples’ hands.

This blog was prompted by a Verizon announcement that 5G mobile services will be coming to 30 cities later this year. Of course, the announcement was short on details, because those details would probably be embarrassing for Verizon. I would expect that the company will introduce a few aspects of 5G into cell sites in the business districts of major cities and claim that as a 5G roll-out.

What does a roll-out this year mean for cellular customers? There are not yet any 5G-capable cellphones. Both AT&T and Verizon have been working with Samsung to introduce a 5G version of its S10 phone later this year. Verizon has also been reported to be working with Lenovo on a 5G modular upgrade later this year. I’m guessing these phones are going to come with a premium price tag for the early adopters willing to pay for 5G bragging rights. These phones will only work as 5G from the handful of cell sites with 5G gear – and that will only be for a tiny subset of the 5G specifications. I remember when one of my friends bought one of the first 4G phones and crowed about how well it worked in downtown DC. At the time I told him his great performance was because he was probably the only guy using 4G – and sure enough, his performance dropped as others joined the new technology.

On the same day that I saw this Verizon announcement I also saw a prediction by Cisco that only 3% of cellular connections will occur over a 5G network by the end of 2022. This might be the best thing I’ve seen that pops the 5G hype. Even for folks buying the early 5G phones, there will be a dearth of cell sites around the country that will work with 5G for a number of years. Anybody who understands the lifecycle of cellular upgrades agrees with the Cisco timeline. It takes years to work through the cycle of upgrading cell sites, upgrading handsets and then getting those handsets to the public.

The same is true for the other technologies that are also being called 5G. Verizon made a huge splash just a few months ago about introducing 5G broadband using millimeter wave spectrum in four cities. Even at the time of that announcement, it was clear that those radios were not using the 5G standard, and Verizon quietly announced recently that they were ceasing those deployments while they wait for actual 5G technology. Those deployments were actually a beta test of millimeter wave radios, not the start of a rapid nationwide deployment of 5G broadband from poles.

AT&T had an even more ludicrous announcement at the end of 2018 where they announced 5G broadband that involved deployment of WiFi hotspots that were supposedly fed by 5G. However, this was a true phantom product for which they had no pricing and that nobody could order. And since no AT&T cell sites have been upgraded to 5G, one had to wonder how this involved any 5G technology. It’s clear this was technology roll-out by press release only so that they could have the bragging rights of saying they were the first ones to have 5G.

The final announcement I saw on that same day was one by T-Mobile saying they would begin deploying early 5G in cell sites in 2020. But the real news is that they aren’t planning on charging any more for any extra 5G speeds or features.

I come back to my original question about why I write about 5G so often. A lot of my clients ask me if they should be worried about 5G and I don’t have an answer for them. I can see that actual 5G technology is going to take a lot longer to come to market than the big carriers would have you believe. But I look at T-Mobile’s announcement on price and I also have to wonder what the cellular companies will really do once 5G works. Will AT&T and Verizon both spend billions to put 5G small cells in residential neighborhoods if it doesn’t drive any new cellular revenues? I have to admit that I’m skeptical – we’re going to have to wait to see what the carriers do rather than listen to what they say.

Making a Safe Web

Tim Berners-Lee invented the World Wide Web, proposing it in 1989 and implementing the first successful communication between an HTTP client and server the following year. He’s always been a proponent of an open Internet and doesn’t like how the web has changed. The biggest profits on the web today come from the sale of customer data.

Berners-Lee has launched a new company along with cybersecurity expert John Bruce that aims to “restore rightful ownership of the data back to every web user”. The new start-up, called Inrupt, proposes to develop an alternate web for users who want to protect their data and their identity.

Berners-Lee has been working at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT to develop a software platform that can support his new concept. The platform is called Solid, and its main goal is decoupling web applications from the data they produce.

Today our personal data is stored all over the web. Our ISPs make copies of a lot of our data. Platforms like Google, Facebook, Amazon, and Twitter gather and store data on us. Each of these companies captures a little piece of the picture of who we each are. These companies use our data for their own purposes and then sell it to companies that buy, sort and compile that data to make profiles on all of us. I saw a disturbing statistic recently that up to 1,400 data points are now created for the typical internet user every day – data gathered from our cellphones, smart devices, and our online web activity.

The Solid platform would change the fundamental structure of data storage. Each person on the Solid platform would create a cache of their own personal data. That data could be stored on personal servers or on servers supplied by companies that are part of the Solid cloud. The data would be encrypted and protected against prying.

Then, companies like Berners-Lee’s Inrupt would develop apps that perform functions users want without storing any customer data. Take the example of shopping for new health insurance. An insurance company that agrees to be part of the Solid platform would develop an app that works on your server, analyzing your medical records and other relevant personal information to decide if you are a good candidate for a policy. It might report information back to the insurance company such as some sort of rating of you as a potential customer, but the insurance company would never see the personal data.
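
Here’s a hypothetical sketch of that decoupling in Python. Everything in it is invented for illustration – Solid’s actual interfaces look nothing like this – but it shows the architectural point: the app runs where the data lives and only a verdict travels back:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalDataPod:
    """Stands in for the user's own encrypted data store (hypothetical)."""
    medical_records: dict = field(default_factory=dict)

def insurance_rating_app(pod):
    """Runs on the user's server; only the coarse rating leaves the pod."""
    score = 0
    if pod.medical_records.get("smoker"):
        score += 2
    if pod.medical_records.get("age", 0) > 60:
        score += 1
    # The insurer receives this single string - never the records themselves.
    return "preferred" if score == 0 else "standard" if score <= 2 else "review"

pod = PersonalDataPod(medical_records={"smoker": False, "age": 45})
print(insurance_rating_app(pod))  # -> preferred
```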

The Solid concept is counting on the proposition that there are a lot of people who don’t want to share their personal data on the open web. Berners-Lee is banking that there are plenty of developers who would design applications for those in the Solid community. Over time the Solid-based apps can provide an alternate web for the privacy-minded, separate and apart from the data-collection web we share today.

Berners-Lee expects that this will first take a foothold in professions that value privacy – coders, lawyers, CPAs, investment advisors, etc. Those professions have a strong desire to keep their clients’ data private, and there is no better way to do that than having the clients keep their own data. This relieves lawyers, CPAs and other professionals of the ever-growing liability from data breaches of client data.

Over time Berners-Lee hopes that all sorts of other platforms will want to cater to a growing base of privacy-minded users. He’s hoping for a web ecosystem of search engines, news feeds, social media platforms, and shopping sites that want to sell software and services to Solid users, but with the promise of not gathering personal data. One would think current existing privacy-minded platforms like Mozilla Firefox would join this community. I would love to see a Solid-based cellphone operating system. I’d love to use an ISP that is part of this effort.

It’s an interesting concept and one I’ll be watching. I am personally uneasy about the data being gathered on each of us. I don’t like the idea of applying for health insurance, a credit card or a home mortgage and being judged in secret by data that is purchased about me on the web. None of us has any idea of the validity and correctness of such data. And I doubt that anybody wants to be judged by somebody like a mortgage lender using non-financial data like our politics, our web searches, or the places we visit in person as reported by our cellphones. We now live in a surveillance world and Berners-Lee is giving us the hope of escaping that world.

Streamlining Regulations

Jonathan Spalter of USTelecom wrote a recent blog calling on Congress to update regulations for the telecom industry. USTelecom is a lobbying arm representing the largest telcos, but which also still surprisingly has a few small telco members. I found the tone of the blog interesting, in that somebody who didn’t know our industry would read the blog and think that the big telcos are suffering under crushing regulation.

Nothing could be further from the truth. We currently have an FCC that seems to be completely in the pocket of the big ISPs. The current FCC walked in the door with the immediate goal to kill net neutrality, and in the process decided to completely deregulate the broadband industry. The American public hasn’t really grasped yet that ISPs are now unfettered to endlessly raise broadband prices and to engage in network practices that benefit the carriers instead of customers. Deregulation of broadband has to be the biggest regulatory giveaway in the history of the country.

Spalter goes on to praise the FCC for its recent order on poles that set extremely low rates for wireless pole connections and lets wireless carriers place devices anywhere in the public rights-of-way. He says that order brought “fairness” to the pole attachment process when in fact the order was massively unbalanced in favor of cellular companies and squashes any local input or authority over rights-of-way – something that has always been a local prerogative. It’s ironic to see USTelecom praising fairness for pole attachments when their members have been vehemently trying to stop Google Fiber and others from gaining access to utility poles.

To be fair, Spalter isn’t completely wrong, and there are regulations that are out of date. Our last major telecom legislation was in 1996, at a time when dial-up Internet access was spreading across the country. The FCC regulatory process relies on rules set by Congress, and since Congress hasn’t acted since 1996, Spalter accuses it of “a reckless abdication of government responsibility”.

I find it amusing that the number one regulation USTelecom most dislikes is the requirement that the big telcos make their copper wires available to other carriers. That requirement of the Telecommunications Act of 1996 was probably the most important factor in encouraging other companies to compete against the monopoly telephone companies. In the years immediately after the 1996 Act, competitors ordered millions of wholesale unbundled network elements on the telco copper networks.

There are still competitors that use the telco copper to provide far better broadband than the telcos are willing to offer, so we need to keep these regulations as long as copper remains hanging on poles. I would also venture a guess that the telcos make more money leasing this copper to the competitors than they would make if the competitors went away – the public is walking away from telco DSL in droves.

I find it curious that the telcos keep harping on this issue. In terms of the total telco market, the sale of unbundled elements is a mere blip on the telco books. This is the equivalent of a whale complaining about a single barnacle on its belly. But the big telcos never miss an opportunity to harp on the issue and have been working hard to eliminate the sale of copper access to competitors since the passage of the 1996 Act. This is not a real issue for the telcos – they just have never gotten over the fact that they lost a regulatory battle in 1996 and they are still throwing a hissy fit over that loss.

The reality is that big telcos are less regulated than ever before. Most states have largely deregulated telephone service. The FCC completely obliterated broadband regulation. While there are still cable TV regulations, the big telcos like AT&T are bypassing those regulations by moving video online. The big telcos have already won the regulatory war.

There are always threats of new regulation – but the big telcos always lobby against new rules far in advance to weaken any new regulations. For example, they are currently supporting a watered-down set of privacy rules that won’t afford much protection of customer data. They have voiced support for a watered-down set of net neutrality rules that doesn’t obligate them to change their network practices.

It’s unseemly to see USTelecom railing against regulation after the telcos have already been so successful in shedding most regulations. I guess they want to strike while the iron is hot and are hoping to goad Congress and the FCC into finishing the job by killing all remaining regulation. The USTelecom blog is a repeat of the same song and dance they’ve been performing since I’ve been in the industry – which boils down to, “regulation is bad”. I didn’t buy this story forty years ago and I still don’t buy it today.

The American Broadband Initiative

On February 13 the Secretary of Commerce Wilbur Ross led a group of more than 20 federal agencies in announcing what the administration is calling the American Broadband Initiative (ABI). The announcement was accompanied by a Milestones Report that lists numerous specific federal initiatives and associated timelines. The stated purpose of the ABI is to promote broadband deployment by streamlining the federal permitting process and leveraging federal assets to lower the cost of deploying broadband.

Big announcements of this sort are usually mostly for public relations purposes rather than anything useful, and this one is no exception. The main purpose of the ABI seems to be to show rural America that the federal government cares about the lack of rural broadband. Unfortunately, this kind of PR effort works, as evidenced by a conversation I had with a rural politician soon after the ABI announcement who hoped this would mean real movement towards broadband deployment in his region. I felt bad telling him that I see nothing new or of consequence in the ABI announcement, and nothing that I thought would improve broadband in his area.

This is not to say that there was nothing of importance in the ABI. However, the most important initiatives included in the ABI are repeats of previous announcements. For example, the leading bullet point in the ABI is the announcement of the $600 million e-connectivity grant/loan program – something that everybody in the industry has known about since last fall. There were a few other repeats of past announcements, such as the intention to ease the permitting process on federal land.

A lot of the announcements have to do with the permitting for broadband facilities and access to public land, including:

  • The U.S. Department of the Interior will make its 7,000+ towers available to carriers and will publish a map. Any tall towers on this list are already included in the FCC tower database.
  • The NTIA is creating a web site that will centralize the information needed to get permits to place telecom assets on public land.
  • The GSA is undertaking an effort to document flow charts of the process required to get a permit for the use of federal land or federal towers.
  • The GSA will also tackle simplifying the permitting application forms.
  • The GSA is soliciting comments from the public to identify areas with poor cellular coverage, with the hope that the GSA can then identify public assets that might help alleviate lack of cellular coverage.

There are a few other announcements that could be beneficial such as streamlining the environmental and historic preservations reviews on public properties. Those requirements are a definite roadblock to using public land, but streamlining is not the same thing as eliminating, so I’d have to see what this means in practice to know if this is an actual improvement.

I have no doubt that these efforts will help a few broadband projects. However, federal lands tend to be lightly populated and I have to wonder how many broadband projects would want to use them. In the hundreds of broadband projects I’ve been involved in, I can count on one hand the times when federal rights-of-way were an issue.

There is one situation where this could be a benefit – the siting of antennas on top of federal buildings. In many small towns the courthouse is the tallest structure and has largely been unavailable to wireless providers. But until I see this work easily in real life I’m going to remain skeptical.

The ABI report is mostly fluff. It seems obvious that all cabinet agencies were asked to provide a list of ways they could help broadband, and they all scrambled to come up with something to report. While a few of the announced initiatives might help a handful of projects, for the most part the initiatives listed in the ABI aren’t going to help anybody. If the administration really wanted to help broadband, it would create grant programs that don’t have forced ties to RUS loans that many ISPs can’t accept, or it would eliminate the inane requirement that federal grants can only be used where homes don’t have 10/1 Mbps speeds.

The Cost of Siting Small Cells

One of the more unusual things ordered by the current FCC was a low cap on the fees that a city can charge to review an application for placing a small cell site. The FCC capped the application fee at $500 for a request for up to five small cell sites plus $100 per site after that. The FCC also set a cap of $270 on the annual fee to use the rights-of-way for each small cell site.

Cities have the option to charge more and can bill a ‘reasonable approximation’ of actual costs, but a city can expect a legal fight from wireless carriers over fees that are much higher than the FCC caps.
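
Based on the caps as described above, the fee arithmetic for a batch application works out like this (a sketch of my reading of the order, not legal guidance):

```python
ANNUAL_ROW_FEE_CAP = 270  # per small cell, per year, for rights-of-way use

def application_fee_cap(num_sites):
    """One-time review fee cap: $500 covers the first five small cells in a
    request, then $100 for each additional site in the same request."""
    if num_sites <= 0:
        return 0
    return 500 + max(0, num_sites - 5) * 100

for n in (1, 5, 20):
    print(f"{n:>2} sites: ${application_fee_cap(n):,} application, "
          f"${n * ANNUAL_ROW_FEE_CAP:,}/year in rights-of-way fees")
# 1 site   -> $500 application,   $270/year
# 20 sites -> $2,000 application, $5,400/year
```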

It’s worth looking back at the history of the issue. Wireless carriers complained to the FCC that they were being charged exorbitant fees to put equipment on utility poles in the public rights-of-way. The wireless carriers cited examples of having to pay north of $10,000 per small cell site. In most cases, fees have been far smaller than that, but citing the worst examples gave cover to the FCC for capping fees.

However, some of the examples of high fees cited by the carriers were for installations that would not be considered a small cell. I’ve seen application requests for hanging devices the size of a refrigerator on poles and for placing a large cabinet on the sidewalk under a pole. The FCC acknowledged this in its order and defined a small cell as a device occupying less than 28 cubic feet.

It’s worth noting that much of the FCC’s small cell order is under appeal. The most controversial issues being challenged are the aspects of the order that stripped cities of the ability to set local rules on what can and cannot be hung on poles. The FCC basically said that cellular carriers are free to do what they want anywhere in the public rights-of-way, and cities are arguing that the order violates the long-standing precedent that rights-of-way issues should be decided locally.

Communities all over the country are upset with the idea that they have to allow a small cell site any place the carriers want to put one. There are also active citizens’ groups protesting the implementation of millimeter wave cell sites due to public health concerns. A number of prominent radio scientists from around the world have warned of the potential public health consequences of prolonged exposure to millimeter wave spectrum – similar to the spectrum used in airport scanners, but broadcast continuously from poles in front of homes. There is also a lot of concern that carriers hanging millimeter wave transmitters will want aggressive tree trimming to maintain lines-of-sight to homes. Finally, there are concerns about the wild proliferation of devices if multiple wireless providers install devices on the same street.

The cap on local fees has already taken effect, and cities are now obligated to charge the low rates unless they undertake the effort (and the likely legal fight) of setting higher fees. The setting of low fees is the most puzzling aspect of the FCC order. It seems that the FCC has accepted the wireless carriers’ claim that high fees would kill the deployment of 5G small cell sites.

I live in a city that is probably pretty typical, with an application process and inspectors for a huge range of activities – building inspections, restaurant inspections, electrical and gas installations, and anything that disturbs a city street surface or is hung in the public rights-of-way. The city takes a strong position in assuring that the public rights-of-way are maintained in a way that provides the best long-term opportunity for their many uses. It doesn’t let any utility or entity take steps that make it harder for the next user to gain the needed access.

The $100 fee is meant to compensate the city for processing the application, surveying the site of the requested access, and then inspecting that the wireless carrier really did what it promised and didn’t create unsafe conditions or physical hindrances in the right-of-way. It’s hard to think that $100 will compensate any city for the effort required. It will be interesting to see how many cities accept the low FCC rates instead of fighting to implement fair ones. Cities know that fights with carriers can be costly and they may not be willing to tackle the issue. But they also need to realize that the wireless carriers could pepper their rights-of-way with devices that are likely to hang in place for decades. If cities don’t tackle the issue up front they will have no latitude later to rectify small cell sites that were hung incorrectly or unsafely. I’ve attended hundreds of city council meetings and have always been amazed at the huge number of issues that local politicians have to deal with. This is just one more issue added to that long list, and it will be understandable if many cities acquiesce to the low fees.

San Jose Tackles the Digital Divide

As a country we have done well, with 85% of households in most areas now buying some form of broadband connection. But that still means that 15% of homes don’t have broadband. Certainly some homes don’t want broadband, but it’s clear that a significant percentage of those without it can’t afford it.

Affordability is going to become more of an issue now that the big ISPs have adopted a strategy of raising rates every year. I don’t think there’s much doubt that the cost of broadband will climb faster than the overall rate of inflation. We recently saw Charter raise the rate of bundled broadband by $5 per month. Wall Street credits the higher earnings of several big cable companies to their cutting back on special prices for term contracts – I think the cable companies are finally acknowledging that they have won the war against DSL.

San Jose is no different from any big city in that it has large numbers of homes without broadband. The city recently estimated that 95,000 of its residents don’t have a home broadband connection. The city just announced a plan to begin solving the digital divide and pledged $24 million to kick off the effort. It claims this is the biggest effort undertaken by a major city to solve the digital divide.

The digital divide became apparent soon after the introduction of DSL and cable modems in the late 1990s. Even then there were households locked out of the new technology due to the cost of buying broadband service. The digital divide gets more acute every year as more of our daily lives migrate online. It is now unimaginable for a student to have an even chance in school without access to broadband. Anybody with broadband only has to imagine for a second what it would be like to lose broadband access – and then realize that huge numbers of homes are missing out on the basic benefits that those with broadband take for granted.

The San Jose plan is light on detail at this early stage, but it’s clear that the city will be looking for infrastructure plans to extend broadband rather than subsidizing service from incumbent ISPs. Consider the mayor’s stated vision for broadband:

“Ensure all residents, businesses, and organizations can participate in and benefit from the prosperity and culture of innovation in Silicon Valley . . . Broaden access to basic digital infrastructure to all residents, especially our youth, through enabling free or low cost, high-speed, 1 gigabit or faster broadband service in several low-income communities, and increasing access to hardware, including tablets and laptops, for low-income students.”

The city won’t be tackling the issue alone and is hoping for involvement from the business and charitable organizations in the city. For example, the city is already working with the Knight Foundation, which has been addressing this issue for years. The city is interested in technologies like Facebook’s Terragraph wireless technology, which plans to use 60 GHz spectrum to create fast outdoor wireless broadband.

The city recognizes that there are no instant fixes and acknowledges that it might take a decade to bring fast, affordable broadband to everybody in the city. I’m sure the $24 million is also just a down payment towards a permanent broadband solution. But this plan puts the city ahead of every other major metropolitan area in its willingness to tackle the problem head-on.

There has been a cry to solve the digital divide for twenty years. Some communities have found solutions that help, like the charitable effort by E2D in Charlotte, NC, which is bringing laptops and wireless broadband to large numbers of homeless and low-income school students. But no city has directly tackled the problem before with a pledge of serious taxpayer funds to help find a solution. It’s been obvious from the beginning of the digital divide discussions that it was going to take money and broadband infrastructure to solve the problem. I’m sure that many other cities will be watching San Jose, because the broadband gap is becoming a significant contributor to creating an underclass with less access to education, healthcare and the chance for good-paying jobs. I’m willing to bet that the long-term economic benefits from solving the digital divide in San Jose will be far greater than the money the city is putting into the effort.

Excess Dark Fiber

A few weeks ago I wrote about a recommendation from one of the BDAC subcommittees to expand the base for the fees collected to fund the Universal Service Fund. BDAC is the acronym for the Broadband Deployment Advisory Committee, created by FCC Chairman Ajit Pai to advise on ideas to promote better broadband.

That BDAC subcommittee is the one tasked with developing Model State Codes – ideas for states to consider in legislation. The subcommittee came up with another real doozy of an idea. In its latest draft report to the FCC, in Article 4 – Rights to Access to Existing Network Support Infrastructure – the group suggests that broadband could be more affordably expanded if excess fiber built by municipalities was made available to commercial providers at cheap prices.

The BDAC subcommittee suggests that any excess municipal fiber not in a 50-year fiber plan must be made available for lease to other carriers. The group oddly proposes that this would also apply to municipal buildings, I guess to save carriers from having to build huts. I can think of a hundred reasons why forcing government buildings to house carriers is an extremely dumb idea, but let’s look closer at the fiber idea.

The BDAC suggestion clearly comes from the big ISPs who would love to get their hands onto municipal fiber for a bargain price. The way I know that the idea comes from the big ISPs is that they are suggesting that this would only be applied to municipal fiber. If the group had been looking for ways to improve broadband deployment they would have expanded this idea to include all excess dark fiber, regardless of the owner.

I always hear that one of the reasons we don’t have more fiber-to-the-home is that there is not enough fiber already in our communities. I don’t think that’s true. If I look at my city of Asheville, NC, I would bet there is already fiber within a quarter mile of everybody in the city. The city might own fiber to connect schools or other government buildings. There is probably some fiber that supports public-safety networks and traffic lights. The incumbent cable company and telco deploy fiber to get to neighborhood nodes. There is fiber built to reach large businesses. There’s fiber built to get to cellular towers. There is certainly fiber built to places like our large regional hospital complex, the universities, and various federal government office buildings. There is fiber owned by the electric company, and perhaps also by the gas and water companies. And as a regional hub at the nexus of a few major highways, there is likely long-haul fiber passing through here on the way to somewhere else, plus NCDOT fiber used for more local purposes.

I’m positive that if all of this fiber was mapped, Asheville would look like a fiber-rich city – as would many places. Even rural counties often have a surprising amount of existing fiber serving these same kinds of purposes. Yet most existing fiber was built to satisfy a single purpose and isn’t available for all of the other ways that fiber could benefit a community. Asheville might be fiber rich, but that fiber is off-limits to somebody interested in building fiber-to-the-home.

That’s the implied justification for the BDAC suggestion – that excess fiber shouldn’t sit idle if it could benefit better broadband. That’s also the basis for my suggestion of expanding this concept to all fiber, not just to government fiber. If AT&T builds a 24-fiber cable to a cell tower and will never use more than a few strands, then why shouldn’t they be required to sell the excess fiber capacity for cheap if it benefits the community?

The idea of forcing big ISPs to make fiber available is not a new one. In the Telecommunications Act of 1996, Congress required the big telcos to unbundle their excess dark fiber and make it available to anybody. However, the telcos actively resisted that order and began immediately to petition the FCC to soften the requirement, and as a consequence, very little dark fiber has ever been provided to others. I helped a few dozen companies try to get access to telco dark fiber and only a few succeeded. However, Congress was on the right track by recognizing that idle dark fiber is a valuable asset that could benefit the larger community.

I wrote a blog a few weeks back that talked about how Romania has the best broadband in Europe based upon hundreds of small ISPs that have built fiber just in their immediate neighborhood. I think that if all of the excess fiber capacity in a city was made available that it would unleash all sorts of creative entrepreneurs to do similar things. I know I would consider building a fiber network in my own neighborhood if there was a way for me to backhaul to a larger partner ISP.

However, the BDAC suggestion is not quite as altruistic as it might sound – the BDAC subcommittee is not worried that the public is missing out on the benefits from excess dark fiber. Remember that the big ISPs largely control the BDAC committees and I think this suggestion comes from AT&T and Comcast that want to punish any city with the audacity to build fiber to compete with them. This requirement would allow the big ISPs to take advantage of those competitive networks to effectively squash municipal competition.

But we shouldn’t let the vindictive nature of the suggestion erase the larger concept. I’ve rarely gotten a chance in our industry to say that, “What’s good for the goose is good for the gander” – but this is that opportunity. The BDAC has correctly identified the fact that broadband deployment would be easier everywhere if we could unleash the capacity of unused dark fiber. The BDAC subcommittee just didn’t take this idea to the natural conclusion by applying it to all existing fiber. I’m certain that if a state embraced applying this concept to all fiber that we’d see the big ISP screaming about confiscation of capital – which is exactly what it is.