States Fight Back Against CAF II

Jon Brodkin of Ars Technica wrote a recent article about the Mississippi Public Service Commission (PSC) notifying the FCC that AT&T had failed to meet its CAF II requirements in the state. AT&T had taken over $49 million per year for six years, ending this December, and was supposed to use that money to upgrade almost 144,000 residents in the state to broadband of at least 10/1 Mbps.
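
To put those numbers in perspective, some quick arithmetic shows the scale of the subsidy per home. This is a rough illustration using only the figures cited above:

```python
# Rough per-home CAF II subsidy implied by the Mississippi figures above.
annual_subsidy = 49_000_000   # dollars per year
years = 6
homes = 144_000               # homes that were supposed to be upgraded

total = annual_subsidy * years
per_home = total / homes
print(f"${total:,.0f} total, or about ${per_home:,.0f} per home")
```

That works out to roughly $2,000 per home – real money that was supposed to buy rural broadband upgrades.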

The PSC notification informs the FCC that the commission doesn’t believe the upgrades have been made or that many of those homes can now get faster broadband. AT&T has certified to the FCC that the CAF II work was completed on schedule, but the company has stonewalled the PSC on data requests aimed at finding out how many homes can actually access faster broadband.

The FCC is supposed to begin testing CAF II homes in 2021 and is supposed to fine the big telcos like AT&T if homes in the CAF II areas aren’t getting the faster speeds. However, that testing program is badly flawed because the telcos will have some say over which homes get tested, and they’ll certainly steer the testing toward places that can pass the speed test.

AT&T elected to use the CAF II funding to upgrade speeds by offering fixed cellular service to customers that formerly had slow DSL service. From what I can see, AT&T has not widely advertised the new wireless product, and it’s unlikely that the company has moved many customers onto the cellular technology in Mississippi or anywhere else. The company is refusing to tell the state how many homes are on the new product.

Unfortunately, what AT&T is doing in Mississippi is not unusual. AT&T took $2.57 billion nationwide for CAF II, and it’s likely it hasn’t made many upgrades in other states either. I’ve seen a lot of evidence that Frontier ($1.7 billion) and CenturyLink ($3.03 billion) have also failed to upgrade rural customers. Those two companies elected to mostly upgrade rural DSL to the faster speeds. We’ve recently had engineers in counties where Frontier and CenturyLink were supposed to make CAF II upgrades, and we could find no evidence of upgraded DSL anywhere in the rural parts of those counties. We’ve also helped counties solicit speed tests from citizens, and we’ve studied a number of counties where no rural DSL service tested even close to the 10/1 Mbps goal of CAF II.

To make matters even worse, the FCC recently decided to award these big telcos a seventh year of subsidy. That means AT&T will get $428 million in 2021, Frontier will get $283 million, and CenturyLink will get $506 million. The companies have no obligations attached to this additional funding and don’t have to use it to improve rural broadband.

While 10/1 Mbps broadband isn’t great, it’s a lot better than the DSL that was in these rural areas in 2015 when the CAF II payments began. The CAF II areas are remote, and most customers who could even get DSL saw download speeds under 1 or 2 Mbps.

The impact of AT&T’s failure to make the upgrades became apparent this year when millions of students were sent home during the pandemic. A student might be able to squeak out a school connection on a 10/1 Mbps broadband connection, but students cannot function on the slower DSL that is still in place due to the lack of upgrades. The actions of the FCC and the greed of the big telcos robbed millions of rural homes of better broadband.

The failure of CAF II rests entirely on the FCC. The last FCC under Chairman Wheeler awarded the funding to upgrade to 10/1 speeds, even though the definition of broadband at the time was 25/3 Mbps. The current FCC under Chairman Pai has turned a blind eye to the non-performance of the big telcos and absurdly is rewarding them with an additional year of CAF II funding. The overall CAF II program handed out over $10 billion in funding for improving rural broadband that might as well have been flushed down the drain. The FCC could have awarded this money instead to broadband grants that could have brought better broadband to the CAF II rural areas.

I hope the Mississippi PSC does more than just write a letter. I’d like to see it ask AT&T to refund the CAF II money to the state to use for broadband grants. And I’d love to see other states do the same and take back the billions in CAF II broadband funding that was wasted.

Broadband Interference

Jon Brodkin of Ars Technica published an amusing story about how the DSL service in a 400-resident village in Wales went out each morning at 7:00 am. It turns out that one of the residents turned on an ancient television that interfered with the DSL signal to the extent that the network collapsed. The ISP finally figured this out by walking around the village in the morning with a spectrum analyzer until they found the source of the interference.

It’s easy to think that the story points out another weakness of old DSL technology, but interference can be a problem for a lot of other technologies.

This same problem is common on cable company hybrid fiber-coaxial networks. The easiest way to understand this is to think back to the old days when we all watched analog TV. Anybody who watched programming on channels 2 through 5 remembers times when the channels got fuzzy or even became unwatchable. It turns out that a lot of different devices interfere with the frequencies used for those channels, including microwave ovens, certain motors like power tools and lawnmowers, and other appliances like blenders. It was a common household occurrence for one of these channels to go fuzzy when somebody in the house, or even in a neighboring home, used one of these devices.

This same interference carries forward into cable TV networks. Cable companies originally used the same frequencies for TV channels inside the coaxial wires that were used over the air, and the low TV channels sat between 5 MHz and 42 MHz. It turns out that long stretches of coaxial wire on poles act as a great antenna, so cable systems pick up the same kinds of interference that happen in homes. It was pretty routine for channels 2 and 3, in particular, to be fuzzy in an analog cable network.

You’d think that this interference might have gone away when cable companies converted TV signals to digital. The TV transmissions for channels 2 through 5 got crystal clear because cable companies relocated the digital versions of these channels to better frequencies. When broadband was added to cable systems, the cable companies continued to use the low frequencies. CableLabs elected to use these frequencies for the upload portion of broadband. There is still plenty of interference in cable networks today – probably even more than years ago, as coaxial networks have aged and have more points for interference to seep into the wires. Until the pandemic, we didn’t care much about upload bandwidth, but it turns out that one of the major reasons that cable companies struggle to deliver reliable upload speeds is that they are using the noisiest spectrum for the upload function.

The DSL in the village suffered from the same issue since the telephone copper wires also act as a big outdoor antenna. In this village, the frequency emanating from the old TV exactly matched the frequencies used for DSL.

Another common kind of interference is seen in fixed wireless networks where multiple ISPs use the same frequencies in a given rural footprint. I know of counties where there are as many as five or six different wireless ISPs, and most use the same frequencies since most WISPs rely on a handful of channels in the traditional WiFi bands at 2.4 GHz and 5 GHz. I’ve heard of situations where the WiFi spectrum is so crowded that the performance of every WISP suffers.

WiFi also suffers from local interference in the home. The WiFi standard says that all devices have an equal chance of using the frequencies. This means that a home WiFi router will cycle through all the signals from all devices trying to make a WiFi connection. When a WiFi router connects with an authorized device inside the home it allows for a burst of data, but then the router disconnects that signal and tries the next signal – cycling through all of the possible sources of WiFi.

This is the same issue that is seen by people using WiFi in a high-rise apartment building or a hotel where many users are trying to connect to WiFi at the same time. Luckily this problem ought to improve. The FCC has authorized the use of 6 GHz spectrum for home broadband which opens up numerous new channels. Interference will only occur between devices trying to share a channel, but that will be far fewer cases of interference than today.
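
The benefit of opening more channels can be illustrated with a birthday-problem style calculation. This is a toy model assuming each nearby network picks a channel uniformly at random (real routers try to avoid busy channels, so this understates the improvement):

```python
def p_no_shared_channel(networks: int, channels: int) -> float:
    """Probability that every network lands on its own channel when each
    picks one uniformly at random (a birthday-problem style toy model)."""
    if networks > channels:
        return 0.0  # pigeonhole principle: some networks must share
    p = 1.0
    for i in range(networks):
        p *= (channels - i) / channels
    return p

# Ten neighboring networks with only 3 non-overlapping 2.4 GHz channels
# are forced to share; with dozens of 6 GHz channels, sharing gets rarer.
print(p_no_shared_channel(10, 3))   # 0.0 -- sharing is unavoidable
print(round(p_no_shared_channel(10, 59), 2))
```

Even with many more channels some sharing remains likely, which matches the point above: interference shrinks dramatically but doesn’t vanish.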

The technology that has no such interference is fiber. Nothing interferes with the light signal between a fiber hub and a customer. However, once customers connect the broadband signal to their home WiFi network, the same interference issues arise. I looked recently and can see over twenty other home WiFi networks from my office – a setup ripe for interference. Before you make too much fun of the folks in the Welsh village, consider that there is a good chance you are subject to significant interference in your home broadband today.

Comcast Offers New Work-from-home Product

The pandemic has forced millions of people to work from home. This instantly caused heartburn for the IT departments of large corporations because remote workers create new security vulnerabilities and open companies to cyberattacks and hacking. Big companies have spent the last decade moving data behind firewalls and suddenly are being asked to let thousands of employees pierce the many layers of protection against outside threats.

Comcast announced a new product that should alleviate many of these corporate IT concerns. Comcast, along with Aruba, has created the Comcast Business Teleworker VPN product. This product creates a secure VPN at an employee’s home and transports the VPNs for all remote workers to a remote datacenter, where corporate IT can then deal with all remote workers in one place. This isolates the worker connections from the corporate firewalls, and employees instead deal with copies of corporate software that sit in a datacenter.

There is a perceived long-term need for the product since as many as 70% of companies say that they are likely to continue with the work-from-home model after the end of the pandemic. Working from home is now going to be a routine component of corporate life.

At the home end, the Comcast product promises to not interfere with existing home broadband. The only way for Comcast to do this is to establish a second data stream from a house using a separate cable modem (or utilizing modems that can establish more than one simultaneous connection). This is an important aspect of the product because one of the biggest complaints about working from home is that many homes have problems accommodating more than one or two workers or students at the same time. This new product would be ill-received by workers if implementing it means less bandwidth for everybody else in the home.

By routing all remote employees to a common hub, Comcast will enable corporate IT staff to mimic the work computing environment for remote workers. Many companies are currently giving remote employees limited access to core software systems and data, but this arrangement effectively establishes the Comcast hub as a secure node on the office network.

This is something that any ISP with a fiber network should consider mimicking. An open-access network on fiber already does this same thing today. An open-access network creates a VPN at each customer of a given ISP and then aggregates the signals, untouched, to deliver to the ISP. On a fiber network, this function can be done by fairly simple routing.  Fiber ISPs can also provide the home working path separate from the consumer path by either carving out a VPN or else providing a second data path – something most fiber ONTs already allow.
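
The routing idea can be sketched in a few lines. This is a toy model of the concept, not Comcast’s or any ISP’s actual implementation, and every tag number and hub name below is hypothetical:

```python
# Toy model of open-access style aggregation: traffic tagged for a
# corporate VPN is forwarded, untouched, to that employer's hub, while
# everything else takes the normal consumer path to the internet.
ROUTES = {
    101: "corporate-hub-a",   # employer A's secure aggregation point
    102: "corporate-hub-b",   # employer B's secure aggregation point
}
DEFAULT = "public-internet-gateway"

def egress_for(vlan_tag: int) -> str:
    """Pick the egress point for a frame based on its VLAN tag."""
    return ROUTES.get(vlan_tag, DEFAULT)

print(egress_for(101))  # corporate-hub-a
print(egress_for(7))    # public-internet-gateway
```

The point of the sketch is how simple the decision is: on a fiber network the separation between home-office traffic and consumer traffic is just a routing lookup, which is why open-access networks already do this today.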

Comcast has taken the extra step of partnering with Aruba to enable a corporation to establish a virtual corporate data center at a remote site. But fiber ISPs don’t have to be that complicated and rather than offering this to only large corporate clients, a fiber network could deliver a secure path between home and office for a business with only a few remote employees.

This could even be provided to sole proprietors and could safely link home and office on a VPN.  That allows for the marketing of a ‘safe office’ connection for businesses of any size and would provide the average small business a much more secure connection between home and office than they have today.

Every fiber provider that serves both residential communities and business districts ought to develop some version of this product by year-end. If working from home is a new reality, then fiber-based ISPs ought to be catering to that market using the inherent robustness and safety of a fiber network to create and route VPNs over the local fiber network.

You Can’t Force Innovation

The new video service Quibi failed after only seven months of operation and after having received $2 billion in backing from big industry players. The concept was to offer short 5- to 7-minute video serials that would keep viewers engaged in a story from day to day and week to week. The failure seems to be due to nobody being interested in the format. Younger viewers aren’t interested in scripted Hollywood content and instead watch content created by their peers. Older people have now been trained to binge-watch. It turns out there was no audience for the concept of short cliffhanger videos.

The Quibi failure reminded me that you can’t force innovations onto the public. We live in a society where everything new is hyped beyond belief. New technologies and innovations are not just seen as good, but in the hype-world are seen as game changers that will transform society.  A few innovations live up to the hype, such as the smartphone. But many other highly-hyped innovations have been a bust.

Consider bitcoin. This was a new form of currency that was going to replace government-backed currency. But the public never bought into the concept for one big fundamental reason – there is nothing broken about our current form of money. We deposit our money in banks, and it sits there safely until we’re ready to use it. For all of the endless hype about how bitcoin would change the world, I never heard a good argument about why bitcoin is better than our current banking system – except maybe for criminals and dictators that want to hide wealth.

Another big bust was Google Glass. People were not ready to engage with somebody in public who could film them and replay a casual conversation later or post it on social media. People were even more creeped out by the stalker aspect of men using facial recognition to identify and stalk women. To give credit to Google, the folks there never envisioned this as a technology for everybody, but the Internet hype machine played up the idea beyond belief. The public reaction to the technology was a resounding no.

Google was involved in another project that hit a brick wall. Sidewalk Labs, a division of Alphabet, envisioned a new smart city being created on the lakefront in Toronto. To tech folks, this sounded great. The city would be completely green and self-contained. Robots would take care of everything, from emptying trashcans when they are full to setting up picnics in the park and cleaning up afterwards. Traffic would all be underground, and an army of robots and drones would deliver everything people wanted to their doorstep. But before this even got off the drawing board, the people of Toronto rejected the idea as too big-brotherish. The same computer systems that catered to resident demands would also watch people at all times and record and categorize everything they do. In the end, privacy won out over technology.

Some technologies are hyped but never materialize. Self-driving cars have been touted as a transformational technology for over a decade. But in the last few years, the engineers working on the technology have acknowledged that a fully self-sufficient self-driving car is still many years away. That doesn’t stop the hype, and there are still articles about the promise of self-driving cars in the press every month.

Nothing has been hyped more in my lifetime than 5G. In the course of recently watching a single football game, I must have seen almost a dozen 5G commercials. Now that 5G phones are hitting the market, the new technology is likely going to soon be perceived by the public as a bust. The technology is being painted as something amazing and new, but recent tests show that 5G is no faster than 4G in 21 of 23 cities. 5G will eventually be faster and better, but will today’s hype make it hard for the cell companies to explain when 5G is actually here?

I could continue to list examples. For example, if I had believed the hype, I’d now live in a fully-automated home where I could talk to my home and have it cater to my every whim. I’d have unlimited power from a cheap neighborhood fusion power plant that produces unlimited and clean power fueled by water. I’d be able to avoid a commute by using my flying car. There is much to like in the hype-world, but sadly it’s not coming any time soon.

Pricing Strategies

One of the things that new ISPs always struggle with is pricing, and I’m often asked advice on the right pricing strategy. It’s not an easy answer and in working across the country I see a huge range of different pricing strategies. It’s really interesting to see so many different ideas on how to sell residential broadband service, which is fundamentally the same product when it’s offered on a fiber network. The following are some of the most common pricing strategies:

High, Low, or Market Rates? The hardest decision is where to set rates in general. Some ISPs are convinced that they need low rates to beat the competition. Others set high rates since they only want to sell products with high margins. Most ISPs set rates close to the market rates of the competitors. I sat at a bar once with a few ISPs who argued this for hours – in the end, the beer won.

One Broadband Product. A few ISPs like Google Fiber, Ting, and a handful of smaller ISPs offer only a single broadband product – a symmetrical gigabit connection. Google Fiber tried going to a 2-product tier but announced this year that they’ve returned to the flat-rate $70 gigabit. The downside to this approach is that it shuts out households that can’t afford the price. The upside is that every customer has a high margin.

Simple Tiers. The most common pricing structure I see offers several tiers of prices. An ISP might have three tiers at $55, $70, and $90, ranging from 100 Mbps to a gigabit. Generally, such prices have no gimmicks – no introductory pricing, term discounts, or bundling. There are still ISPs with half a dozen or even more tiers, which would confuse me as a customer. For example, I don’t know how a customer would be able to choose between buying 75 Mbps, 100 Mbps, and 125 Mbps.

ISPs with this philosophy differ most in the gap between pricing tiers. Products could be priced $10 apart or $30 apart, and that makes a significant statement to customers. Small steps between tiers invite customers to upgrade, while bigger steps between tiers make a statement about the value of the faster speeds.
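
One way to see the statement a tier ladder makes is to compute the implied price per Mbps. The ladders below are hypothetical examples of the two approaches, not any ISP’s actual rates:

```python
def price_per_mbps(tiers):
    """Given (speed_mbps, monthly_price) pairs, return each tier with its
    implied cost per Mbps -- the value signal the ladder sends."""
    return [(speed, price, round(price / speed, 3)) for speed, price in tiers]

# Hypothetical ladders: small steps invite upgrades; big steps position
# the faster tiers as premium products.
small_steps = [(100, 55), (250, 65), (1000, 75)]
big_steps = [(100, 55), (250, 85), (1000, 115)]

print(price_per_mbps(small_steps))
print(price_per_mbps(big_steps))
```

In the small-step ladder the gigabit tier costs pennies per Mbps more than the basic tier, nudging customers upward; the big-step ladder tells customers that speed itself is the premium product.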

Low Basic Price. I’ve seen a number of ISPs that have a low-price basic broadband product, but otherwise somewhat normal tiers of pricing. This is done more often by municipal ISPs trying to make broadband affordable to more homes, but there are commercial ISPs with the same philosophy. As an example, an ISP might have an introductory tier of 25 Mbps for $40. This pricing strategy has always bothered me. This can be a dangerous product to offer because the low price might attract a lot of customers who would otherwise pay more. I’ve always thought that it makes more sense to offer a low-income product only to homes that qualify in some manner but give them real broadband.

Introductory Marketing Rate. Some ISPs set a low introductory rate for first-time customers. These rates are generally good for one or two years and customers routinely sign contracts to get the low rates. The long-term downside of this pricing philosophy is that customers come to expect low rates. Customers that take the introductory rate will inevitably try to renegotiate for continued low rates at the end of the contract period.

An ISP with this pricing structure is conveying some poor messages. First, it is telling customers that its rates are negotiable. It is also conveying the message that there is a lot of profit in its normal rates and that it is willing to sell for less. Customers dislike the introductory rate process because they invariably get socked with an unexpected rate increase when rates jump back to list prices. The era of introductory discounts might be coming to an end. Verizon recently abandoned the special pricing strategy because it attracts low-margin customers that often leave at the end of the contract period.

Bundling. This is a pricing strategy that gives a discount for buying multiple services and has been the bread and butter of the big cable companies. Bundling makes less sense in today’s market, where there is little or no margin in cable TV. Most small ISPs don’t bundle and take the attitude that their list prices are a good deal – much the same as car dealers who no longer haggle over prices. In order to bundle, an ISP has to set rates high, and many ISPs prefer instead to set fair rates and not bother with the bundle.

The Working-from-home Migration

Upwork, a platform that supports freelancers, conducted a major survey of more than 20,000 adults to look at the new phenomenon of people moving due to the pandemic, with questions also aimed at understanding the motivation for moving. Since Upwork supports people who largely work out of their homes, the survey concentrated on that issue.

The survey verified what is already being covered widely by the press – people are moving due to the pandemic in large numbers. The survey found that the rate of migration is currently three to four times higher than the normal rate of recent years.

The key findings from the survey are as follows:

  • Between 6.9% and 11.5% of all households are considering moving due to the ability to work remotely. That equates to between 14 and 23 million people. It’s a pretty wide range of results, but likely a lot of people that want to move will end up not moving.
  • 53% of people are moving to find housing that is significantly less expensive than their current home.
  • 54% of people are moving beyond commuting distance and are moving more than a two-hour drive away from their current job.
  • People are moving from large and medium cities to places with lower housing density.
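
The survey’s percentage range and its headcount line up if the base is taken as roughly 200 million U.S. adults – an assumption on my part, since the survey doesn’t state its base. A quick check:

```python
def headcount(share: float, base_population: float) -> float:
    """Convert a survey share into an absolute number of people."""
    return share * base_population

# Assumed base of ~200 million U.S. adults (not stated in the survey).
ADULTS = 200e6
low = headcount(0.069, ADULTS)    # ~14 million
high = headcount(0.115, ADULTS)   # ~23 million
print(round(low / 1e6, 1), round(high / 1e6, 1))
```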

These findings are corroborated by a lot of other evidence. For example, data from Apartments.com show that rental occupancy and rates are falling in the most expensive markets compared to the rest of the country. Realtors in smaller markets across the country are reporting a boom of new residents moving into their communities.

Economic disruption often causes big changes in population migration and we saw spikes in people moving during the last two economic downturns. In those cases, there was a big shift in people moving from rural areas to cities and in people moving from the north to the south to follow job opportunities.

Interestingly, this new migration might reverse some of those past trends. Many rural communities have been losing population over the last few decades and the new migration patterns might reverse some of that long-term trend. People have been leaving rural parts of states to get jobs in urban centers and working from home is going to let many of these same people move back to be closer to families.

Of course, one of the issues that a lot of folks moving away from cities are going to face is that the broadband is often not as good where they want to move. The big cable companies have better networks in big cities than in smaller markets. You don’t have to move far outside of suburbs or rural county seats to find homes with little or no broadband. Even cellular coverage is a lot spottier outside of cities. I’ve seen local newspaper stories from all over the country of people who have bought rural homes only to find out that there was no broadband available.

But this isn’t true everywhere. There are some smaller towns with fiber to every home. There are rural areas with fiber to the farms. Rural communities that have fiber ought to be advertising it far and wide right now.

As a thought experiment, I looked at the states around me to see if I could identify areas that have fiber. The search was a lot harder than I thought it should be. States ought to have an easy-to-find map showing the availability of fiber because those communities are going to move to the top of the list for people who want a rural setting and who will be working from home.

I’ve worked from home for twenty years and I’m happy to see this opportunity open for millions of others. It gives you the freedom to live where you want and to choose where to live for reasons other than a job. It’s going to be an interesting decade ahead if people can move to where they want to live. I just have to warn local elected officials that new people moving to your community are going to be vocal about having great broadband.

Can the FCC Regulate Facebook?

At the urging of FCC Chairman Ajit Pai, FCC General Counsel Tom Johnson announced in a recent blog that he believes the FCC has the authority to reinterpret the immunity shield provided by Section 230 of the FCC’s rules, which derives from the Communications Decency Act of 1996.

Section 230 of the FCC rules is one of the clearest and simplest rules in the FCC code: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

In non-legalese, this means that a web company is not liable for third-party content posted on its platform. It is this rule that enables public comments on the web. All social media consists of third-party content. Sites like Yelp and Amazon thrive because the public posts reviews of restaurants and products. Third-party comments appear in a lot of other places on the web, such as the comment section of your local newspaper, or even here on my blog.

Section 230 is essential if we are going to give the public a voice on the web. Without Section 230 protections, Facebook could be sued by somebody who doesn’t like specific content posted on the platform. That’s dangerous because there is somebody who hates every possible political position.  If Facebook can be sued for content posted by its billions of users, then the platform will have to quickly fold – there is no viable business model that can sustain the defense of huge volumes of lawsuits.

Section 230 was created when web platforms started to allow comments from the general public. The biggest early legal challenge to web content came in 1995 when Wall Street firm Stratton Oakmont sued Prodigy over a posting on the platform by a user who accused the president of Stratton Oakmont of fraud. Stratton Oakmont won the case when the New York Supreme Court ruled that Prodigy was a publisher because the platform exercised some editorial control by moderating content and because Prodigy had a clearly stated set of rules about what was allowable content on the Prodigy platform. As might be imagined, this court case had a chilling impact on the burgeoning web industry, and fledgling web platforms worried about getting sued over content posted by the public. This prompted Representatives Ron Wyden and Chris Cox to sponsor the bill that became the current Section 230 protections.

Tom Johnson believes the FCC has the authority to interpret Section 230 due to Section 201(b) of the Communications Act of 1934, which confers on the FCC the power to issue rules necessary to carry out the provisions of the Act. He says that when Congress instructed that Section 230 rules be added to FCC code, that implicitly means the FCC has the authority to interpret the rules.

But then Mr. Johnson does an interesting tap dance. He distinguishes between interpreting the Section 230 rules and regulating companies that are protected by these rules. If the FCC ever acts to somehow modify Section 230, the legal arguments will concentrate on this nuance.

The FCC has basically been authorized by Congress to regulate common carriers of telecommunications services as well as a few other responsibilities specifically assigned to the agency.

There is no possible way that the FCC could ever claim that companies like Facebook or Google are common carriers. If they can’t make that argument, then the agency likely has no authority to impose any obligations on these companies, even should it have the authority to ‘interpret’ Section 230. Any such interpretation would be meaningless if the FCC has no authority to impose such interpretations on the companies that rely on Section 230 protections.

What is ironic about this effort is that the current FCC spent a great deal of effort to declassify ISPs from being common carriers. The agency has gone as far as possible to wash its hands of any responsibility for regulating broadband provided by companies like AT&T and Comcast. It will require an amazing set of verbal gymnastics to claim the ability to extend FCC authority to companies like Facebook and Twitter, which clearly have none of the characteristics of a common carrier, while at the same time claiming that ISPs are not common carriers.

The Aftermath of Natural Disasters

The never-ending hurricane season in Louisiana this year is a reminder that fiber network owners should have disaster recovery plans in place before they are hit with unexpected major network damages and outages.

The magnitude of the storm damage in Louisiana this year is hard for the mind to grasp. Entergy, the largest electric company in the area, reported that the latest hurricane, Laura, took out 219 electric transmission lines and 1,108 miles of wiring. The storm damaged 9,760 poles, 3,728 transformers, and 18,706 spans of wire. And Entergy is not the only electric company serving the storm-damaged area. To make matters worse, the utility companies in the area were still repairing damage from the two earlier hurricanes.

Hurricanes aren’t the only natural disaster that can damage networks. The recent fires in the northwest saw large numbers of utility poles burned and miles of fiber melted. The town of Ruston, Louisiana, saw hurricane damage this year after suffering massive damage last year from a major tornado.

How does the owner of a fiber network prepare for major damage? Nobody can be truly prepared for the kind of damage cited above by Entergy, but there are specific steps that should be taken long before damage hits.

One of the first steps is to have a disaster plan in place. This involves identifying ahead of time all of the first steps that should be taken when a disaster hits. This means knowing exactly who to call for help. It means having at least a minimal amount of key spare components on hand, and knowing where to find what’s needed in a hurry. It involves having plans for how to get a message out to affected customers during the emergency.

Probably the best step to take is to join a mutual aid group. This is a group of other similar network owners that agree to send repair teams after a disaster strikes. For the kind of damage caused by the hurricanes this year, hundreds of additional work crews are needed to tackle the repairs. Every utility industry has such groups. For example, the American Public Power Association has a Mutual Aid Network. This group mobilizes crews from member utilities and rushes them to the affected area, as needed. Any company joining these groups must realize that they will be asked to send crews when other group members are hit by disasters.

These mutual aid groups are a lifesaver. They not only gather the workforce required to fix disaster damage, but also help to coordinate the logistics of housing and feeding crews and of locating the raw materials – fiber and poles – needed to repair damage.

There is also a money side of disasters to deal with. Much of the funding to repair major storm damage comes from FEMA, as funds are authorized when governors declare states of emergency. There is a huge pile of paperwork needed to claim disaster funding, and there are specialized consulting firms that can help with the effort.

There was a time when electric networks and fiber networks were separate entities, but today electric companies all utilize fiber networks as a key component for operating the electric grid. When repairing downed electric lines, it’s now mandatory to also reconnect the fiber networks that allow electric substations to function. This means that crews of fiber splicers are needed alongside electric utility technicians.

The massive damage seen this year ought to be a reminder for anybody who operates a large network to have a disaster recovery plan. I know fiber overbuilders who have never considered this, and perhaps this year will prompt them to get ready – because you never know where the next disaster will hit.

FCC Expands Rural Use of White Space Spectrum

At the October monthly meeting, the FCC modified its Part 15 rules to allow for better utilization of white space spectrum in rural America – a move that should provide a boon to fixed wireless technology. The term ‘white space’ refers to spectrum that has been assigned for over-the-air television broadcasting but that sits empty and is not being used by a television station. In any given market there are channels of television spectrum that are not being used, and the ruling describes new ways that wireless ISPs, school systems, and others can better use this idle spectrum.

The FCC action follows a long-standing petition from Microsoft asking for better use of unused white space spectrum. The FCC asked Microsoft and the National Association of Broadcasters to negotiate a reasonable plan for using idle spectrum, and the actions taken by the agency reflect the cooperation of the parties. The FCC further plans to issue a Notice for Proposed Rulemaking to investigate other questions related to white space spectrum.

First, the FCC is allowing increased height for white space transmitters. Transmitters were previously limited to no more than 250 meters above the average terrain in an area, and that limit has been boosted to 500 meters. In case somebody is envisioning 1,600-foot towers, wireless companies achieve this height by placing towers on hilltops – the limit is measured above average terrain, not above the ground. The extra height is important for two reasons. Fixed wireless technology requires line-of-sight between the tower and a customer location, and the higher the tower, the better the chance of being able to ‘see’ some portion of a customer premises. Higher towers also mean the wireless signal can travel farther – white space spectrum is unique compared to many other spectrum bands in that it can deliver some broadband at significant distances from a tower.
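A back-of-the-envelope calculation shows why the extra height matters for reach. The sketch below uses the standard radio-horizon approximation (d ≈ 4.12 × √h under typical atmospheric refraction); actual coverage depends on terrain and is an assumption here, not anything from the FCC order.

```python
import math

def radio_horizon_km(height_m: float) -> float:
    """Approximate distance to the radio horizon, in km, for a transmitter
    at height_m meters, using the common 4/3-earth refraction rule of thumb:
    d ~= 4.12 * sqrt(h)."""
    return 4.12 * math.sqrt(height_m)

old_limit = radio_horizon_km(250)  # prior 250-meter height limit -> ~65 km
new_limit = radio_horizon_km(500)  # new 500-meter height limit   -> ~92 km

print(f"250 m: ~{old_limit:.0f} km to the radio horizon")
print(f"500 m: ~{new_limit:.0f} km to the radio horizon")
```

Doubling the allowed height doesn’t double the reach – the horizon grows with the square root of height, roughly a 40% increase here – but for a band that can carry some broadband all the way to the horizon, that is a meaningful gain.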

The FCC order also increases the maximum effective radiated power from 10 watts to 16 watts. Power levels are important because the strength of the signal at the customer location matters – higher power means a better chance of delivering full broadband speeds.
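To put the power bump in perspective, here is a rough sketch of what going from 10 watts to 16 watts buys, assuming simple free-space propagation (real terrain and foliage will eat into these numbers):

```python
import math

old_w, new_w = 10.0, 16.0  # prior and new maximum effective radiated power

# Signal-strength gain in decibels from the power increase.
gain_db = 10 * math.log10(new_w / old_w)

# Under free-space propagation, received power falls with distance squared,
# so range scales with the square root of transmit power.
range_factor = math.sqrt(new_w / old_w)
area_factor = new_w / old_w  # coverage area scales with range squared

print(f"Power gain: {gain_db:.1f} dB")                         # ~2.0 dB
print(f"Range increase: ~{(range_factor - 1) * 100:.0f}%")     # ~26%
print(f"Coverage area increase: ~{(area_factor - 1) * 100:.0f}%")  # ~60%
```

A 2 dB gain sounds modest, but at the fringe of a coverage area it can be the difference between a customer getting the full advertised speed and getting a degraded connection.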

The order builds in some additional protection for existing television stations. The FCC order increases the required separation between an ISP’s wireless signal and existing television station frequencies. Transmissions with white space spectrum tend to stray out of band, and allowing broadband signals too close to television signals would mean degraded performance for both the television station and the ISP. One of the questions to be asked in the NPRM is whether there is a way to utilize the bands closer to existing television signals.

The FCC’s order also authorized the use of narrowband devices that use white space. This opens up the door to using white space spectrum to communicate with Internet of Things devices. In rural areas, this might be a great way to communicate with agricultural sensors since the white space spectrum can travel to the horizon.

Finally, the order allows for higher power applications in isolated geographic areas that can be ‘geo-fenced’, meaning that the transmissions can be done in such a way as to keep the signals isolated to a defined area. The envisioned uses for this kind of application would be to provide broadband along school bus routes or to provide coverage of defined farm fields.

These changes were a long time in coming, with Microsoft asking for some of these changes since 2008. The issues have been bouncing around the FCC for years and it finally took the compromise between the parties to make this work. Maybe some of the other parties arguing over spectrum allocation could learn from this example that cooperation beats years of regulatory opposition.