Continued Lobbying for White Space Spectrum

In May, Microsoft submitted a petition to the FCC calling for specific changes that would improve the performance of white space spectrum used to provide rural broadband. Microsoft has now taken part in eleven white space trials and makes these recommendations based on the real-life performance of the spectrum. Not included in this filing is Microsoft’s long-standing request for the FCC to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has long favored creating just one channel of unlicensed white space spectrum per market – depending on what’s available.

A number of other parties have subsequently filed comments in support of the Microsoft proposals, including the Wireless Internet Service Providers Association (WISPA), Next Century Cities, New America’s Open Technology Institute, Tribal Digital Village and the Gigabit Libraries Network. One of the primary entities opposed to earlier Microsoft proposals is the National Association of Broadcasters (NAB), which worries about interference with TV stations from white space broadband. However, the group now says that it can support some of the new Microsoft proposals.

As a reminder, white space spectrum consists of the unused blocks of spectrum located between the frequencies assigned to television stations. Years ago, at the advent of broadcast television, the FCC provided wide buffers between channels to reflect the capability of the transmission technology at the time. Folks my age might remember back to the 1950s when neighboring TV stations would bleed into each other as ghost signals. As radio technology has improved, the buffers are now larger than needed and are wider than the buffers between other blocks of spectrum. White space spectrum uses those wide buffers.

Microsoft has proposed the following:

  • They are asking for higher power limits for transmissions in cases where the spectrum sits two or more channels away from a TV station signal. Higher power means greater transmission distances from a given transmitter.
  • They are asking for a small power increase for white space channels that sit next to an existing TV signal.
  • They are asking for white space transmitters to be placed as high as 500 meters above ground (1,640 feet). In the US there are only 71 existing towers taller than 1,000 feet.
  • Microsoft has shown that white space spectrum has a lot of promise for supporting agricultural IoT sensors. They are asking the FCC to change the white space rules to allow for narrowband transmission for this purpose.
  • Microsoft is asking that the spectrum be allowed to support portable broadband devices used for applications like school buses, agricultural equipment and IoT for tracking livestock.

The last two requests highlight the complexity of FCC spectrum rules. Most people would probably assume that spectrum licenses allow for any possible use of spectrum. Instead, the FCC specifically defines how spectrum can be used, and the rural white space spectrum is currently only allowed for use as a hot spot or for fixed point-to-point data using receiving antennas at a home or business. The FCC has to modify the rules to allow IoT use for farm sensors, tractors and cows.

The various parties are asking the FCC to issue a Notice of Proposed Rulemaking to get comments on the Microsoft proposal. That’s when we’ll learn if any other major parties disagree with the Microsoft proposals. We already know that the cellular companies oppose providing multiple white space bands for anything other than cellular data, but these particular proposals are to allow the existing white space spectrum to operate more efficiently.

Are Broadband Investments Increasing?

The largest ISPs and their lobbying arm USTelecom are still claiming that the level of industry capital spending has improved as a direct result of the end of Title II regulation. In a recent blog they argue that capital spending was up in 2018 due to the end of regulation – something they describe as a “forward-looking regulatory framework”. In reality, the new regulatory regime is now zero regulation since the FCC stripped themselves of the ability to change ISP behavior for broadband products and practices.

The big ISPs used this same argument for years leading up to deregulation. They claimed that ISPs held back on investments since they were hesitant to invest in a regulatory-heavy environment. This argument never held water for a few reasons. First, the FCC barely ever regulated broadband companies. Since the advent of DSL and cable modems in the late 1990s, each subsequent FCC has largely been hands-off with the ISP industry.

The one area where the last FCC added some regulations was with net neutrality. According to USTelecom that was crippling regulation. In reality, the CEO of every big telco and cable company has publicly stated that they could live with the basic principles of net neutrality. The one area of regulation that has always worried the big ISPs is some kind of price regulation. That’s really not been needed in the past, but all of the big companies look into the future and realize the time will come when they will probably want to raise broadband rates every year. We are now seeing the beginnings of that trend, which is probably why USTelecom keeps beating this particular dead horse – the ISPs are petrified of rate regulation of any kind.

The argument that the big ISPs held back on investment due to heavy regulation has never borne any resemblance to reality. The fact is that the big ISPs make investments for the same reasons as any large corporation – to increase revenues, to reduce operating costs, or to protect markets.

As an example, AT&T has been required to build fiber past 12.5 million passings as part of the settlement reached that allowed them to buy DirecTV. AT&T grabbed that mandate with gusto and has been aggressively building fiber for the past several years and selling fiber broadband. Both AT&T and Verizon have also been building fiber to cut transport expense to cell sites – they are building where that transport is too costly, or where they know they want to install small cell sites. The large cable companies all spent capital on DOCSIS 3.1 for the last few years to boost broadband speeds to protect and nurture their growing monopoly of urban broadband. All of these investment decisions were made for strategic business reasons that didn’t consider the difference between light regulation and no regulation. Any big ISP that says they will forego a strategic investment due to regulation would probably see their stock price tumble.

As a numbers guy, I always become instantly suspicious of deceptive graphs. Consider the graph included in the latest USTelecom blog. It shows the levels of industry capital investment made between 2014 and 2018. The graph makes the year-to-year swings in investment look big through the graphing trick of starting the y-axis at $66 billion instead of at zero. The fact is that 2018 capital investment is less than 3% higher than the investment made in 2014. This is an industry where the aggregate level of annual investment varies by only a few percent per year – the argument that the ISPs have been unleashed by the end of Title II regulation is laughable, and the numbers don’t support it.
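The distortion is easy to quantify. In the sketch below the dollar figures are hypothetical (chosen only to be roughly consistent with the ~$75 billion 2018 total and a sub-3% four-year change), but the arithmetic shows how starting the y-axis at $66 billion inflates a small difference:

```python
# Sketch of how a truncated y-axis exaggerates small differences.
# The dollar figures are hypothetical, chosen to roughly match the
# blog's numbers (~$75B in 2018, under 3% above 2014).
spend_2014 = 73.0  # billions, hypothetical
spend_2018 = 75.0  # billions, hypothetical
axis_floor = 66.0  # where the graph starts its y-axis instead of zero

real_change = (spend_2018 - spend_2014) / spend_2014
# Bar heights on the graph are measured from the axis floor, not zero:
visual_change = (spend_2018 - axis_floor) / (spend_2014 - axis_floor) - 1

print(f"Real change:   {real_change:.1%}")    # about 2.7%
print(f"Visual change: {visual_change:.1%}")  # about 28.6% -- ten times bigger
```

A 2.7% real change reads as a nearly 30% jump in bar height once the bottom 90% of the scale is chopped off.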

There are always stories every year that can explain the annual fluctuation in industry spending. Here are just a few things that made a significant impact on aggregate spending in the past few years:

  • Sprint had a cash crunch a few years ago and drastically cut capital spending. One of the primary reasons for the higher 2018 spending is that Sprint spent almost $2 billion more in 2018 than the year before as they try to catch up on neglected projects.
  • AT&T spent $2 billion in 2018 for FirstNet, the nationwide public safety network. But AT&T is not spending their own money – that project is being funded by the federal government and ought to be removed from these charts.
  • Another $3 billion of AT&T’s spending in 2018 was to beef up the 4G network in Mexico. I’m not sure how including that spending in the numbers has any relevance to US regulation.
  • AT&T has been on a tear building fiber for the past four years – but they announced last month that the big construction push is over, and they will see lower capital spending in future years. AT&T has the largest capital budget in the industry and spent 30% of the industry wide $75 billion in 2018 – how will USTelecom paint the picture next year after a sizable decrease in AT&T spending?

The fact that USTelecom keeps harping on this talking point means they must fear some return to regulation. We are seeing Congress seriously considering new consumer privacy rules that would restrict the ability of ISPs to monetize customer data. We know it’s likely that if the Democrats take back the White House and the Senate that net neutrality and the regulation of broadband will be reinstated. For now, the big ISPs have clearly and completely won the regulatory battle and broadband is as close to deregulated as any industry can be. Sticking with this false narrative can only mean that the big ISPs think their win is temporary.

Disney Jumps into the OTT Market

Disney is jumping into the OTT fray joining Netflix, Amazon Prime and others that offer a unique library of content for online viewing. The online product will be marketed as Disney+ and will launch on November 12 in the US, later worldwide. Disney seems to be taking the same low-price philosophy as other online start-ups with initial pricing at $6.99 per month, or an annual subscription for $69.99 for the first year.

Disney is counting on a unique customer base of what it calls ‘true fans’ that adore everything Disney. I can attest that they exist, as I have a wife and daughter in that category. Disney thinks these fans will gain them a lot more subscribers than other competing services.

The company is already sitting on one of the largest libraries of valuable content. Disney has made huge revenues over the years by periodically rolling out old Disney classics and then withdrawing them from the market. I’ve seen speculation that Disney plans to offer their full catalog of Disney classics as part of the offering, but we won’t know for sure until the service is available. Disney has a lot of other popular content as well, such as the Star Wars and Marvel franchises.

Analysts say the $6.99 price is too low, and Disney seems to acknowledge it. I read that an analyst at BTIG Research said Disney doesn’t expect the service to be profitable until 2024, by which point it expects over 60 million customers. It’s hard to fathom needing that many customers to break even. But Disney is not quite the same as other programmers. Perhaps they are willing to take a loss on the video library in order to drive revenues in other ways. There has to be a big upside to having over 60 million fans watching your content and advertising and buying Disney merchandise.

The Disney launch enters an already crowded market and in doing so makes it that much harder to justify cord cutting. The OTT services that mimic cable TV like DirecTV Now, Hulu, Sling TV, Playstation Vue, FuboTV, and others have increased prices to rival a subscription to an expanded basic line-up from a cable company. There are then dozens of add-on options of other programming like Netflix, CBS All-Access, HBO Now, and the upcoming Apple TV that lure viewers with unique content. It’s starting to be clear that cord cutting is not cheaper unless a viewer has the discipline to restrict content to only one or two services.

Something else is starting to become clear in the industry, which is that customers who buy traditional cable TV are also subscribing to OTT services like Netflix. At the end of last year PwC reported that the number of Netflix subscribers in the US had surpassed the number of traditional cable subscribers.

The PwC study set out to understand the viewers of OTT content. They wanted to see how viewers handle the huge array of programming options. One of their most interesting findings is that age is becoming less of a factor in understanding OTT usage. When PwC started watching the market four years ago it was easy to identify different buying and viewing habits between younger and older viewers, but those differences seem to be converging.

For example, PwC found that 28% of older consumers had cut the cord, while in earlier years it was a much smaller percentage. They found that 61% of older viewers now watch content on the Internet, up from less than 50% just a year earlier. They also found a higher percentage of customers claiming loyalty to traditional cable TV among younger viewers ages 25-34 (22%) than among those older than 50 (16%).

One of the most interesting findings in the PwC study was the extent to which people dislike the viewing suggestions made by video services. Only 21% of viewers feel that the suggested viewing on sites like Netflix is better than what they can do themselves. The biggest complaint about all OTT services is the difficulty of finding content – only 12% say they can easily find the content they want to watch.

The End of the Bundle?

There are a few signs in the industry that we are edging away from the traditional triple play bundle of telephone, cable TV and broadband. The bundle was instrumental in the cable companies’ success. Back in the day when DSL and cable modems had essentially the same download speed, the cable companies introduced bundles to entice customers to use their broadband. The lure of getting a discount on cable TV for buying broadband was attractive and gave the cable companies an edge in the broadband marketing battle.

Over time the cable companies became secure in their market share and created mandatory bundles, meaning they would not sell standalone broadband. This split the broadband market in cities – the cable company got customers who could afford bundles and the telco with DSL got everybody else. Many of the cable companies became so smug about their bundles that they forced customers to buy cable TV just to get broadband. I’ve noticed over the last year that most of the mandatory bundles have died.

The bundle lost a little luster in February when Julia Laulis, the CEO of CableOne, told her investors on the 4Q 2018 earnings call that the company no longer cares about the bundle. She said what I’m sure many other cable companies are discussing internally, which is that the bundle doesn’t have any impact in attracting customers to buy broadband. On that call she said, “We don’t see bundling as the savior for churn. I know that we don’t put time and resources into pretty much anything having to do with video because of what it nets us and our shareholders in the long run. We pivoted to a data-centric model over five, six years ago, and we’ve seen nothing to derail us from that path.”

Her announcement raises two important issues that probably spell the eventual end of bundling. First, there is no real margin on cable TV. The fully loaded cost of the product has increased to the point where the bottom line of the company is not improved by selling cable. The only two big cable providers who might see some margin from cable TV are Comcast and AT&T since they own some of the programming but for everybody else the margins on cable TV have shrunk to nothing, or might even be negative.

I’ve had a number of clients take a stab at calculating the true cost of providing cable TV. The obvious big cost of the product is the programming fees. But my clients tell me that a huge percentage of their operational costs come from cable TV. They say most of the calls to customer service are about picture quality. They say that they do far more truck rolls due to cable issues than for any other product. By the time you account for those extra costs it’s likely that cable TV is a net loser for most small ISPs – as it obviously is for CableOne, the seventh largest cable company.

The other issue is cable rates. High programming rates keep forcing cable providers to raise the price of the TV product every year. We know that high cable prices are the number one issue cited by cord cutters. Perhaps more importantly, it’s the number one issue driving customer dissatisfaction with the cable company.

I have to wonder how many other big cable companies have come to the same conclusion but just aren’t talking about it. Interestingly, one of the metrics used by analysts to track the cable industry is average revenue per user (ARPU). If cable companies bail on the bundle and lose cable customers their ARPU will drop – yet margins might stay the same or even get a little better. If there is a new deemphasis on bundles and cable TV subscriptions, the industry will need to drop the ARPU comparison.

It’s not going to be easy for a big cable company to back out of the cable TV business. Today there is still a penalty for customers who drop a bundle – dropping cable TV raises the price for the remaining products. We’ll know that the cable companies are serious about deemphasizing cable TV when that penalty disappears.

City Authority in Rights-of-Way

The California Supreme Court just joined the fray in the battle over the placement of small cells and other wireless equipment in public rights-of-way. Currently, there are numerous lawsuits challenging the FCC ruling that wireless carriers can put their devices anywhere in the public rights-of-way. The California lawsuit preceded that order and asked whether a city has the right to dictate the appearance of wireless electronics.

We’ve recently seen wireless carriers hanging some fairly hideous devices on poles. The FCC order allows them to hang devices as large as 28 cubic feet, and that’s large enough to hang devices that sprawl across the sightlines on poles. Cities look at some of the early examples of devices on poles and are fearful of the proliferation of similar devices as each large wireless carrier and others begin hanging small cells and 5G fixed wireless loop devices.

The original suit came from T-Mobile, which claimed that San Francisco had no authority to set aesthetic requirements for wireless devices. It is an interesting challenge because government entities have been dictating aesthetic requirements for years – such as the cell sites one sees all over Florida that are disguised to look like palm trees – but which never quite do.

My guess is that T-Mobile has been emboldened by the recent federal law that guarantees wireless carriers access to utility poles, light poles and other locations inside of public rights-of-way. The FCC order effectively tells municipalities that they can’t reject requests to place devices and I’m guessing T-Mobile hoped that meant that cities had no authority over them.

T-Mobile relied on language in section 7901 of the California public utilities code:

Telegraph or telephone corporations may construct lines of telegraph or telephone lines along and upon any public road or highway, along or across any of the waters or lands within this State, and may erect poles, posts, piers, or abutments for supporting the insulators, wires, and other necessary fixtures of their lines, in such manner and at such points as not to incommode the public use of the road or highway or interrupt the navigation of the waters. (I must admit that one of the reasons I like to read legal cases is the language used in laws. This one uses the term incommode which means to inconvenience or impede.)

T-Mobile interpreted that law to mean that they have the right to construct facilities as long as they don’t obstruct the transmission path. They further argued that San Francisco could not regulate anything that is not specifically allowed by this same language.

The courts disagreed with T-Mobile’s reading of the law. The courts said that a city has inherent local authority to determine the appropriate use of land within its jurisdiction. That authority includes the right to establish aesthetic conditions for land use. The Court said the case boiled down to whether Section 7901 somehow divested the city of that inherent authority.

The Courts also said that T-Mobile’s interpretation of the term incommode was incorrect, in that T-Mobile thought they could hang a wireless device anywhere as long as they didn’t impede public road use or the ability of other utilities to use the poles. The Courts said that incommode generally means to inconvenience, and that the city could object to a pole placement if it inconvenienced the city in other ways, such as generating noise, causing negative health consequences, or creating safety concerns.

While the California ruling was very specific and held that the City of San Francisco could require wireless carriers to meet aesthetic requirements, the ruling and the discussion in the decision can be interpreted as being directly in opposition to the FCC order that allows wireless carriers to place small cells anywhere they want, without city interference.

Lawsuits generally rely on precedents and judges often consider rulings made in other courts on similar issues. It seems likely that this California Supreme Court ruling is going to make it into the challenges to the FCC ruling that preempted local control over small cell placement. That FCC ruling loses its teeth if cities can consider things like public safety or the safety of technicians that work on poles.

Wireless carriers are currently acting as if the FCC order is a done deal, even as it is being challenged by numerous states and cities. I’ve heard several people refer to carrier behavior as a land grab, where the carriers are grabbing connection space on poles even when they have no immediate use for them – they are getting on poles before courts might make it harder to do so. This Supreme Court ruling makes it clear that the small cell issue is far from resolved and we’re probably going to be following this in courts for at least a few more years.

Broadband Usage Continues to Grow

The firm OpenVault, a provider of software that measures data consumption for ISPs, reported that the average monthly data use by households grew from 201.6 gigabytes in 2017 to 268.7 gigabytes in 2018 – a growth rate of 33%. The company also reported that the median use per household grew from 103.6 gigabytes in 2017 to 145.2 gigabytes in 2018 – a growth rate of 40%. The median represents the midpoint of users, with half of all households above and half below it.
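The gap between the average and the median reflects a skewed distribution: a minority of very heavy users pulls the average well above the midpoint. A toy illustration with made-up usage numbers:

```python
import statistics

# Made-up monthly usage (GB) for ten hypothetical households: most are
# moderate, while a couple of heavy streamers pull the mean up.
# The median -- the midpoint household -- is unaffected by the outliers.
usage_gb = [60, 80, 95, 110, 130, 150, 180, 250, 600, 1100]

mean = statistics.mean(usage_gb)      # skewed upward by the heavy users
median = statistics.median(usage_gb)  # half of homes above, half below

print(f"mean={mean:.1f} GB, median={median:.1f} GB")  # mean=275.5 GB, median=140.0 GB
```

The same shape shows up in the OpenVault data, where the 268.7 GB average sits far above the 145.2 GB median.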

To some degree, these statistics are not news because we’ve known for a long time that broadband usage at homes, both in total download and in desired speeds, has been doubling every three years since the early 1980s. The growth in 2018 is actually a little faster than that historical average, and if the 2018 growth rate were sustained, in three years usage would grow to 235% of today’s level. What I find most impressive about these new statistics is the magnitude of the annual change – the average home used 67 more gigabytes of data per month in 2018 than the year before – a number that would have seemed unbelievable only a decade ago when the average household used a total of only 25 gigabytes per month.
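The compounding behind that three-year projection is straightforward; a quick check of the 33% growth rate:

```python
annual_growth = 0.33  # OpenVault's 2018 growth rate for average household usage

# Compound the annual rate over a three-year horizon.
multiple = (1 + annual_growth) ** 3
print(f"{multiple:.2f}x")  # 2.35x -- usage reaches 235% of today's level,
                           # a bit faster than the doubling-every-3-years rule
```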

There are still many in the industry who are surprised by these numbers. I’ve heard people claim that now that homes are watching all the video they want, the rate of growth is bound to slow down – but if anything, the rate of growth seems to be accelerating. We also know that cellular data consumption is now doubling every two years.

This kind of growth has huge implications for the industry. From a network perspective, this kind of bandwidth usage puts a big strain on networks. Typically the most strained part of a network is the backbones that connect to neighborhood nodes. That’s the primary stress point in many networks, including FTTH networks, and when there isn’t enough bandwidth to a neighborhood then everybody’s bandwidth suffers. Somebody that designed a network ten years ago would never have believed the numbers that OpenVault is reporting and would likely not have designed a network that would still be sufficient today.

One consequence of the bandwidth growth is that it’s got to be driving homes to change to faster service providers when they have the option. A household that might have been happy with a 5 Mbps or 10 Mbps connection a few years ago is likely no longer happy with it. This has to be one of the reasons we are seeing millions of homes in metropolitan areas upgrade from DSL to cable modems each year. The kind of usage growth we are seeing today has to be accelerating the death of DSL.

This growth also should be affecting policy. The FCC set the definition of broadband at 25/3 Mbps in January of 2015. If that was a good definition in 2015, then the definition of broadband should have been increased to 63 Mbps in 2019. At the time the FCC set that threshold I thought they were a little generous. In 2014, as the FCC was having this debate, the average home downloaded around 100 gigabytes per month. In 2014 the right definition of broadband was probably more realistically 15 – 20 Mbps, and the FCC was obviously a little forward-looking in setting the definition. Even so, the definition of broadband should be increased – if the right definition of broadband in 2014 was 20 Mbps, then today it ought to be 50 Mbps.
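The 63 Mbps figure is consistent with applying the doubling-every-three-years rule to the 2015 threshold; a minimal sketch (the function name is mine):

```python
def scaled_definition(base_mbps: float, base_year: int, target_year: int,
                      doubling_years: int = 3) -> float:
    """Scale a broadband-speed threshold forward, assuming household
    demand doubles every `doubling_years` years."""
    return base_mbps * 2 ** ((target_year - base_year) / doubling_years)

# The FCC's 25 Mbps definition from January 2015, carried forward to 2019:
print(round(scaled_definition(25, 2015, 2019)))  # 63
```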

The current FCC is ignoring these statistics for policy purposes – if they raise the definition of broadband then huge numbers of homes will be classified as not having broadband. The FCC does not want to do that since they are required by Congressional edict to make sure that all homes have broadband. When the FCC set a realistic definition of broadband in 2015 they created a dilemma for themselves. That 2015 definition is already obsolete, and if they don’t change it, in a few years it is going to be absurd. One only has to look forward three years from now, when the definition of broadband ought to be 100 Mbps.

These statistics also remind us of the stupidity of handing out federal subsidies to build technologies that deliver less than 100 Mbps. We still have two more years of CAF II construction to upgrade speeds to an anemic 10 Mbps. We are still handing out new subsidies to build networks that can deliver 25/3 Mbps – networks that are obsolete before they are completed.

Network designers will tell you that they try to design networks to satisfy demands at least seven years into the future (which is the average life of many kinds of fiber electronics). If broadband usage keeps doubling every three years, then looking forward seven years to 2026, the average home is going to download 1.7 terabytes per month and will expect download speeds of 318 Mbps. I wonder how many network planners are using that target?
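Those 2026 targets are consistent with compounding the 2018 OpenVault average (and a ~50 Mbps speed expectation today) under the same doubling rule; a rough sketch:

```python
def project(value: float, years: float, doubling_years: float = 3) -> float:
    """Project a value forward, assuming it doubles every `doubling_years`."""
    return value * 2 ** (years / doubling_years)

# Eight years of growth from the 2018 measurements out to 2026:
usage_2026 = project(268.7, 2026 - 2018)  # GB/month, from the 2018 average
speed_2026 = project(50, 2026 - 2018)     # Mbps, from a ~50 Mbps expectation

print(f"~{usage_2026 / 1000:.1f} TB/month")  # ~1.7 TB
print(f"~{speed_2026:.0f} Mbps")             # roughly the 318 Mbps cited
```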

The final implications of this growth are for data caps. Two years ago when Comcast set a terabyte monthly data cap they said that it affected only a few homes – and I’m sure they were right at the time. However, the OpenVault statistics show that 4.12% of homes used a terabyte per month in 2018, almost double from 2.11% in 2017. We’ve now reached that point when the terabyte data cap is going to have teeth, and over the next few years a lot of homes are going to pass that threshold and have to pay a lot more for their broadband. While much of the industry has a hard time believing the growth statistics, I think Comcast knew exactly what they were doing when they established the terabyte cap that seemed so high just a few years ago.

Cable Subscribers – 3Q 2018

We are now in the second year of real cord cutting. The statistics show the traditional cable industry losing about 1 million customers per quarter. The numbers for the recently ended 3Q of 2018 come from the Leichtman Research Group, and I compare them to year-end 2017.

                3Q 2018      4Q 2017       Change
Comcast      22,015,000   22,357,000    (342,000)   -1.5%
DirecTV      19,625,000   20,458,000    (833,000)   -4.1%
Charter      16,628,000   16,997,000    (369,000)   -2.2%
Dish         10,286,000   11,030,000    (744,000)   -6.7%
Verizon       4,497,000    4,619,000    (122,000)   -2.6%
Cox           4,035,000    4,200,000    (165,000)   -3.9%
AT&T          3,693,000    3,657,000       36,000    1.0%
Altice        3,322,800    3,405,500     (82,700)   -2.4%
Frontier        873,000      961,000     (88,000)   -9.2%
Mediacom        793,000      821,000     (28,000)   -3.4%
Cable ONE       328,921      283,001       45,920   16.2%
  Total      86,096,721   88,788,501  (2,691,780)   -3.0%
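The change columns are simple differences and ratios; a quick sketch recomputing a few rows from the subscriber counts:

```python
# Recompute the change columns for a few rows from the subscriber counts:
# each entry maps a company to its (3Q 2018, 4Q 2017) subscriber totals.
subs = {
    "Comcast": (22_015_000, 22_357_000),
    "Dish": (10_286_000, 11_030_000),
    "Cable ONE": (328_921, 283_001),
}

for name, (q3_2018, q4_2017) in subs.items():
    change = q3_2018 - q4_2017      # raw subscriber gain or loss
    pct = change / q4_2017          # change relative to the year-end base
    print(f"{name}: {change:+,} ({pct:+.1%})")
```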

These companies represent roughly 95% of the entire cable market. Not included in these numbers is WOW with over 400,000 cable customers.

This group of large companies has dropped almost 2.7 million customers so far this year, with losses in the third quarter of over 1 million – making it the biggest losing quarter in history. Cord cutting is accelerating, and 2018 is certainly going to exceed the 3.1 million cable customers lost in 2017.

The big losers are the satellite companies, which lost 1,577,000 customers so far in 2018. These losses are partially offset by the fact that these two companies own the largest online video services, with Dish’s Sling TV now having 2,370,000 customers and DirecTV Now having 1,858,000 customers.

Not reflected in these numbers is the fact that 2018 has so far been a boom year for building new homes, with 1.6 million new housing units added nationally during the year. If you assume that new homes buy cable TV at the same rate as older homes, then the estimate of cord cutting for the first three quarters would be 1.1 million higher than the net numbers shown in the table.

In 2017 Comcast and Charter didn’t fare as poorly as the rest of the industry, but their rate of loss has roughly doubled over a year ago.

Cable ONE looks to be a bit of an anomaly, but they lost over 11% of their customers in 2017 due to disputes with programmers, and they seem to have recaptured many of those customers.

The most obvious thing that jumps out from these numbers is that cord cutting is real and is here to stay. Within two short years after the start of the cord cutting phenomenon the big cable providers are on track to lose over 4% of total traditional cable subscribers in a year. That’s a lot of lost revenue for these companies and a lot of lost revenues for the programmers.

Getting Militant for Broadband

My job takes me to many rural counties where huge geographic areas don’t have broadband. I’ve seen a big change over the last two years in the expectations of rural residents, who are now demanding that somebody find them a broadband solution. There have been a number of rural residents calling for better broadband for a decade, but recently I’ve seen the cries for broadband grow into strident demands. As the title of this blog suggests, people are getting militant for broadband (though not by carrying guns!).

The perceived need for broadband has changed a lot since the turn of this new century. In 2000 only 43% of homes had a broadband connection – and in those days that meant they had a connection that was faster than dial-up. In 2000 DSL was king and a lot of homes had upgraded to speeds of 1 Mbps. There have always been homes that require broadband, and I’m a good example since I work from home, and when I moved fifteen years ago my offer on a new house was contingent on the home having broadband installed before closing. My real estate agent at the time said that was the first time she’d ever heard about broadband related to home ownership.

As I’ve cited many times, the need for broadband has continued to grow steadily and has been doubling every three years. By 2010 the number of homes with broadband grew to 71%, and by then the cable companies were beginning to dominate the market. By then DSL speeds had gotten better, with the average speeds at about 6 Mbps, but with some lucky customers seeing speeds of around 15 Mbps. But as DOCSIS 3.0 was implemented in cable networks we started seeing speeds up to 100 Mbps available on cable systems. It was a good time to be a cable company, because their rapid revenue growth was fueled almost entirely by adding broadband customers.

Broadband in urban areas has continued to improve. We’re now seeing Comcast, Charter, Cox and other cable companies upgrade to DOCSIS 3.1 and offer speeds of up to 1 Gbps. DSL that can deliver 50 Mbps over two bonded copper lines is becoming old technology. Even urban cellular speeds are becoming decent, with average speeds of 12 – 15 Mbps.

But during all of these upgrades to urban broadband, huge swaths of rural America are still stuck in 2000 or earlier. Some rural homes have had access to slow DSL of 1 – 2 Mbps at most. Rural cellular speeds are typically half of urban speeds and are incredibly expensive as a home broadband solution. Satellite broadband has been available the whole time, but the high prices, gigantic latency and stingy data caps have made most homes swear off satellite broadband.

Rural homes look with envy at their urban counterparts. They know urban homes that have seen half a dozen major speed upgrades over twenty years while they still have the same lousy choices they had twenty years ago. Some rural homes are seeing an upgrade through the CAF II program to DSL speeds of perhaps 10 Mbps. While that will be a relief to a home that has had no broadband, it doesn’t let a home use broadband in the same way as the rest of the country.

To make matters feel worse, rural customers without broadband see some parts of rural America get fiber broadband built by independent telephone companies, electric cooperatives or municipalities. It’s hard for them to understand why there is funding that can make fiber work in some places, but not where they live. The most strident rural residents these days are those who live in a county where other rural customers have fiber and who are being told they are likely to never see it.

This disparity between rural haves and have-nots is all due to FCC policy. The FCC decided to make funds available to rural telcos to upgrade to better broadband, but at the same time copped out and handed billions to the giant telcos to instead upgrade to 10 Mbps DSL or wireless. To make matters worse, it’s becoming clear that AT&T and Verizon are intent on eventually tearing down rural copper, which will leave homes with poor cellular coverage without any connection to the outside world.

The FCC laments that they cannot possibly afford to fund fiber everywhere. But they missed a huge opportunity to bring fiber to millions when they caved to lobbyists and gave the CAF II funding to the big telcos. Recall that these funds were originally going to be awarded by a reverse auction and that numerous companies had plans to ask for the funding to build rural fiber.

It’s no wonder that rural areas are furious and desperate for better broadband. Their kids are at a big disadvantage to those living in towns with broadband. Farmers without broadband are competing with those using agricultural IoT. Realtors report that they are having a hard time selling homes with no broadband access. People without broadband can’t work from home. And rural America is being left behind from taking part in American culture without access to the huge amount of content now available on the web.

360-degree Virtual Reality

South Korea already has the fastest overall broadband speeds in the world and is working towards the next generation of broadband. SK Broadband provides gigabit-capable fiber to over 40% of households and has plans to spend over $900 million to increase that to 80% of households by 2020.

The company also just kicked off a trial of next-generation fiber technology to improve bandwidth delivery to customers. Customers today have an option of 1 Gbps service. SK Broadband just launched a trial with Nokia in an apartment building to increase end-user bandwidth. They are doing this by combining the current GPON technology with both XGS-PON and NG-PON2 to increase the overall bandwidth to the apartment complex from 2.5 Gbps to 52.5 Gbps. This configuration allows a customer to run three bandwidth-heavy devices simultaneously, with each having access to a separate 833 Mbps symmetrical data path. This particular combination of technologies may never be widely implemented since the company is also considering upgrades to bring 10 Gbps residential service.
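The 52.5 Gbps figure is simply the three PON generations stacked on the same fiber. A quick sketch of the arithmetic, using the nominal downstream rates from the ITU-T standards (the exact channel plan in the SK Broadband trial is my assumption):

```python
# Nominal aggregate downstream rates per PON generation.
GPON_GBPS = 2.5       # ITU-T G.984
XGS_PON_GBPS = 10.0   # ITU-T G.9807.1, symmetrical
NG_PON2_GBPS = 40.0   # ITU-T G.989, four 10 Gbps wavelengths

# Combining all three on one fiber yields the trial's aggregate.
aggregate_gbps = GPON_GBPS + XGS_PON_GBPS + NG_PON2_GBPS
print(f"Aggregate to the building: {aggregate_gbps} Gbps")  # 52.5 Gbps

# The 833 Mbps per-device figure is one 2.5 Gbps GPON channel
# split three ways: 2500 / 3 = 833 Mbps.
per_device_mbps = GPON_GBPS * 1000 / 3
print(f"Per-device path: {per_device_mbps:.0f} Mbps")  # 833 Mbps
```

This also shows why the three-device limit exists: three symmetrical 833 Mbps paths exactly fill one 2.5 Gbps channel.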

The big question is why SK Broadband thinks customers need this much bandwidth. One reason is gaming – over 25 million people, a little over half the country’s population, partake in online gaming today. No other country is even a close second to the gaming craze there. The country has also embraced 4K video, and a large percentage of programming there uses the format, which can require data streams as large as 15 Mbps each.

But those applications alone don’t justify the kind of bandwidth that the company is considering. The architect of the SK Broadband network cites the ability to deliver 360-degree virtual reality as the reason for the increase in bandwidth. With today’s compression techniques this could require data streams as much as 6 times larger than a 4K video stream, or 90 Mbps.
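The numbers above line up neatly with the trial architecture. A back-of-envelope check using the figures from this post (the streams-per-path extrapolation at the end is mine, ignoring protocol overhead):

```python
# Figures cited in the post.
STREAM_4K_MBPS = 15   # one 4K video stream
VR_MULTIPLIER = 6     # 360-degree VR vs. a 4K stream

vr_stream_mbps = STREAM_4K_MBPS * VR_MULTIPLIER
print(vr_stream_mbps, "Mbps per 360-degree VR stream")  # 90 Mbps

# Rough headroom check: how many such streams would fit in one of
# the trial's 833 Mbps per-device paths (overhead ignored)?
PATH_MBPS = 833
print(PATH_MBPS // vr_stream_mbps, "VR streams per path")  # 9
```

Even at 90 Mbps per stream, one per-device path has plenty of headroom – the bet is that compression demands and resolution will keep climbing.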

What is 360-degree virtual reality and how does it differ from regular virtual reality? First, the 360-degree refers to the ability to view the virtual reality landscape in any direction. That means the user can look behind, above and below them in any direction. A lot of virtual reality already has this capability. The content is shot or created to allow viewing in any direction and the VR user can look around them. For example, a 360-degree virtual reality view of a skindiver would allow a user to follow an underwater object as the diver approaches, and look back to watch it as they pass by.

But the technology that SK Broadband sees coming is 360-degree immersive VR. With normal virtual reality a user can look at anything within sight range at a given time, but the viewer moves with the skindiver – it’s strictly a viewing experience to see whatever is being offered. Immersive virtual reality lets a user define the experience – in an immersive situation the VR user can interact with the environment. They might decide to stay at a given place longer, or pick up a seashell to examine it.

SK Broadband believes that 360-degree VR will soon be a reality and thinks it will be in big demand. The technology trial with Nokia is intended to support this technology by allowing up to three VR users at the same location to separately enter a virtual reality world together, yet each have their own experience. Immersive VR will allow real gaming. It will let a user enter a 3D virtual world and interact in any manner they wish – much like the games played today on game machines.

This is a great example of how broadband applications are developed to fit the capacity of networks. South Korea is the most logical place to develop high-bandwidth applications since it has so many customers using gigabit connections. Once a critical mass of potential users is in place, developers can create big-bandwidth content. It’s a lot harder for that to happen in the US since the percentage of those with gigabit connections is still tiny. However, an application developer in South Korea can get quick traction since there is a big pool of potential users.