
The FCC’s Data Collection Effort


The FCC just changed the way that they are going to gather data from carriers about voice and data usage in the US. To some degree they seem to be throwing in the towel and just giving up.

I have blogged before about the massive inadequacies of the National Broadband Map. This is an FCC-sponsored effort to show the availability of broadband on a geographic basis. This sounds like a laudable goal, but the carriers decide what information they want to supply to the mapping process, and so the map is full of what can only be described as major lies from the largest carriers. They claim to have broadband where they don’t and at speeds far greater than they actually deliver.

The FCC announced new rules for their data collection process that is done using FCC Form 477. This revised effort by the FCC is going to make their data gathering more like the process that is used to collect data for the National Broadband Map. They are no longer going to try to collect actual data speeds in tiers, but instead will be collecting only advertised speeds for data – the fastest advertised speed for landline providers and the slowest advertised speeds for wireless providers. For the life of me I can’t imagine how this data can be of the slightest use to anybody.
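To make concrete what the new reporting rule discards, here is a minimal sketch in Python with invented speed tiers (the numbers are illustrative only, not from any actual filing):

```python
# Hypothetical speed tiers for one rural census block, in Mbps.
landline_advertised = [1.5, 3, 6, 12]   # DSL tiers the telco advertises statewide
wireless_advertised = [5, 12]           # data tiers the wireless carrier advertises
actual_delivered = 1.5                  # what households in the block really get

# Under the revised Form 477 rules, only advertised speeds are reported:
# the fastest tier for landline providers, the slowest for wireless providers.
landline_report = max(landline_advertised)
wireless_report = min(wireless_advertised)

print(landline_report, wireless_report)  # 12 5 -- actual_delivered never appears
```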

I just recently worked with a client in a small town in Oregon. The incumbent providers there are the biggest telephone company and cable company in the state. In both cases, they advertise the same speeds in this small town that they advertise in Portland. But in this town, as in most of rural America, the actual speeds delivered are far slower. The locals think the fastest cable modem speeds in the town are 3 – 5 Mbps download and the fastest DSL is not much over 1.5 Mbps. And yet both carriers advertise products at many times those speeds.

This would just be a big annoyance if it wasn’t for the fact that the FCC and others use the data gathered to talk about what a great job the carriers are doing in this country to supply broadband. I recently saw an announcement that 98% of households now have broadband availability. And since the FCC’s definition of broadband is now a download speed of 4 Mbps and an upload speed of 1 Mbps, this makes it sound like the country’s broadband problems are being solved. But announcements of this sort are based upon lies and exaggerations by the carriers.

And since the whole point of this data gathering effort is to formulate policies to spur the carriers to do better, letting the carriers self-report whatever they want is like asking the fox to go count the eggs in the henhouse every morning. There is no practical penalty against a carrier advertising any speed they want or reporting falsely to the FCC. And it’s a lot easier, as it is with the Oregon example, for the incumbent providers to gear all of their advertising in a state around the urban markets. I have no idea if those incumbents in Oregon can actually deliver the advertised speeds in Portland, but I know for a fact that they do not do so outside of Portland.

The FCC is also changing the way that they gather information about VoIP lines. But I think the days when they could gather any meaningful data about business phones in this country are over. There is such a proliferation of IP Centrex and other VoIP technologies that the carriers don’t even know what is being delivered. Consider this:

  • It’s now possible to use one number for a thousand lines in a call center or instead to give a thousand numbers to one phone.
  • There is a proliferation of resellers in the market who buy numbers and 911 from larger carriers so that they don’t have to become a CLEC. And these resellers can then deliver a wide variety of business voice services over anybody’s data connection. These carriers will not be reporting what they are doing to the FCC because most of them are not certified as carriers but rely on the certification of the CLEC that gave them numbers.  Nobody in the FCC reporting chain is going to know about or report these kinds of customers and lines. And it gets worse because I know of many cases now of resellers of these resellers. Literally almost anybody can become a carrier overnight reselling these services. It’s back to the wild west days we used to see with long distance resale. I’m expecting to go to a telecom convention soon and see the shark-skin suits again.

At Least We are Not Europe


In this country the FCC has undertaken various policy initiatives to promote broadband. However, except for some universal service funding that will bring broadband for the first time to tribal areas and very rural places, these initiatives come with no federal money. And so the real broadband policy in the country is to wait for the private sector to build the infrastructure. The FCC may make proclamations about creating gigabit cities, but it’s completely up to the private sector to make it happen.

And we all know how that is working out. We have a checkerboard of broadband coverage. At one end of the spectrum are the fiber networks – Google and a few others bringing gigabit fiber, Verizon with FiOS, and many smaller communities with fiber built by municipalities or independent telephone companies. In the middle, most metropolitan areas are served by decently fast cable modem service and ADSL2 DSL. And then there are a lot of smaller cities and rural communities where the DSL and the cable modems are a generation or more old and deliver far less bandwidth than advertised. And we have many rural areas still with no broadband.

But what we have, by and large, is still better than what has been happening in Europe. And this is because our regulatory policy for last-mile connectivity is mostly hands-off while the European markets are heavily regulated. After the European Union was formed the European regulators went for a solution that promoted low prices. They have required that all large networks be unbundled for the benefit of multiple service providers. This has turned out to be a short-term boon for consumers because it has brought down prices in every market where multiple providers are competing.

But there is a big catch and the European policy is not going to work out well in the long-run. Over the last five years the per capita spending on new telecom infrastructure in Europe is less than half of what it is in the US, and this is directly due to the unbundling policy. Network owners have no particular incentive to build new networks or upgrade existing ones because it brings their competitors the same advantages they get.

In the long-run, Europe is going to fall far behind everybody else in fiber deployment because nobody wants to invest in fiber to connect to homes and businesses. There have been several major fiber initiatives in recent years in Europe, but these have largely been driven by large cities that are spending the money on the fiber infrastructure, much as is happening with some cities here. But the kinds of companies that ought to be investing in last-mile fiber in Europe, the cable companies and the telcos, are not doing so.

We tried something similar here for a few years. When the Telecommunications Act of 1996 was enacted, one of the major provisions was that the RBOCs (Bell companies) had to unbundle their networks, much as is being done in Europe. This was to spur competition by allowing new competitors to get a start in the business without having to invest in a new network. And this brought short-term benefits to consumers for a while. Companies were leasing RBOC unbundled loops and providing voice and data (DSL at the time) to businesses and residences all over the country.

But the FCC didn’t go the whole way like they did in Europe or else they would have also unbundled the large cable networks in this country. The unbundled telecom network business plans broke apart after cable modem service began winning the bandwidth war. And of course, there was the telecom crash that killed the larger new competitors. There are still a few companies out there pursuing this unbundled business model, but for the most part it didn’t work. And the reason it didn’t work is that it is a form of arbitrage. The business plan only worked because federal regulators made the RBOCs unbundle their networks and then state regulators set the prices for the network elements low to spur competition. But the services the competitors were able to offer were no better than what the RBOCs could offer on the same networks.

It’s always been clear to me that you can’t build a solid business on arbitrage. A smart provider can take advantage of temporarily low prices to make a quick profit when they find arbitrage, but they must be ready to ditch the business and run when the regulatory rules that created the opportunity change.

And Europe is currently engaged in one gigantic arbitrage situation. There are multiple service providers who are benefitting from low network costs, but with no burden to make capital investments. Customers there are winning today due to the lower prices that competition has brought. But in the long run nobody wins. The same rules that are making prices low today are ensuring that nobody makes any serious investment in building new fiber networks. So the competitors will fight it out on older networks until one day, when the arbitrage opportunity dies, the competitors will all vanish like the wind. We know it will happen because it happened here. The CLECs in this country had tens of millions of customers, and they disappeared from the market and stranded those customers in a very short period of time.

The only policy that is really going to benefit consumers here, or in Europe, is one that fosters the building of state-of-the-art networks. The commercial providers have not stepped up nearly enough in this country and there is still not a lot of fiber built to residences. But in Europe it’s even worse. So, as much as I read about people criticizing the broadband policies in the US, I have to remind myself – at least we are not Europe.


Is it Possible to do a Valid Phone Survey?

Telephone surveys have always been a staple of doing research in the business and political arenas. Surveys have been given to random samples of households to find out how the public as a whole feels about various topics. And surveys have been effective. The whole point of a survey is to sample a relatively small number of people and have good faith that the results of the survey represent the opinions of the public as a whole.

But there has been such a large drop in the number of households with landlines that one has to ask if it is still possible to do a valid telephone survey. The percentage of households with landlines has declined greatly and nationwide is estimated to now be below 60%. We recently heard of a community in Colorado where fewer than 45% of households still have a landline.

The whole point of doing a survey is so that you can rely on the results to tell you something meaningful about the whole population. And there are several aspects to conducting a survey that are mandatory if the results are to be believable. In order to be valid, a survey must be delivered randomly to a sufficient proportion of the universe being sampled.

And therein lies the problem. I think it’s a valid question to ask if households who still use landlines are representative of the universe of all households. I think there is a lot of evidence that they are not representative. Telecom carriers everywhere are reporting that households that drop landlines are younger, more tech savvy and more innovative than households that keep landlines.

And so, in statistical terms, one must ask the hard question of whether a survey given only to households with landlines is still representative of the whole population. And the answer might be sometimes, depending upon what is being asked. But for most of the purposes I see surveys used for, my gut tells me that landline households are no longer the same as all households.

For example, say that you wanted to ask how many people in a city wanted to get a gigabit of bandwidth. If you survey households with landlines you are mostly talking to older households and households with kids. You are probably not going to be talking to younger, tech-savvy households whose lifestyle eschews landlines. And I think you are going to get a skewed answer that you cannot believe. A larger percentage of the landline households would likely be uninterested in gigabit speeds, while the households most likely to want them were never asked. And so, when you summarize your survey results you are not going to have a believable estimate of the number of people who would be interested in gigabit speeds – which was the whole point of doing the survey.
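A small simulation makes the skew concrete. This sketch assumes the 60% landline figure from above and invents the interest rates for the two groups purely for illustration:

```python
import random

random.seed(1)

# Hypothetical population: 60% landline households, 40% wireless-only
# (the 60% is from the post; the interest rates below are invented).
def household():
    has_landline = random.random() < 0.60
    # Assume wireless-only homes are far more interested in gigabit service.
    wants_gigabit = random.random() < (0.20 if has_landline else 0.60)
    return has_landline, wants_gigabit

population = [household() for _ in range(100_000)]

true_rate = sum(w for _, w in population) / len(population)
landline_only = [w for hl, w in population if hl]
surveyed_rate = sum(landline_only) / len(landline_only)

print(f"true interest:             {true_rate:.1%}")      # ~36%
print(f"landline-only survey says: {surveyed_rate:.1%}")  # ~20%, badly skewed
```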

There might be a way around this, but it is hard to pull off. If you can find a way to randomly call households in the town that includes landline and cellphone households, then you are again sampling the real universe of households. But this is a problem for several reasons:

  • If you are already in business you are allowed to call any or all of your own customers. But as soon as you try to call people in an area who are not your customers you must follow the Do Not Call rules, which say that it is illegal to call people who have registered to not get junk calls. You can obtain lists of such people, but it adds cost and effort to the survey.
  • Then you must have access to a database that has a telephone number for everybody, and these rarely exist. Maybe some local government or utility might have such a list, but they can’t share these lists with anybody else due to privacy issues.
  • Even if you have this kind of list, FCC rules restrict calling cell phones to conduct a survey. One problem is that there are still plenty of customers on fixed-minute cellular plans, and a lot of surveys require 20 minutes or more. If you call cell phones anyway you are breaking the rules, so the least you should do is tell cell phone users up front that they can opt out of the call. But if enough cell phone users refuse to take the survey, then you are back to having an invalid sample.
  • You can’t solicit cell phone households to give their phone numbers for purposes of conducting a survey. As soon as you do that the sample is not random and we are back to square one.

A non-statistician might think, “As long as the results are close, I am okay with the survey not being entirely valid”. And they would be wrong. If a survey isn’t done properly, then there is no validity to the results. You do not want to make any important business decision based upon an invalid assumption. There are enough ways to fail in business and you shouldn’t add the sin of relying on false assumptions to the list of reasons why your business plan didn’t succeed.

There are other ways to do surveys, such as going door-to-door, but other kinds of surveys are usually costlier and have their own potential pitfalls. We might soon be approaching the day when surveys disappear from our lexicon of useful business tools.


FCC Makes Changes to 60 GHz Spectrum


On August 12, 2013 the FCC, in ET Docket No. 07-113, amended the rules for outdoor use of the 60 GHz spectrum. The changes were prompted by the industry to make the spectrum more useful. This spectrum is commonly known as millimeter-wave spectrum, meaning it has a very short wavelength, and it operates between 57 GHz and 64 GHz. Radios at frequencies this high have very short antennae, which are typically built into the unit.

The spectrum is used today in two applications: a) outdoor short-range point-to-point systems used in place of fiber, such as connecting two adjacent buildings, and b) in-building transmission of high-speed data between devices, such as sending uncompressed high-definition (HD) video between devices like Blu-ray recorders, cameras, laptops and HD televisions.

The new rules modify the outside usage to increase power and thus increase the distance of the signal. The FCC is allowing an increase in emissions from 40 dBm to 82 dBm which will increase the outdoor distance for the spectrum up to about 1 mile. The order further eliminates the need for outside units to send an identifying signal, which now makes this into an unlicensed application. This equipment would be available to be used by anybody, with the caveat that it cannot interfere with existing in-building uses of the spectrum.
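For a rough sense of those numbers, here is a back-of-the-envelope sketch in Python. The dBm conversion and free-space path loss formula are standard; the oxygen absorption figure near 60 GHz (roughly 15 dB/km) is an approximation, and rain fade would add more loss on top:

```python
import math

def dbm_to_watts(dbm: float) -> float:
    """Convert a power level in dBm to watts."""
    return 10 ** (dbm / 10) / 1000.0

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB for a line-of-sight link."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# The old and new emission limits from the order:
print(dbm_to_watts(40))   # 10.0 W
print(dbm_to_watts(82))   # ~158,000 W EIRP, only reachable with a very
                          # high-gain, tightly focused antenna

# Rough loss over the ~1 mile (1.6 km) reach the order enables, at 60 GHz:
loss = fspl_db(1.6, 60) + 15 * 1.6   # free space plus ~15 dB/km oxygen absorption
print(round(loss, 1))                # ~156 dB before any rain fade
```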

One of the uses of these radios is that multiple beams can be sent from the same antenna site due to the very tight confinement of the beams. One of the drawbacks of this spectrum is it is susceptible to interference from heavy rain, which is a big factor in limiting the distance.

Radios in this spectrum can deliver up to 7 Gbps of Ethernet (minus some for overhead), so this is intended as an alternative to fiber drops for buildings needing less bandwidth than that limit. A typical use might be to connect multiple buildings in a campus or office park environment rather than having to build fiber. The FCC sees this mostly as a technology to serve businesses, probably due to the cost of the radios involved.

Under the new rules the power allowed by a given radio is limited to the precision of the beam created by that radio. Very precise radios can use full power (and get more distance) while the power and distance are limited for less precise radios.

The FCC also sees this as an alternative for backhaul to 4G cellular sites, although one mile is a rather short reach for that purpose; most 4G sites within a mile of fiber have already been connected.

This technology will have a limited use, but there will be cases where using these radios could be cheaper than installing fiber and/or dealing with inside wiring issues in large buildings. I see the most likely use of these radios to get to buildings in crowded urban environments where the cost of leasing fiber or entrance facilities can be significant.

The 60 GHz spectrum has also been allowed for indoor use for a number of years. Used indoors, the 60 GHz band has a lot of limitations related to both cost and technical issues. The technical limitations are that 60 GHz must be line-of-sight and the signal doesn’t go through walls. The transmitters are also very power-hungry and require big metal heat sinks and high-speed fans for cooling. Even if a cost-effective 60 GHz solution were available tomorrow, battery-operated devices would need a car battery to power them.

One issue that doesn’t get much play is the nature of the 60 GHz RF emissions. 60 GHz gear can radiate up to 10 watts with the spectrum mask currently in place for indoor operation. People are already concerned about the 500 mW from a cell phone or WiFi, and it is a concern in a home environment to have constant radiation at 10 watts of RF energy. That’s potentially a hundredth of the output power of a typical microwave oven radiated in your house and around your family all of the time.

Maybe at some point in the distant future there may be reasonable applications for indoor use of 60 GHz in some vertical niche market, but not for years to come.


Doing Away With Regulations


In a process that most carriers probably don’t know about, any carrier can petition the FCC to get rid of or modify any regulation that it no longer thinks is necessary. This is an ongoing process, and every two years the FCC produces a biennial report summarizing the requests that have been made as well as the FCC’s responses to them. The latest biennial report, DA-13-1708A1, was issued yesterday.

For the most part this is pretty dry regulatory stuff, but some of the changes that carriers request are significant and affect a lot of carriers. While many of the requests are to eliminate reports to the FCC, many are more substantial. In reading through this year’s report one will notice that Verizon appears more than any other carrier, and one must imagine that they have somebody on staff dedicated to removing regulation.

Here are some of the issues investigated by the FCC in this latest report:

  • CenturyLink and Verizon advocate eliminating the continuing property records (CPRs) required by Part 32. These are detailed asset logs showing the cost, age and type of each asset in a company and must be updated each year for both additions and retirements. Even for small LECs, producing CPRs can be expensive. The FCC has now eliminated the requirement for CPRs for price cap carriers but still requires them for rate-of-return carriers.
  • Verizon asked that the Eligible Telecommunications Carrier (ETC) rules in Part 54 be modified so that an ETC is no longer required to serve customers in areas where the carrier gets no USF support, and also in areas where it is unprofitable to serve with landline but where customers have a competitive alternative. Verizon asks to get rid of its Lifeline responsibilities in such areas, and effectively be able to walk away from serving those customers. The FCC did not agree to remove these rules but instead wrapped the request into the Connect America Fund and the Lifeline and Link-up Reform and Modernization proceedings.
  • USTelecom asked the FCC to remove the requirement to notify the FCC when a carrier wants to replace legacy technology with an IP broadband technology covered by Part 63. For example, this would allow a carrier to stop offering copper services if it offers something else, such as what Verizon wants to do on Fire Island due to hurricane damage. The FCC declined this request.
  • USTelecom asked that ILECs not be required to have separate subsidiaries for offering in-region long distance as required by Part 64. The FCC concluded that this requirement no longer applied to ILECs subject to price caps. But the rules remain in effect for rate-of-return carriers.
  • Verizon asked the FCC to complete access reform by eliminating originating access charges as required by Part 69. The FCC noted that this was more properly addressed in the ongoing USF/ICC Transformation FNPRM.
  • NTCH asked that the FCC eliminate requirements to notify the FCC of temporary cell phone towers as required by Part 17. Temporary towers are often used during the process of relocating existing towers or when repairing towers after a disaster. The FCC responded by forbearing from the existing rules for towers that will be in place for fewer than 60 days and that meet certain other conditions.
  • Verizon asked the FCC to eliminate the requirement that it notify the FCC within 120 minutes for major network outages per Part 4. Verizon noted that they have as many as 1,000 such outages every year. The FCC did not agree to the request.
  • The Telecommunications Industry Association (TIA) asked that some of the rules concerning standards for hearing aids and volume controls for hard-of-hearing sets in Part 68 be eliminated due to new TIA standards. The FCC responded by issuing a Public Notice and asking if there should be a rulemaking for the issue.

As you can see from just this sample of the issues covered in this docket, the FCC is constantly being challenged by carriers to eliminate regulations. Any carrier can make such a request, and there were dozens of requests considered in this latest two-year cycle. Like sausage-making, regulation is not always a pretty picture, but the FCC does eliminate regulatory requirements every year that it deems no longer serve the public benefit.


Is Wireless Really Better than Wires?


It is clear that the FCC prefers wireless as the broadband solution for rural areas. It seems like they badly want every rural household in the country to get some kind of broadband just so they can take this issue off their plate. Just about every bit of policy decided in the last few years has a bias towards wireless.

For instance, the historic Universal Service Fund, which was used to promote rural telephony over copper, is being transitioned into the new Connect America Fund (CAF) that will instead promote high-speed data in rural areas. There are several aspects of the CAF that clearly ensure the funds will go mostly to wireless carriers. The bulk of the funding will eventually be distributed by a reverse auction. This is an auction where the broadband providers in a given area compete for the funding, and the one who bids the lowest amount of subsidy per customer will receive the funds.
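The selection rule itself is simple. Here is a minimal sketch with hypothetical bidders and subsidy requests:

```python
# Hypothetical bids for one rural area: (provider, subsidy requested per customer).
bids = [
    ("Rural Telco A - fiber",     48.00),
    ("Rural Telco B - DSL",       35.00),
    ("National wireless carrier", 12.00),
]

# The reverse auction awards the funding to the lowest subsidy request.
winner = min(bids, key=lambda bid: bid[1])
print(winner)  # ('National wireless carrier', 12.0) -- deep pockets can always underbid
```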

The first time I read the reverse auction rules my first thought was that this money is all going to wireless companies. The rules strongly favor companies who can provide data over large areas. Any smaller company who wants CAF funds to help pay for a rural wired network can be undercut by the largest wireless companies. AT&T and Verizon are two of the richest and most successful companies in the country. They pay many billions of dollars of dividends annually and they can afford to underbid any rural landline company for subsidy, simply because they do not need it. But of course they will bid in the reverse auctions and take the subsidies, because the rules allow them to.

There are also parts of the CAF that can be used to build new broadband infrastructure and these funds also favor wireless companies. The funds get distributed by complicated rules that have a bias to get broadband to customers at the lowest cost per subscriber. And of course, there is no cheaper way to cover a large rural footprint than with wireless. Wireless companies are also going to get a lot of this infrastructure funding.

Meanwhile, AT&T recently told the FCC that they were going to introduce a plan to drop the copper for ‘millions’ of rural subscribers. And if they are successful then their rural subscribers can expect to be told to get cell phones rather than landlines. And for voice telephony this might not be such a bad thing. But do we really want to relegate a large bunch of the US geography to only having cellular data?

Today there is clearly a broadband gap with some rural areas still stuck with dial-up Internet access. And so getting them some kind of faster data seems like a reasonable plan. The FCC has set the definition of broadband to be the capability of receiving 4 Mbps download. And it’s obvious that they set that limit with rural areas in mind.

And so over the next decade more and more of rural America will be getting cellular data that will meet, or come close to meeting the FCC’s definition of broadband. But meanwhile, the cities have already far surpassed those speeds. There are very few cities left where the average home can’t get speeds of between 10 Mbps and 20 Mbps. There are usually cheaper alternatives in the range of 5 Mbps to 7 Mbps, but the faster speeds are widely available. And many places have much faster speeds available.

The FCC itself has promoted the availability of gigabit bandwidth and companies are responding. Google is bringing this speed to Kansas City, Austin and Provo and AT&T has promised to match them in Austin. CenturyLink is bringing a gigabit to Omaha. And a number of smaller municipal and commercial providers have brought gigabit speeds to other towns and cities scattered across the country. And one can expect the gigabit movement to grow rapidly.

It’s common knowledge that household bandwidth use continues to grow, with no end in sight. As networks provide more data, households find ways to use it. Video has been the recent driver of the explosion in data usage, and now it looks like the Internet of Things will probably be the next big bandwidth driver.

Have we really solved the rural bandwidth gap if people in those areas are going to have 4 Mbps data speeds while urban areas have a gigabit? Obviously the rural areas will continue to be left behind and they will fall further behind than today. Just a few years ago the rural areas had dial-up and the cities had maybe 5 Mbps. But a gap between a rural world at single digit megabit speeds with the cities at gigabit speeds is a much larger gap and the rural areas will not be able to share in the benefits that bandwidth will bring.

The only long-term solution is to build fiber to rural America. Obviously nobody is going to build fiber to single homes at the top of mountains or at the end of ten-mile dirt roads, but I have been working on business plans that show that fiber can make sense in the average rural county. But it is really hard to get rural fiber funding since such projects tend to just pay for themselves and are not wildly profitable.

It’s possible that the FCC’s universal service plans will work and that a lot of the 19 million rural people without broadband will get some sort of rudimentary broadband. But meanwhile, the rest of the country will be getting faster and faster bandwidth. And so, before the FCC declares ‘mission accomplished’ I think we need to have more of a debate about the definition of broadband and what is acceptable. I hate to tell the FCC, but the rural broadband issue is not going to go away even after rural areas all have cellular data.


Spectrum Winners and Losers

AT&T posted a short statement on their public policy blog called ‘Inconvenient Facts and the FCC’s Flawed Spectrum Screen’. In that blog post they complained that the FCC had failed to apply the spectrum screen to Softbank’s acquisition of Sprint and Sprint’s acquisition of the rest of Clearwire. And AT&T is right. The FCC has been incredibly inconsistent in the way it looks at wireless acquisitions and mergers.

So what is the spectrum screen? The spectrum screen is a set of internal rules at the FCC that they use to determine if any wireless carrier owns too much spectrum in a given market. Historically the FCC had a generic rule that said that no one company could own more than one-third of the spectrum usable for wireless in a given geographic area. This spectrum screen was applied both to attempts of wireless carriers to buy new spectrum or to mergers between wireless carriers.
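A sketch of how such a screen test works is below; the 285 MHz total is an illustrative figure chosen so that the one-third threshold matches the 95 MHz screen discussed below:

```python
TOTAL_SUITABLE_MHZ = 285              # spectrum deemed suitable for wireless in a market
SCREEN_MHZ = TOTAL_SUITABLE_MHZ / 3   # the one-third rule -> 95 MHz

def fails_screen(current_mhz: float, acquired_mhz: float,
                 screen_mhz: float = SCREEN_MHZ) -> bool:
    """True if post-transaction holdings in this market exceed the screen."""
    return current_mhz + acquired_mhz > screen_mhz

print(fails_screen(70, 20))   # False: 90 MHz stays under the 95 MHz screen
print(fails_screen(70, 30))   # True: 100 MHz trips the screen and triggers scrutiny
```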

The FCC has been very inconsistent in the way they apply the existing screen. Last September they announced that they were going to look at the way the spectrum screen ought to work. But meanwhile, during the last year the screen has been applied (or ignored) in the following ways:

  • When the FCC looked at the proposed AT&T / T-Mobile merger they rejected the merger in part because they said that the acquisition would fail the screen test in 274 CMAs that covered 71 of the top 100 markets and 66% of the US population. However, the FCC fudged the spectrum screen in coming up with those numbers. At that time the spectrum screen set the maximum amount that any one carrier could own in one market at 95 MHz, which was one-third of the spectrum available for wireless carriers. But in coming up with their conclusion the FCC lowered that threshold to 90 MHz in judging the merger. That might not sound like a big difference, but using the proper 95 MHz screen would have cut the number of markets affected by 84 and reduced the overall problem to less than 50% of the top 100 markets and 50% of the US population. That is still a lot of places where the proposed merger would have failed the spectrum screen, but AT&T had announced plans to divest spectrum as needed to meet the FCC test. The FCC made this change in the spectrum screen without any public input.
  • When Verizon acquired spectrum in the 1.7 to 2.1 GHz band the FCC counted it fully against their spectrum screen. They did the same when AT&T acquired 2.3 GHz spectrum.
  • And then there is the recently announced approval for Softbank to acquire Sprint and Clearwire spectrum. The Clearwire spectrum at 2.5 GHz is right next to the 2.3 GHz spectrum recently acquired by AT&T. While the FCC fully counted the spectrum AT&T purchased against the spectrum screen, in the Softbank acquisition the FCC counted only 55.5 MHz of the Clearwire spectrum against the new Softbank spectrum screen even though there is an average of 140 MHz available in most of the Softbank markets.

So AT&T has a legitimate gripe. The FCC seems to apply the spectrum screen to get the results they want. It looks a lot more like the FCC is picking market winners and losers than they are protecting the public. The spectrum screen was established in the first place to promote competition. The FCC wanted to make sure that a given carrier did not get so much spectrum in a major market that they could effectively close out competition. They also didn’t want carriers to be able to hoard spectrum for future use. But the FCC no longer seems to be using market protection as the criteria of deciding who can and cannot merge.

It’s clear that the FCC didn’t want AT&T and T-Mobile to merge. They thought that it was bad for competition to lose one of the major carriers in the country. But it was wrong for them to fudge the spectrum screen as a way to justify their position rather than just oppose the merger on pure competitive grounds.

And in the case of Softbank they are going in the opposite direction. They obviously want a new competitor to AT&T and Verizon and they are ignoring the spectrum screen to make sure that happens.

Why does all of this matter? Like anything else it’s a matter of money. Wireless carriers have two ways to address congestion. They can add more cell sites, closer and closer to the old ones; spectrum is reusable, and each new cell site reuses the original spectrum afresh. The other solution is to layer new spectrum onto a crowded area so that no new cell sites need to be constructed. That is much cheaper than building cell sites, and so carriers want more and different spectrum in major markets to meet the seemingly insatiable and rapidly growing demand for mobile data.

The issue is going to get a lot worse. President Obama announced a new policy that will release up to 500 MHz of new spectrum for wireless use over the next five years. So there is going to be a new land grab by all of the carriers and the FCC needs to get ready.

It just seems to me like the FCC needs to toss out the spectrum screen and come up with a new way to determine the right amount of competition. In the two biggest merger cases before them in the last few years they blatantly ignored their own spectrum screen rules to get the result they wanted. That is evidence enough that we need to stop having the fiction of a spectrum screen. If the FCC wants to be in the game of picking market winners and losers they just need to be upfront about it.


Google and Whitespace Radios


Last week Google received approval to operate a public TV whitespace database. They are the third company after Telcordia and Spectrum Bridge to get this designation. The database is available at http://www.google.org/spectrum/whitespace/channel/index.html and is available to the public. With this database you can see the whitespace channels that are available in any given market in the country.

The Google announcement stems from an FCC order from April 2012 in FCC Docket 12-36A1, which established the rules under which carriers can use whitespace spectrum. Having an authorized public spectrum database is the first step for a company to operate in the spectrum.

You may have seen recent press releases about how Google proposes to use tethered blimps to operate in the whitespace spectrum. They are calling this system ‘SkyNet’, a name that sends a few shivers up the spine of movie buffs, but the blimps are an interesting concept in that they will be able to illuminate a large area with affordable wireless bandwidth. By having their database approved, Google is now able to test and deploy the SkyNet blimps.

The whitespace spectrum operates in the traditional television bands and consists of a series of 6‑megahertz channels that correspond to TV channels 2 through 51, in four bands of frequencies in the VHF and UHF regions of 54-72 MHz, 76-88 MHz, 174-216 MHz, and 470-698 MHz. Whitespace radio devices that will work in the spectrum are referred to in the FCC order as TVBD devices.
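Here is a small sketch mapping a TV channel number to its 6 MHz slot using the band plan above:

```python
# Map a US TV channel number (2-51) to its 6 MHz frequency slot,
# using the four bands listed in the FCC order.
BANDS = [
    (2, 4, 54.0),     # channels 2-4:   54-72 MHz
    (5, 6, 76.0),     # channels 5-6:   76-88 MHz
    (7, 13, 174.0),   # channels 7-13:  174-216 MHz
    (14, 51, 470.0),  # channels 14-51: 470-698 MHz
]

def channel_to_mhz(channel: int) -> tuple[float, float]:
    """Return the (low, high) edge frequencies in MHz for a TV channel."""
    for first, last, base in BANDS:
        if first <= channel <= last:
            low = base + (channel - first) * 6.0
            return low, low + 6.0
    raise ValueError(f"channel {channel} is outside the whitespace range 2-51")

print(channel_to_mhz(2))    # (54.0, 60.0)
print(channel_to_mhz(51))   # (692.0, 698.0)
```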

For a fixed radio deployment, meaning a radio that always sits at one home or business, a TVBD radio must check back with the whitespace database daily to make sure what spectrum it is allowed to use at its location. Mobile TVBD radios have to check back more or less constantly. It is important for a radio to check with the database because there are licensed uses in these bands, and a whitespace operator must always give up space to a licensed use of the spectrum as it arises.
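The control logic for a fixed radio might look like the following sketch. The database query and retune functions are placeholders, since the real query protocol and radio control interface are not described here:

```python
import time

RECHECK_SECONDS = 24 * 60 * 60    # a fixed TVBD radio must re-check daily

def query_whitespace_database(lat: float, lon: float) -> set[int]:
    """Placeholder: a real radio would query an authorized database
    (Telcordia, Spectrum Bridge or Google) for vacant channels here."""
    return {21, 27, 33}           # canned answer, for illustration only

def retune(channels: set[int]) -> None:
    """Placeholder for retuning the transmitter onto allowed channels."""
    print(f"transmitting on channels {sorted(channels)}")

def run_fixed_tvbd(lat: float, lon: float, cycles: int = 1) -> None:
    """Daily control loop for a fixed whitespace radio."""
    for cycle in range(cycles):
        allowed = query_whitespace_database(lat, lon)
        if allowed:
            retune(allowed)       # vacate any channel a licensed user has claimed
        else:
            print("no vacant channels: transmitter must shut down")
        if cycle + 1 < cycles:
            time.sleep(RECHECK_SECONDS)

run_fixed_tvbd(44.98, -93.27)
```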

This means that TVBD radios must be intelligent in that they need to be able to change the spectrum they are using according to where they are deployed. Whitespace radios are also a challenge from the perspective of radio engineering in that they must be able to somehow bond multiple paths from various available, yet widely separated channels in order to create a coherent bandwidth path for a given customer.

There are whitespace radios on the market today, but my research shows that they are still not particularly affordable for commercial deployment. But this is a fairly new radio market, and this is part of the normal evolution one sees after new spectrum rules hit the market. Various vendors develop first-generation devices that work in the spectrum, but the long-term success of any given spectrum generally depends upon at least one vendor finding a way to mass produce radios and drive down the unit cost. There have been some spectacular failures among spectrum bands released in the last few decades, such as MMDS, which failed because affordable devices never materialized.

But one might hope that Google will find the way to produce enough radios to make them affordable for the mass market. And then maybe we will finally get an inkling of Google’s long-term plans. There has been a lot of speculation about Google’s long term plans as an ISP due to their foray into gigabit fiber networks in places like Kansas City and Austin. And now, with SkyNet we see them making another deployment as an ISP in rural markets. If Google produces proprietary TVBD radios that they only use for themselves then one has to start believing that Google has plans to deploy broadband in many markets as an ISP as it sees opportunities. But if they make TVBD radios available to anybody who wants to deploy them, then we will all go back to scratching our heads and wondering what they are really up to.

I have a lot of clients who will be interested in whitespace radios if they become affordable (and if they happen to operate in one of the markets where there are enough whitespace channels available). Like others I will keep watching this developing market to see if there is any opportunity to make a business plan out of the new spectrum opportunity.


The National Broadband Map


Last Thursday the FCC voted to take over the data collection for the National Broadband Map. The Map was created as part of the funding for broadband supplied a few years ago by the Stimulus package. The Map was created and administered by the NTIA (National Telecommunications and Information Administration) with input from the states, and that funding is now running out.

Back when the Map was suggested I thought the concept was a good one. But as soon as I saw that the data gathered for the Map was to be self-reported by carriers I knew that there were going to be problems. And sure enough, when the first-generation Map was produced it was full of errors – big errors.

I work with a lot of rural communities, and I have reviewed the Map in many areas of the country and compared it to the actual deployment of broadband. Some communities have developed their own maps – and they did it the hard way. They sent people around to see where broadband was available. A lot of this can be done by somebody who knows how to look up at the cables. It’s easy to know where cable modems are available by the presence of coaxial cable on the poles. And rural DSL generally has repeaters that can be spotted by the eagle-eyed observer. And it’s not hard to look at your cell phone to see how many bars of data you can get. But the best test of where broadband is available is knocking on doors and asking people what they are able to buy.

As an example of what I found, let me talk about the issues found in just one county in Minnesota. The Map showed that most of the County had landline broadband availability. The County is very typical of rural areas, and the County Seat is the largest town in the County. There are half a dozen much smaller towns and everything else is rural. A large chunk of the rural area is a national forest where very few people live. In the rural areas most people live in close proximity to the roads.

The reality in this County is that even in several of the smaller towns the DSL is so slow that it is hard to think of it as broadband. It’s more like dial-up plus. And there was no cable modem service from the cable company outside of the County Seat. And as is typical with DSL, as one goes outside of the towns the quality of the DSL quickly degrades with distance from the DSL hub. We’ve always called this the donut effect: large areas with no broadband surrounding rural towns that have DSL and/or cable modems.

The Map also showed that almost every populated area of this Minnesota County had 3G wireless data available. It’s a very hilly and rugged place and probably half of the county by area can’t even get cellular voice calls, let alone data. But even where voice is available there are many areas that can’t get cellular data. The Map was just wrong about this.

Everywhere that I have helped communities look at the Map we have seen the same thing. The Map shows broadband that isn’t there. It shows cellular data coverage that isn’t there. And it often shows providers that are supposedly serving the counties that nobody ever heard of.

And this is not true for just rural counties. I have helped two suburban counties near large cities look at the Map and they found the same situation. The Map showed areas that are supposed to have broadband where their citizens still have dial-up or satellite. And cellular coverage was exaggerated on the Map.

An obvious question is why this matters. The National Broadband Map has been around for only a few years, and anybody who has ever looked at it knows it is full of inaccuracies. The problem is that the federal government now relies on the Map for several purposes. For instance, if you want to get federal money by loan or grant to deploy rural broadband, the assumption is that the Map is correct. It is then your responsibility to show where the Map is wrong.

And the FCC uses the Map when it talks about the availability of Broadband in rural America. The Map has been overlaid with Census data to count how many households can get broadband. This produces a very distorted picture of who has broadband. There are pockets of people without broadband in even some of the most populated counties in the country and the Map simply misses them. And in rural areas the Map can be very wrong.

The FCC just took over responsibility for the Map. From my perspective they either need to do it right or get out of the mapping business. It’s not easy to get it right, but it can be done. One of the easiest steps they could take would be to give counties the authority to clean up the maps for their areas. Many of them would be glad to do that. And broadband availability is not static – areas gain or lose broadband all of the time. If the FCC won’t take the time to get the Map right they should just let it die as another impractical idea.


Switching in an IP Environment


In this industry there are always interesting fights going on behind the scenes. In fact, it seems like a lot of the policies made by the FCC are in response to battles being waged between carriers. As the FCC intervenes in these fights they end up creating policy as they help solve issues.

This letter is correspondence with the FCC about a current dispute in which Verizon and AT&T are contesting the way they are being billed by Bandwidth.com and Level3. This fight is an interesting one because it asks the FCC to affirm that it supports a migration to an all-IP network.

The dispute is over what is called OTT (Over-the-top) VoIP. OTT in this case means that there are voice calls being made from a service provider’s network for which the service provider is not providing the switching. Instead the service provider is buying switching from a CLEC like Level3. And all of the calls involved are VoIP calls, meaning that they are being delivered from the customers to the switching CLEC using the IP network rather than the public switched telephone network.

Here is how this might happen, although there are other configurations as well. For this to be considered VoIP, the network in question must be an IP network to the customer. That means it is either a fiber-to-the-home network, DSL over a copper network, or a cable system that has been upgraded to send the voice over the data path. In a traditional TDM network the calls from customers are routed directly to a voice switch, and that switch decides what to do with each call based upon the numbers that were dialed. But in this scenario there is no switch in the subscriber’s network. Instead, when a customer makes a call, a signal is sent to wherever the switch is located telling it where the customer wants to call. That remote voice switch then tells the network owner where to send the call. In a softswitch environment it is no longer necessary for the call to actually touch the switch, but the switch is still the device that decides how to route the call.

The parties are fighting about whether access charges ought to be charged for an OTT VoIP call. Access charges are fees that long distance carriers pay at both the originating and terminating end of a call to compensate the network owner at each end for processing the call. Verizon and AT&T don’t want to pay the switching component of the access charges for these calls. They are arguing that since there is not a physical switch in the originating network that such charges aren’t warranted.
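To see the money at stake, here is a simplified sketch with invented per-minute rate elements (real rates come from each carrier’s tariff and vary widely):

```python
# Hypothetical per-minute originating access rate elements, for illustration only.
RATE_ELEMENTS = {
    "carrier_common_line": 0.0050,
    "local_switching":     0.0040,   # the element in dispute for OTT VoIP
    "transport":           0.0020,
}

def access_bill(minutes: int, include_switching: bool) -> float:
    """Originating access owed to the local carrier for a batch of minutes."""
    rate = sum(v for k, v in RATE_ELEMENTS.items()
               if include_switching or k != "local_switching")
    return minutes * rate

print(access_bill(1_000_000, include_switching=True))   # 11000.0
print(access_bill(1_000_000, include_switching=False))  # 7000.0 -- what AT&T and
                                                        # Verizon want to pay
```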

Bandwidth.com and Level3 are arguing that the switching is being performed regardless of the location of the switch. They point out that for the FCC to rule otherwise would be counter to the FCC’s desire for the telephony world to migrate to an all-IP environment.

If the FCC rules that AT&T and Verizon are right, they will be saying that a carrier performing a switching function on legacy TDM technology can bill for performing that function but that somebody doing it more efficiently in an IP environment cannot. I just published a blog yesterday talking about ways to share a softswitch and that is exactly what is happening in this case. In an all-IP environment the network can be more efficient and not every carrier needs to buy and operate a switch. They can instead contract with somebody else to switch calls for them which is easy to make happen in an IP environment. Access charges are designed to compensate local carriers for the cost of performing certain functions and one has to think that the network owner in this case is still having to pay for the switching function and should get to recover some of that cost.

In fact, there has been switch sharing for years, even in the TDM world. I know several rural LECs who lease switching from their neighbors and have not owned a switch for decades, and they have always billed the switching access charge element. That element reimburses you for the cost of switching, and it really shouldn’t matter whether that cost is the depreciation on a box you paid for or a fee you pay to use somebody else’s box. Cost is cost, and the key fact is that calls can’t be made or received in an area unless somebody does the switching.

I always find arguments by the large RBOCs to be interesting because they wear many hats. AT&T and Verizon are wireless carriers, LECs and long distance companies, and often when one part of these large companies makes a regulatory argument it is contrary to the interest of another branch of the company. In this case the long distance branches of the RBOCs are looking for a way to avoid paying access charges. But the LEC sides of both Verizon and AT&T share switching, and they no longer have a switch in every historic exchange area. So to some degree these companies are arguing against something that another branch of their own company is doing. And this is often the case in regulatory arguments, since these companies do so many things.

Hopefully the FCC will agree with Bandwidth.com and Level3. If they rule otherwise they will be telling carriers that it is not a good idea to establish switch-sharing arrangements that are more efficient than having every carrier buy the same expensive boxes. If the FCC really wants the telco world to move to IP they need to get rid of any regulatory impediments that make an IP network less desirable than a legacy network. Hopefully the FCC sides with efficiency.