Ideas for Better Broadband Mapping

The FCC is soliciting ideas on better ways to map broadband coverage. Everybody agrees that the current broadband maps are dreadful and misrepresent broadband availability. The current maps are created from data that the FCC collects from ISPs on the 477 form where each ISP lists broadband coverage by census block. One of the many problems with the current mapping process (I won’t list them all) is that census blocks can cover a large geographic area in rural America, and reporting at the census block level tends to blur together different circumstances where some folks have broadband and others have none.

There have been two interesting proposals so far. Several parties have suggested that the FCC gather broadband speed availability by address. That sounds like the ultimate database, but there are numerous reasons why this is not practical.

The other proposal is a three-stage process recommended by NCTA. First, data would be collected by polygon shapefiles. I’m not entirely sure what that means, but I assume it means using smaller geographic footprints than census blocks. Collecting the same data as today using a smaller footprint ought to be more accurate. Second, and the best idea I’ve heard suggested, is to allow people to challenge the data in the mapping database. I’ve been suggesting that for several years. Third, NCTA wants to focus on pinpointing unserved areas. I’m not sure what that means, but perhaps it means creating shapefiles to match the different availability of speeds.
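To make "polygon shapefiles" concrete, here is a minimal sketch of the core operation such a database implies – testing whether a given home falls inside an ISP's reported coverage polygon. All coordinates and shapes below are invented for illustration; a real system would read actual shapefile geometry.

```python
# Hypothetical sketch: does a home's location fall inside a coverage
# polygon? Uses the standard even-odd ray-casting test. Coordinates
# are invented for illustration.

def point_in_polygon(x, y, polygon):
    """Return True if point (x, y) lies inside the polygon.

    polygon is a list of (x, y) vertices in order.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from (x, y) cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A made-up coverage polygon (say, a claimed 25/3 Mbps service area).
coverage = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]

print(point_in_polygon(2.0, 1.5, coverage))   # home inside -> True
print(point_in_polygon(5.0, 1.5, coverage))   # home outside -> False
```

The accuracy argument in the proposal comes down to shrinking those polygons: the smaller the shape, the less it blurs together served and unserved homes.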

These ideas might provide better broadband maps than we have today, but I’m guessing they will still have big problems. The biggest issue with trying to map broadband speeds is that many of the broadband technologies in use vary widely in actual performance in the field.

  • Consider DSL. We’ve always known that DSL performance decreases with distance from the DSLAM, but DSL performance is not as simple as that. DSL also varies for other reasons, like the gauge of the copper at a customer’s home or the quality of that copper. Next-door neighbors can have significantly different DSL experiences if they have different-gauge wires in their copper drops, or if the wires at one of the homes have degraded over time. DSL also differs by technology – a telco might operate different DSL technologies out of the same central office and see different performance from ADSL versus VDSL. There is really no way for a telco to predict the DSL speed available at a home without installing the service and testing the actual speed achieved.
  • Fixed wireless and fixed cellular broadband have similar issues. Just like DSL, the strength of a signal from a wireless transmitter decreases with distance. But distance isn’t the only issue – things like foliage also affect a wireless signal. Neighbors might have a very different fixed wireless experience if one has a maple tree and the other a pine tree in the front yard. To make defining the speed even harder, the speeds on wireless systems are affected to some degree by precipitation, humidity and temperature. Anybody who’s ever lived with fixed wireless broadband understands this variability. WISPs these days also use multiple spectrum blocks, so the speed delivered at any given time is a function of the particular mix of spectrum being used.
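The DSL variability described above can be sketched with a toy model. The coefficients here are invented for illustration – this is not a real loop-engineering model – but it shows why two homes at the same distance from the DSLAM can see very different speeds:

```python
# Toy illustration (invented numbers, not a real engineering model)
# of why per-address DSL speed is hard to predict: rate falls with
# loop length, but also swings with wire gauge and plant condition.

def estimated_dsl_mbps(loop_feet, gauge_awg=24, condition=1.0):
    """Rough toy estimate of ADSL2+ downstream speed in Mbps.

    loop_feet  -- copper distance from the DSLAM
    gauge_awg  -- 24 or 26 gauge copper (26 attenuates faster)
    condition  -- 0.5 (degraded plant) .. 1.0 (clean plant)
    """
    base = 24.0                       # toy near-DSLAM maximum, Mbps
    # Invented attenuation rates per 1,000 feet of loop.
    loss_per_kft = 1.8 if gauge_awg == 24 else 2.6
    rate = base - loss_per_kft * (loop_feet / 1000.0)
    return max(rate, 0.0) * condition

# Two next-door neighbors, both 8,000 feet from the DSLAM:
a = estimated_dsl_mbps(8000, gauge_awg=24, condition=1.0)
b = estimated_dsl_mbps(8000, gauge_awg=26, condition=0.6)
print(round(a, 1), round(b, 1))   # -> 9.6 1.9
```

Same distance, wildly different speeds – which is exactly why no database keyed to distance alone can predict the speed at a given address.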

Regardless of the technology being used, one of the biggest issues affecting broadband speeds is the customer home. Customers (or ISPs) might be using outdated and obsolete WiFi routers or modems (like Charter did for many years in upstate New York). DSL speeds are just as affected by the condition of the inside copper wiring as the outdoor wiring. The edge broadband devices can also be an issue – when Google Fiber first offered gigabit fiber in Kansas City almost nobody owned a computer capable of handling that much speed.

Any way we try to define broadband speeds – even by individual home – is still going to be inaccurate. Trying to map broadband speeds is a perfect example of trying to fit a round peg in a square hole. It’s obvious that we can do a better job of this than we are doing today. But I pity any fixed wireless ISP that is somehow required to report broadband speeds by address, or even by a small polygon. They only know the speed at a given address after going to the roof of a home and measuring it.

The more fundamental issue here is that we want to use the maps for two different policy purposes. One goal is to be able to count the number of households that have broadband available. The improved mapping ideas will improve this counting function – within all of the limitations of the technologies I described above.

But mapping is a dreadful tool when we use it to start drawing lines on a map defining which households can get grant money to improve their broadband. At that point the mapping is no longer a theoretical exercise, and a poorly drawn line will block homes from getting better broadband. None of the mapping ideas will really fix this problem, and we need to stop using maps when awarding grants. It’s far simpler to decide that faster technology is better than slower technology. For example, grant money ought to be available for anybody who wants to replace DSL on copper with fiber – I don’t need a map to know that is a good idea. The grant process can use other ways to prioritize areas with low customer density without relying on crappy broadband maps.

We need to use maps only for what they are good for – to get an idea of what is available in a given area. Mapping is never going to be accurate enough to use to decide which customers can or cannot get better broadband.

New Net Neutrality Legislation

On February 7, as hearings were being held on net neutrality, Congressional Republicans said they were going to offer up three different versions of a bill intended to reinstate net neutrality principles. The newest bill, the Open Internet Act of 2019, was introduced by Rep. Bob Latta of Ohio. They also offered up bills previously introduced by Rep. Greg Walden of Oregon and Sen. John Thune of South Dakota.

All three bills would reestablish rules against ISPs blocking web traffic, throttling customers, or implementing paid prioritization, which has been referred to as creating fast lanes that give some web traffic priority over other traffic. Hanging over all of these bills is a court review of a challenge to the FCC’s right to kill net neutrality – a successful challenge would reinstate the original FCC net neutrality rules. There are also a number of states poised to introduce their own net neutrality rules should the court challenge fail.

The court case and the threat of state net neutrality rules are prodding Congress to enact net neutrality legislation. Legislation has always been the preferred solution for imposing any major changes in regulation. When there’s no legislation, then rules like net neutrality are subject to being changed every time there is a new FCC or a new administration. Nobody in the country benefits – not ISPs and not citizens – when policies like net neutrality change every time there is a new administration.

These three bills were clearly influenced by the big ISPs. They include nearly identical talking points to those being promoted by NCTA, the lobbying arm of the largest ISPs, headed by former FCC Chairman Michael Powell. There are two primary differences between these bills and the original net neutrality rules that were established by the last FCC.

The first is a provision that the legislation would allow the ISPs to stray from the net neutrality principles if there is a ‘public benefit’ from doing so. That would allow ISPs to adopt any web practice they want as long as they can concoct a story about how the practice creates a public benefit. Since there are winners and losers from almost any network practice of ISPs, it wouldn’t be hard to identify those that benefit from a given practice. From a regulatory perspective, this is as close as we can come to a joke. If a regulated entity gets to decide when a regulation applies, then it’s not really a regulation.

The other big difference between the proposed legislation and the original net neutrality order is the lack of what is called a ‘general conduct standard’. The original net neutrality order understood that the Internet is rapidly evolving and that any specific rules governing Internet behavior would be obsolete almost as soon as they are enacted. ISPs and the other big players on the web are able to design ways around almost any imaginable legislative rules.

The original net neutrality order took the approach of establishing the three basic net neutrality principles but didn’t provide any specific direction on how the FCC was supposed to enforce them. The concept of the general conduct standard is that the FCC will look at each bad practice of an ISP to see if it violates the net neutrality principles. Any FCC ruling would thus be somewhat narrow, except that a ruling against a specific ISP practice would generally apply to others doing the same thing.

The original net neutrality order envisioned a cycle where the FCC rules against bad practices and the ISPs then try to find another way to get what they want – so there would be a continuous cycle of ISPs introducing questionable behavior with the FCC deciding each time if the new practice violates the intent of the net neutrality principles. This was a really clever solution for trying to regulate an industry that changes as quickly as the ISP and web world.

The proposed legislation does away with the general conduct standard. That means that the FCC would not have the ability to judge specific ISP behavior as meeting or not meeting the net neutrality standards. This would take all of the teeth out of net neutrality rules since the FCC would have little authority to ban specific bad practices. This was summarized most succinctly by former FCC Chairman Tom Wheeler who testified in the recent Congressional hearings that if Congress established net neutrality rules it ought to allow for “a referee on the field with the ability to throw the flag for unjust and unreasonable activity.”

The bottom line is that the proposed legislation would reintroduce the basic tenets of net neutrality but would give the FCC almost no authority to enforce the rules. It’s impossible to imagine these bills being passed by a divided Congress, so we’re back to waiting on the Courts or perhaps on states trying to regulate net neutrality on their own – meaning a long-term muddled period of regulatory uncertainty.

Is Broadband ‘Wildly Competitive’?

The FCC is in the process of creating its first report to Congress required by the Ray Baum Act, which is the bill that reauthorized FCC spending for 2019 and 2020. That bill requires the FCC to create a report every two years that, among other things, assesses the “state of competition in the communications marketplace, including competition to deliver voice, video, audio, and data services among providers of telecommunications, providers of commercial mobile service, multichannel video programming distributors, broadcast stations, providers of satellite communications, Internet service providers, and other providers of communications services”.

The FCC accepted comments about what should be included in its first report, and as you might imagine received a wide variety of comments from the industry and other interested parties.

In typical big carrier fashion, NCTA – The Internet & Television Association, the lobbying group representing the largest ISPs, filed with the FCC arguing that the broadband marketplace is already ‘wildly competitive’. The big ISPs have a vested interest in the FCC reaching such a conclusion, because that would mean that the FCC wouldn’t have to take actions to create more competition.

The reasoning the big carriers are using to make this claim is ironic. They argue that the FCC shouldn’t use its own 25/3 Mbps definition of broadband since the FCC is currently spending billions of dollars in the CAF II program to deploy broadband that meets a lower standard of 10/1 Mbps. They say that if US broadband is examined for the amount of competition at the lower 10/1 threshold that most markets in the US are competitive. That’s ironic because the FCC was pressured into giving all of the CAF II money to the big telcos after intense lobbying and the funds were originally intended to be awarded through a reverse auction where ISPs would have been rewarded for building broadband capable of delivering speeds up to 1 Gbps.

Further, if the FCC was to accept the idea that 10/1 Mbps is acceptable broadband then the FCC would probably be obligated to count cellular broadband as an economic substitute for landline broadband since it delivers speeds in the same range as the CAF II deployments.

However, making that same determination is impossible at faster speeds. Even the FCC’s own highly-skewed mapping data shows there are not many households in the country with two options for buying 100 Mbps service. Where households have two choices for buying 25/3 Mbps broadband, the second option is almost always DSL, which the big telcos are letting die a natural technological death, and which often delivers speeds much slower than advertised. As I’ve written about in this blog, my firm has done surveys in numerous communities where the delivered speeds for both cable modems and DSL were significantly slower than the advertised speeds – and certainly slower than the data that the big ISPs report to the FCC, which is used to create the FCC’s broadband coverage maps and other statistics.

The only way to claim that broadband is ‘wildly competitive’ is to count broadband speeds slower than the FCC’s 25/3 Mbps definition. If the FCC was to accept cellular broadband and satellite broadband as the equivalent of landline broadband, then a large majority of homes would be deemed to have access to multiple sources of broadband. I would restate the NCTA’s ‘wildly competitive’ claim to say that a majority of homes in the country today have access to multiple crappy sources of broadband.

We’ll have to see what the FCC tells Congress in their first report. I suspect their story is going to be closer to what the big ISPs are suggesting than to the reality of the broadband marketplace. This FCC already seriously considered accepting cellular and satellite broadband as an equivalent substitute for landline broadband because doing so would mean that there are not many places left where they need to ‘solve’ the lack of broadband.

The FCC finds itself in an unusual position. It gave up regulation of broadband when it killed Title II regulation. Yet the agency is still tasked with tracking broadband, and they are still required by law to make sure that everybody in the country has access to broadband. Let’s just hope that the agency doesn’t go so far as to tell Congress that their job is done since broadband is already ‘wildly competitive’.

Killing FTC Regulation?

NCTA, the lobbying group for the big cable companies, filed a pleading with the Federal Trade Commission (FTC) asking the agency not to get involved with regulating the broadband industry. When the FCC killed net neutrality, Chairman Ajit Pai promised that it was okay for the FCC to step away from broadband regulation since the FTC was going to take over much of the regulatory role. Now, a month after the net neutrality repeal went into effect, we have the big cable ISPs arguing that the FTC should have a limited role in regulating broadband. The NCTA comments were filed in a docket that asks how the FTC should handle the regulatory role handed to it by the FCC.

Pai’s claim was weak from the outset because of the way that the FTC regulates. It basically pursues corporations of all kinds that violate federal trade rules or abuse the general public. For example, the FTC went after AT&T for throttling customers who had purchased unlimited data plans. However, FTC rulings don’t carry the same weight as FCC orders. Rulings are specific to the company under investigation. Rulings might lead other companies to modify their behavior, but an FTC order doesn’t create a legal precedent that automatically applies to all carriers. In contrast, FCC rulings can be made to apply to the whole industry and can change the regulations for every ISP.

The NCTA petition asks the FTC not to pursue regulatory complaints against ISPs. For example, they argue that the agency shouldn’t single out ISPs for unique regulatory burdens, but should instead pursue the large web companies like Facebook and Google. NCTA claims that market forces will prevent bad behavior by ISPs and will punish a carrier that abuses its customers. They claim there is sufficient competition for cable broadband, such as from DSL, that customers will leave an ISP that behaves poorly. In a world where they have demolished DSL and where cable is a virtual monopoly in most markets, they really made that argument! We have a long history in this industry that says otherwise – even when carriers were regulated by the FCC, there was a long laundry list of ways they mistreated their customers.

One of the more interesting requests is that the ISPs want the FTC to preempt state and local rules that try to regulate them. I am sure this is due to vigorous activity at the state level currently to create rules for net neutrality and privacy regulations. They want the FTC to issue guidelines to state attorneys general and state consumer protection agencies to remind them that broadband is regulated only at the federal level. It’s an interesting argument to make after the FCC has punted on regulating broadband and when this filing is asking the FTC to do the same. The ISPs want the FTC to leave them alone while asking the agency to act as the watchdog to stop others from trying to regulate the industry.

I think this pleading was inevitable since the big ISPs are trying to take full advantage of the FCC walking away from broadband regulation. The ISPs view this as an opportunity to kill regulation everywhere. At best the FTC would be a weak regulator of broadband, but the ISPs don’t want any scrutiny of the way they treat their customers.

The history of telecom regulation has always come in what I call waves. Over time the amount of regulation builds up to the point where companies can make a valid claim of being over-regulated. Over-regulation can then be relieved either by Congress or by a business-friendly FCC that loosens regulatory constraints. But when regulations get too lax, the big carriers inevitably break enough rules to invite a new wave of regulation.

We are certainly hitting the bottom of a trough of a regulatory wave as regulations are being eliminated or ignored. Over time the large monopolies in the industry will do what monopolies always do. They will take advantage of this period of light regulation and will abuse customers in various ways and invite new regulations. My bet is that customer privacy will be the issue that starts the climb back to the top of the regulatory wave. The ISPs’ argument that market forces will force good behavior on their part is pretty laughable to anybody who has watched the big carriers over the years.

A Regulatory Definition of Broadband

In one of the more bizarre filings I’ve seen at the FCC, the National Cable Television Association (NCTA) asked the FCC to abandon the two-year-old definition of broadband set at 25 Mbps down and 3 Mbps up. NCTA is the lobbying and trade association of the largest cable companies like Comcast, Charter, Cox, Mediacom, Altice, etc. Smaller cable companies along with smaller telephone companies have a different trade association, the American Cable Association (ACA). This was a short filing that was a follow-up to an ex parte meeting, and rather than tell you what they said, the gist of the letter is as follows:

We urged the Commission to state clearly in the next report that “advanced telecommunications capability” simply denotes an “advanced” level of broadband, and that the previously adopted benchmark of 25 Mbps/3 Mbps is not the only valid or economically significant measure of broadband service. By the same token, we recommended that the next report should keep separate its discussion of whether “advanced telecommunications capability” is being deployed in a reasonable and timely manner, on the one hand, and any discussion of the state of the “broadband” marketplace on the other.  We noted that the next report presents an opportunity for the Commission to recognize that competition in the broadband marketplace is robust and rapidly evolving in most areas, while at the same time identifying opportunities to close the digital divide in unserved rural areas.

The reason I call it bizarre is that I can’t fathom the motivation behind this effort. Let me look at each of the different parts of this statement. First, they don’t think that the 25/3 threshold is the ‘only valid or economically significant measure of broadband service.’ I would think the 25/3 threshold would please these companies because these big cable companies almost universally already deploy networks capable of delivering speeds greater than that threshold. And in many markets their competition, mostly DSL, does not meet these speeds. So why are they complaining about a definition of broadband that they clearly meet?

They don’t offer an alternative standard, and it’s hard to think there can be a standard other than broadband speed. It seems to me that eliminating the speed standard would help their competition – it would allow DSL providers and WISPs to claim to have the same kind of broadband as a cable modem.

They then ask the FCC to not link discussions about broadband being deployed in a reasonable and timely manner with any actual state of the broadband marketplace. The FCC has been ordered by Congress to report on those two things and it’s hard to think of a way to discuss one without the other. I’m not sure how the FCC can talk about the state of the broadband industry without looking at the number of consumers buying broadband and showing the broadband speeds that are made available to them. Those FCC reports do a great job of highlighting the regional differences in broadband speeds, and more importantly the difference between urban and rural broadband speeds.

But again, why do the cable companies want to break that link in the way that the FCC reports broadband usage? The cable companies are at the top of the heap when it comes to broadband speeds. Comcast says they are going to have gigabit speeds available throughout their footprint within the next few years. Cox has announced major upgrades. Even smaller members like Altice say they are upgrading to all fiber (which might get them tossed out of NCTA). These FCC reports generally highlight the inadequacy of DSL outside of the cable company footprints and don’t show urban broadband in a bad light.

Finally, they want the FCC to recognize that there is robust competition in broadband. And maybe this is what is bothering them because more and more the cable companies are being referred to as monopolies. The fact is there is not robust competition for broadband. Verizon has FiOS in the northeast and a few other major cities have a fiber competitor in addition to the cable and telephone incumbents. But in other markets the cable companies are killing the telephone companies. Cable companies continue to add millions of new customers annually at the expense of DSL. AT&T and Verizon are currently working to tear down rural copper, and in another decade they will begin tearing down urban copper. At that point the cable companies will have won the landline broadband war completely unless there is a surprising upsurge in building urban fiber.

The only other reason the cable companies might be asking for this is that both Comcast and Charter are talking about getting into the wireless business. As such they could begin selling rural LTE broadband – a product that does not meet the FCC’s definition of broadband. I can’t think of any other reason, because for the most part the big cable companies have won the broadband wars in their markets. This filing would have been business as usual coming from the telcos, but it’s a surprising request from the cable companies.

Forcing Competition

There was an FCC requirement in the Charter acquisition of Time Warner Cable and Bright House Networks that wasn’t much mentioned in the press. As a requirement to gain approval for the merger the FCC is requiring that Charter must build new networks to pass one million potential customer passings outside of its existing footprint. Charter must do this where there is already another ISP offering at least 25 Mbps, and Charter is required to offer at least 60 Mbps speeds.

The two main lobbying groups representing smaller cable companies and telcos (the American Cable Association and the Rural Broadband Association) are suing the FCC to stop this.

This seems like a very odd requirement. Charter is being directed to build in markets that already qualify as having broadband under the FCC’s definition, so this won’t bring broadband to anybody new. But I guess the FCC hopes it will create networks that will compete against each other with price and service.

But if Charter expands using coaxial networks, then this ruling will force Charter to build a second coax network in markets, often in places where poles are already getting very full of wires. Charter could instead build with all-fiber, but then they would be creating a pocket of customers using a different technology, and big companies have shown they are not very good at handling one-off situations. For example, Verizon had such a hard time integrating fiber processes into their existing company that they literally created FiOS as a whole new internal organization separate from copper. What I find even more troubling is that any market where Charter builds is not likely to then ever attract another fiber overbuilder.

I know that the FCC is very bothered by the fact that cable companies don’t compete against each other. The FCC reports every year that the majority of residents in the country only have one choice for decent broadband speeds, and I am sure that is what prompted this requirement. But I think there is a reason for that – cable companies are largely a natural monopoly. A cable company has no technological advantage by building alongside of another existing coaxial network. I am sure that cable companies have done the math over the years and if, for example, Charter was to build to compete against Comcast, then in the end both companies will under-earn in the market with dual competition.
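The natural-monopoly math described above can be sketched with hypothetical numbers. None of these figures come from Charter or Comcast – the cost to pass a home, the monthly revenue, and the take rates are all assumptions – but they illustrate why splitting a market between two overbuilders hurts both:

```python
# Back-of-the-envelope sketch (all numbers hypothetical) of why two
# overlapping cable networks under-earn: each operator pays to pass
# every home, but the customers split between them.

def margin_per_home_passed(cost_to_pass, homes, take_rate,
                           monthly_revenue=70.0, months=120):
    """Toy 10-year margin per home passed (ignores operating costs)."""
    capital_per_home = cost_to_pass / homes
    revenue_per_home = monthly_revenue * take_rate * months
    return revenue_per_home - capital_per_home

homes = 1000
cost = 1_500_000      # hypothetical cost to build past 1,000 homes

# One network with a 70% take rate, versus two networks splitting
# the same customers at 35% each:
solo_market = margin_per_home_passed(cost, homes, take_rate=0.70)
split_market = margin_per_home_passed(cost, homes, take_rate=0.35)
print(round(solo_market), round(split_market))   # -> 4380 1440
```

Both overbuilders in the split market still carry the full construction cost per home passed, but each recovers it from half the customers – which is why cable companies have historically avoided building against each other.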

I love real competition and it’s always interesting to watch how a cable company reacts when somebody builds a new fiber network to compete with them. But I don’t think that a regulator can force competition. They can require Charter to build new network, but they can’t really make them act competitively in the same manner that some smaller fiber provider would act. A competitor has to be hungry to be competitive and it’s hard thinking that this requirement is going to make Charter show up in new markets and act like a competitive overbuilder.

The smaller ISPs are worried because they suspect that Charter will pick their markets to meet their requirement rather than going up against Comcast or Mediacom. And there is certainly a good chance they are right. I am sure that Charter really does not want to create bad blood between them and the other large cable companies. Together these companies own Cable Labs. Comcast and Charter both own a piece of Hulu. They do not want to be out marketing against each other if that can be avoided. And so Charter is likely to select smaller markets where either small cable companies or telcos are the primary ISP.

I really have to ask what good this requirement does in the long run. If Charter’s heart is not in this they will muddle through competing in the new markets and they won’t do well. Some customers in those markets may benefit from the newly created competition, but then again Charter may decide not to compete on price. One might suspect that in the sixth year of the new venture Charter will be selling off these new networks to somebody else.

My gut tells me that you can’t force a company to be competitive when it’s against their nature. I am sure there is a lot of groaning in the departments of Charter that are being tasked with completing this requirement. But the company will choose some markets, probably close to where they already have headends, and they will build new networks until they pass a million and one potential customers. They may or may not make enough money to pay for these new networks, and at some point they will walk away from the venture if it’s a financial failure. I may be wrong, but this doesn’t feel like an idea aimed towards success.

Comments to the FCC on Data Speeds

I’ve been reading through the comments in FCC Docket 14-126, which asks whether the FCC should increase the definition of broadband. The comments are sticking mostly to the expected script. It seems that all of the large incumbents think the current definition of 4 Mbps download and 1 Mbps upload is just fine, and just about everybody else thinks broadband should be something faster. In the Docket the FCC suggested that a low-use home today needs 4 Mbps download, a moderate-use home needs 7.9 Mbps and a high-use home needs 10 Mbps.

AT&T says that the current definition of 4 Mbps is adequate to define ‘advanced telecommunications capability’ per Section 706 of the Telecommunications Act. They argue that customers don’t use as much bandwidth as the FCC is suggesting. For example, they argue that most of their customers who pay for 12 Mbps service rarely hit a maximum of 10 Mbps during a typical month. They argue that the FCC is trying to change the definition of broadband by only looking at what the heaviest users of broadband are using.

AT&T goes on to say that they and other companies like Google and the large cable companies are now deploying gigabit-capable technology, and so the FCC has no reason to worry about data speeds since the industry will take care of the problem by increasing speeds. I obviously disagree with AT&T on this argument. They are using the red herring of what is happening in places like Austin, Texas and extrapolating that to mean that the whole country is seeing huge broadband upgrades. As I have written many times, small town America is not getting any of the new broadband investment that AT&T touts in their comments. And rural America is still often stuck with dial-up, satellite or cellphone data. Further, AT&T has been actively saying elsewhere that they want to kick millions of customers off copper and get rid of their DSL option.

Verizon took a different tack in their filing. They also don’t want the definition increased from 4 Mbps. They first argue that they have made a lot of investments in broadband, and they certainly have done so with their FiOS fiber network in cities and suburbs. But they then go on to argue that cellular data ought to be counted as broadband and that they are offering a great cellular alternative to people. They cite that 97.5% of people in the country have access to LTE with broadband speeds greater than 10 Mbps download and that this should be counted as broadband.

There are a few problems with their claim. First, Akamai collects the speeds from millions of cellular data downloads and reports that the average cellular data speed actually achieved in the country is 4.4 Mbps, not Verizon’s theoretical 10 Mbps. And cellular data is bursty, meaning it’s designed to be fastest for the first few seconds of a download and then normally slows down. More interestingly, a few months back Comcast cited Verizon and AT&T cellular data as evidence that Comcast has robust broadband competition. Verizon Wireless’s CEO countered Comcast’s claim and said, “LTE certainly can compete with broadband, but if you look at the physics and the engineering of it, we don’t see LTE being as efficient as fiber coming into the home.” Finally, everybody is aware that cellular data plans include tiny data caps of only a few cumulative gigabytes of download per month, and cellphone users know they must park on WiFi from landline data sources as much as possible to make their cellphones usable for video and other heavy data usage.
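The data-cap arithmetic is easy to check. Assuming a hypothetical 10 GB monthly cap and a roughly 5 Mbps HD video stream (both numbers are illustrative assumptions, not any specific carrier's plan):

```python
# Quick arithmetic behind the data-cap point: steady video streaming
# burns through a small monthly cap in a few hours. The cap size and
# stream rate are assumptions for illustration.

def gb_per_hour(mbps):
    """Data consumed in one hour by a steady stream at the given Mbps."""
    bits = mbps * 1_000_000 * 3600      # bits transferred in an hour
    return bits / 8 / 1_000_000_000     # convert bits -> gigabytes

cap_gb = 10.0                # hypothetical monthly cellular data cap
hd_stream = gb_per_hour(5)   # assumed ~5 Mbps HD video stream

print(round(hd_stream, 2))           # -> 2.25 GB per hour of HD video
print(round(cap_gb / hd_stream, 1))  # -> 4.4 hours of video per month
```

A cap that allows only a handful of hours of video per month is the reason cellular data is not a real substitute for a landline connection.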

Verizon goes on to cite the National Broadband Map several times as justification that there is already great broadband coverage in the US today. They say that 99% of households already have access to broadband according to the map. I have written several times about the massive inaccuracies in that map due to the fact that all of the data in it is self-reported by the carriers.

The big cable companies did not make comments in the docket, but there is a filing from the National Cable & Telecommunications Association on behalf of all of them. NCTA says that the definition of broadband should not be increased. Their major argument is that the FCC is not measuring broadband deployment correctly and should measure it every year and report within six months of such measurements. They also say that the FCC should take more consideration of the availability of cellular and satellite data, which they say are broadband. I haven’t commented on satellite data for a while. Some parts of the country can now get a satellite connection advertised with a maximum download speed of 15 Mbps. It’s been reported to be a little slower than that, but like cellular data a satellite connection has tiny data caps that make it nearly impossible for a family with a satellite connection to watch video.

In a speech last week FCC Chairman Tom Wheeler said that 10 Mbps is too low to be considered broadband and that federal funds like the Connect America Fund should not be funding the construction of any broadband with speeds lower than that. It’s going to be interesting to see where the FCC comes out on this. Because if they raise the threshold too much then a whole lot of households are going to be declared to no longer have true broadband, which is pretty much the truth.