When Metallica Sued Napster

Anybody who reads this blog knows that I am intrigued by the history of technology, and I look back periodically at past events when they seem relevant to something happening today. This past week I saw an article noting that April was the fifteenth anniversary of the lawsuit between Metallica and Napster. In retrospect, that was a very important case that has had implications for a lot of what we allow or don’t allow on the web today.

So let me look back at a few of the facts of that case and then discuss why it was so important. The first thing that surprised me is that this was only fifteen years ago. I remember the case vividly, but in my memory it was older than that, back at the beginning of the Internet (and in many ways it was).

The case was very straightforward. If you recall, Napster was the first major peer-to-peer file-sharing service. It was very simple in operation: it let you see MP3 files on other Napster users’ computers as long as you agreed to make your own files available. Napster users were then free to download any file they found in the system, navigating it with simple searches by song name or artist.

Of course, Napster put the whole music industry into a tizzy because people were using it to download millions of music files for free every day. This was illegal for anybody who downloaded songs, since they were violating copyrights and getting music without paying the musicians or the record companies.

The industry railed loudly against Napster, but I’m not sure they knew entirely what to do about them. While users of Napster were breaking the law, it was not quite so clear that Napster was doing anything wrong, and the industry feared a court case that might give a legal go-ahead to Napster. The industry was looking at legislative fixes to the problem.

But then along came Metallica. The band was incensed when their song ‘I Disappear’ from the Mission: Impossible II soundtrack appeared on Napster before it was even officially released. They decided to sue Napster to stop the practice, and in the process became the spokespeople for the whole industry. The Recording Industry Association of America (RIAA) and Metallica offered settlement alternatives to Napster, such as scrubbing its system of all copyrighted material, but that was impossible at the time (and probably still is). In the process of trying to negotiate a settlement, Napster went bankrupt paying to defend itself.

But this lawsuit sparked an ongoing and major debate about the ownership rights of content versus the rights of Internet companies to make content available. While it became clear that blatant file-sharing of the sort Napster enabled was illegal, there are plenty of more nuanced fights today in the ongoing battle between artists and internet companies.

The fight moved on from Napster to Apple’s battle against Digital Rights Management (DRM), the practice of tying music recordings to a music player. From there the fight migrated to Congress with attempts to pass the Stop Online Piracy Act (SOPA) and the Protect IP Act (PIPA) that were pushed by the music and movie industries to give them a leg up over internet companies.

You still see this same fight today when artists like Taylor Swift battle Spotify to be justly paid for their content. You see the same battle between authors and Amazon over not properly rewarding them for their works. And there is a never-ending battle between video content owners and sites like YouTube that let people easily upload copyrighted material.

It’s likely that the battle is going to be ongoing. Some visionaries foresee a day when micropayments are widely accepted and users can easily buy content directly from artists. But that is never going to be a perfect solution because people love the convenience of services like Spotify or Beats that put the content they like in an easy-to-use format. And as we saw with Napster, millions of people will grab copyrighted content for free if they are allowed to.

Should You Have a Cord Cutter Package?

If you are in the cable business, is it time to consider a ‘cord-cutter’ product? Obviously Cablevision thinks it’s a good idea, as they became the first cable TV company to add a standalone version of HBO Now to their line-up.

Cablevision has also added two specific cord-cutter products. For $34.90 per month they will provide a 5 Mbps download cable modem, a Mohu Leaf 50 digital antenna to watch network television without a cable subscription, and their Freewheel unlimited text and voice WiFi phone service (more on this below).

For a promotional price of $44.90 per month they will provide a 50 Mbps down/25 Mbps up cable modem and the same free digital antenna. There is no description of what the price will rise to at the end of the promotional period. Both products have an option to add HBO Now for $15 per month.

The Cablevision Freewheel WiFi phone is an interesting product also. It provides unlimited voice and text as long as the customer is on WiFi and inside of the Cablevision service footprint. As long as you buy another Cablevision product it’s priced at $9.95 per month and you have to buy a Motorola Moto G phone for $99.95. The phone does not work on traditional cellular, so it’s only going to be attractive to those who are always around WiFi.

Cablevision says these packages are meant to go after cord cutters and cord nevers, and to provide an alternative for those who don’t want to pay for a traditional cable programming package. This raises the question: should other providers consider the same sort of cord cutter packages? A few weeks ago the FCC officially acknowledged that cord cutting is real (a little late to the game); I don’t know of any clients of mine that are not losing cable customers in a given footprint.

The Cablevision options are somewhat odd, though. While Freewheel WiFi phones will be attractive to those who stay around WiFi all day, it’s a product that doesn’t work in moving vehicles and which doesn’t revert to traditional cellular when you are out of reach of WiFi. For around $15 per month you can buy a better version of this product from several cellular resellers that partner with traditional cellphone service so that the phone will work anywhere in the US. And the more expensive cord cutter package is basically a naked cable modem with a free digital antenna thrown in.

There are two questions to ask if you want to consider a cord cutting product. What do cord cutters really want? Can you put together such a package?

Cablevision seems to think that people want a naked standalone data product, but most of my clients have offered that for years. They have come to the conclusion that they should never turn away anybody willing to pay for their highest margin data product, especially since most small companies are losing money on cable TV anyway. You can often get a standalone cable modem from the larger cable companies if you fight hard enough for it, but they will spend a lot of effort getting you to buy a bundle of some sort instead.

Companies like Sling TV seem to think that cord cutters want smaller packages of programming, and I am sure some of them do. But recent surveys show that customers are extremely loyal to the few networks they most want, and so a smaller package is only going to be attractive to that tiny sliver of your customers who only want exactly what is in the smaller package you offer. I think what people really want is a la carte programming and the ability to buy only what they want and nothing more. But that is not going to be on the table soon, if ever.

If Verizon is able to wade through the lawsuits and offer their smaller packages, I think they are going to get limited response as well, because their proposed pricing for smaller packages is not much cheaper than normal cable packages. And this highlights the second thing cord cutters want – they want to save money. Unfortunately, as many have warned, when you pull channels out of the bigger line-ups and sell them in smaller piles, the programmers are going to charge a lot more for you to carry them. They still want to be paid as if you are taking their larger line-ups.

I would be shocked if Cablevision sells very many of the smaller package – it’s just too quirky in forcing a WiFi phone and a slow cable modem together. The number of households who will think that is the perfect product can’t be very large. Cablevision might address this over time by offering a wide array of different cord cutter options, but then they will have violated something that cable companies have learned the hard way – keep the options simple.

I’m not sure that there is any real cord cutter package that will be a killer product to keep your cord cutter customers happy. But perhaps there is a suite of different products that will be attractive to different segments of cord cutters and which will each get a little piece of the market.

New Technology – May 2015

This blog will look at some of the coolest new technology that has come across my screen lately.

Ultrathin Transistor. Researchers at Cornell have developed a transistor that is only three atoms thick. The transistor is made from one of a class of experimental materials called transition metal dichalcogenides (TMDs).

The findings were published in Nature and noted as a potentially major breakthrough. We are reaching the limit on how small a circuit can be made from silicon, and this possibly portends a generation of ultrathin circuits and sensors. The Cornell team has made a circuit on a 4-inch wafer and they believe this can easily be made commercially viable. TMDs are being discussed along with graphene as the potential breakthroughs that will let us march past the Moore’s Law limits on circuit sizes.

Acoustruments. Disney research labs have developed a technology they call acoustruments as a way to interface with physical devices using soundwaves. For example, this could let you set an alarm clock at a Disney resort from an app on any cellphone that has a speaker. As you tell the app what to do, it would emit sounds from your cellphone speaker that would then ‘push’ the appropriate buttons on the alarm clock to set the alarm. Disney sees applications allowing people from around the world to have easier interfaces with devices on the Disney property.

This has potential uses far outside this simple example because it could allow a no-power standard interface between people and electronics. This could become a handy way to interface with IoT devices, for example.

Better Electric Conductors. Scientists at Rice University, along with Teijin Aramid, a firm from the Netherlands, have demonstrated the ability to use carbon nanotubes to carry up to four times as much electricity for the same mass of wire. The team has found techniques that allow them to spin strong, durable wire from carbon nanotubes that can perform as well as copper.

This can lead to specialized wiring for those applications where weight is an issue. For example, this could be used to produce higher efficiency long-haul wires from rural solar power stations. Or it could be used in applications like spacecraft, airplanes, and cars where weight is always an issue.

Wireless Energy Transmission. The Japanese Aerospace Exploration Agency (JAXA) has been able to transmit 1.8 kilowatts of power accurately through the air to a receiver 170 feet away. While this is not very far, nor a lot of power, it is the first practical demonstration of the ability to transmit power in much the same way that we transmit wireless data streams.

Japan’s goal with this project is to eventually be able to beam electricity back to earth from space. They envision large solar plants in space that are more efficient and not dependent upon weather – solar farms set up at 22,300 miles from earth (geostationary orbit), where they would be exposed to the sun continuously.

Breakthroughs in Quantum Computing. Researchers at IBM have made a few breakthroughs that could help make quantum computers commercially viable. For the first time they have been able to measure the two types of quantum errors (bit-flip and phase-flip) simultaneously, allowing them to now work on an error correction algorithm for quantum computers. Until now they could only measure one of the two error types at a time. The scientists have also developed a square quantum bit circuit that might make it feasible to mass-produce quantum chips.

These breakthroughs are important because quantum computing is one of the possible paths that could help us smash past the Moore’s Law limits on current technology. A quantum computer with only 50 quantum bits (qubits) can theoretically outperform a slew of our best supercomputers acting together. Such computers would also allow us to solve problems that are unsolvable today.
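To put that claim in perspective, here is a bit of back-of-envelope arithmetic (my own illustration, not drawn from the IBM work): the state space a quantum computer manipulates doubles with every qubit you add.

```python
# A register of n qubits spans 2**n basis states, which is the rough
# intuition behind the claim about 50 qubits outperforming supercomputers.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2**n:,} basis states")

# 50 qubits -> 1,125,899,906,842,624 basis states (~10**15 amplitudes),
# more than a classical machine can comfortably store and update.
```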

Better Atomic Clock. Scientists at the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder have developed an atomic clock that is accurate to within one second in 15 billion years. This is a vast improvement over the current atomic clock standard, which is based on the microwave transition of cesium-133 atoms and is accurate to within a second over 100 million years.
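Those two accuracy claims are easier to compare as fractional errors. Here is the quick arithmetic, using the numbers quoted above:

```python
# Fractional accuracy of a clock that drifts 1 second over a given span.
SECONDS_PER_YEAR = 365.25 * 24 * 3600     # ~3.16e7 seconds

def fractional_error(years):
    return 1 / (years * SECONDS_PER_YEAR)

print(f"new NIST clock:  {fractional_error(15e9):.1e}")   # ~2.1e-18
print(f"cesium standard: {fractional_error(100e6):.1e}")  # ~3.2e-16
# The new clock is roughly 150 times more accurate.
```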

The new clock would be sensitive enough to measure the tiny differences in the passage of time at different altitudes on earth, a phenomenon predicted by Einstein.

Live Streaming on the Internet

I wrote recently about how Sling TV had problems with the NCAA basketball games, particularly with the Final Four game between Kentucky and Wisconsin. I watched the Maryland games in the first two rounds of the tournament and reported how awful my experience was.

But Sling TV is not the only one to have trouble with live streaming. I recall last year when HBO Go crashed badly during the streamed season premiere of Game of Thrones. And the Oscars also failed last year when ABC tried to stream the event.

Live streaming is just that – a live event being put over the Internet in real time. This is opposed to the way that Netflix, Amazon Prime, and other online services stream. When you watch one of those services they send a big burst of data at first, providing enough download to stay a few minutes ahead of your viewing. As you watch, they stream more and try to stay ahead of you. Since you are watching a cached copy of what you have already downloaded, the viewing experience is always a good one.
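The difference is easy to see in a toy model. Here is a minimal sketch – the buffer and outage numbers are hypothetical, chosen only to illustrate the point:

```python
# Playback stalls only when a network hiccup outlasts the cached video.
def playback_survives(buffered_seconds, outage_seconds):
    return outage_seconds <= buffered_seconds

# A VOD service like Netflix stays minutes ahead of the viewer...
print(playback_survives(buffered_seconds=180, outage_seconds=15))  # True
# ...while a live stream can only run a few seconds behind the event.
print(playback_survives(buffered_seconds=5, outage_seconds=15))    # False
```

A buffered service can ride out a fifteen-second hiccup without the viewer ever noticing; a live stream cannot.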

In the three examples of live streaming problems above, the companies blamed the issue on unexpected demand. Certainly there might have been millions watching the Oscars and Game of Thrones, but Sling TV had maybe 100,000 viewers for the Final Four. And I’ve had problems watching less popular sports events on Sling TV where they probably didn’t have more than a few thousand viewers.

I really can’t buy the excuse that these live streams failed because any of these companies had too many viewers. That’s a convenient excuse to hide behind. In reality they only send out a small number of live streams to the world – it’s not like they initiate a separate stream for every viewer who is watching. They instead send a stream to a backbone carrier, such as Cogent or Level 3, with whom they are interconnected. A company like HBO might also have direct peering with Comcast and a few other large cable companies and telcos. But most programmers that do live streaming are handing off the live stream to an underlying carrier.

Their problems begin when they hand everything off as routine traffic to an underlying carrier. Unless a content provider requests some sort of priority treatment, its streams are going to be treated like everything else on the web. One would imagine that the stream of a major event ends up being sent to nearly every one of the thousands of ISPs in the country, and many of them are far down the Internet food chain, getting their bandwidth via numerous hops from one of the main ISP POPs.

There are streaming events that have been successful. Consider the Olympics online. There, NBC transmitted not just one event, but many at the same time, and at least at major ISPs the reports on the quality were very positive. It’s almost certain that NBC paid extra and made arrangements to make sure that the Olympic stream had a high priority through the backbone. In case you are wondering if that is against net neutrality, it is not. Net neutrality looks mostly at the customer side of the network while carriers are allowed to pay for arrangements needed to make their service operate as intended through the backbone.

The reason you don’t hear ISPs commenting on the issue is that some of the streaming problems come from your local ISP. The issue that most affects streaming video is latency, and ISPs are all over the board when it comes to latency. Latency is the time it takes a signal to reach you, and ISP networks can have hardware, software, and routing practices in place that add latency to the signal. And as mentioned earlier, one of the biggest sources of latency is the number of hops a signal has to take between its source and a given network or end user.
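If you want a rough feel for your own latency, timing a TCP handshake is a crude but easy proxy for round-trip time. A simple sketch (the hostnames are placeholders, not a recommendation):

```python
import socket
import time

def rough_rtt_ms(host, port=443, tries=3):
    """Average the time to complete a TCP handshake as a crude RTT estimate."""
    samples = []
    for _ in range(tries):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            samples.append((time.perf_counter() - start) * 1000)
    return sum(samples) / len(samples)

for host in ("example.com", "example.net"):   # placeholder hosts
    print(f"{host}: ~{rough_rtt_ms(host):.0f} ms")
```

Every extra hop in the delivery chain adds to that number, which is why viewers at the far end of the Internet food chain suffer the most.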

When I lived in the Virgin Islands the latency was horrendous as we were at the end of the Internet food chain in North America. But a lot of rural places and rural ISPs in the country also suffer from poor latency because they buy their internet from somebody who buys from somebody else and they might be half a dozen carriers deep in the delivery chain.

The final source of a bad viewing experience can come from your home. You may have an old or outdated cable modem, or if you are using WiFi to get internet to your viewing device you might have a lousy WiFi router. So even if a good signal makes it to your house, your own gear might be gumming it up. When Sling TV got a universal thumbs down for doing poorly we know that they had big problems at the originating end, and they probably did not elect to pay for a premium routing of the event. But unfortunately for live streaming companies, there are always going to be customers who have a bad experience for reasons out of the programmers’ control. It might be a long time until the whole Internet is ready for high quality live streaming.

The Homework Gap

A newly released Pew Research Center poll looks at the impact of household income on the percentage of homes with Internet connectivity. The study shows that homes with children and annual household incomes under $50,000 have significantly lower broadband penetration than higher income homes.

FCC Commissioner Jessica Rosenworcel issued a statement after the release of the poll and called this phenomenon the ‘homework gap’. There have been discussions since the 1990s about the digital divide; this survey shows that the divide is still there and that it correlates with household income.

This finding comes at a time when computers are routinely integrated into schools. Most classrooms and schools now have computers. Also, though I was unable to pin down a precise statistic, what I’ve read suggests that a majority of teachers assign homework that requires a computer. There is also a new way of teaching coming into vogue. Referred to as the ‘flipped classroom’, this teaching philosophy requires students to watch videos and other online content at home and come prepared to discuss the materials in class (as opposed to the traditional way of showing content in class).

As somebody who has been helping carriers sell into different kinds of neighborhoods for years, I don’t find the statistics surprising. The Pew study shows that over 31% of households with children and incomes under $50,000 do not have high-speed Internet at home. This low-income group makes up about 40% of all households with school age children. By contrast, only 8% of homes with kids and incomes over $50,000 lack Internet access.

The study looked at a wide range of incomes and is one of the more complete surveys of broadband penetration rates I’ve seen. For example, it shows that households making under $25,000 per year have a 60% broadband penetration, while households making more than $150,000 per year have a penetration of 97%.

One thing this study didn’t consider was the other digital divide, which is the urban/rural one. According to the FCC statistics, there are at least 14 million homes in the country that don’t have physical access to broadband. And as I’ve written a number of times, I think that number is too low and skewed due to the underlying statistics being self-reported by the large carriers.

The FCC is considering whether it should expand its Lifeline program to cover broadband for low-income households. Today that fund chops a few dollars per month off of a phone bill for low-income families. The Universal Service Fund spends approximately $1.5 billion per year on the program.

I understand the sentiment behind this kind of assistance. But I would be surprised if a few dollars per month will make much impact on whether a household can afford to buy broadband. It’s going to take a whole lot more than $1.5 billion per year to solve the obviously large gap for student homes without broadband. And of course, such a program will do no good in those rural places where no broadband exists.

This is not going to be an easy issue to solve. To close this gap we have to find a way to get broadband into millions more homes. But we also need to make sure that those homes have working computers that are up to the tasks required by homework. I’ve seen numerous studies over the years showing that low-income households have as low a penetration of home computers as they do of broadband. Many school systems today give laptops to kids for the school year, and perhaps making that more widespread would at least solve half of the issue. But until all kids in a school can use those laptops at home, the kids without internet access are going to fall behind those that have it.

Adapting to Technology

I’ve read several product reviews lately for smartwatches and other electronic devices where reviewers lamented how hard they had to work to adapt to the technology, and how they wished things were more intuitive to use. Today’s blog is my own lament about how we have been nudged over the years to adapt to technology, as opposed to technology adapting to us.

As an example, look at our music. There was a time back in the 70s and 80s when anybody serious about music was at least a bit of an audiophile. Anybody who loved music loved it even more when it sounded great. And people who were serious about music, which was a lot of people, did what they could within their budget to buy the best listening experience they could afford.

I am not a very materialistic person; I drive my wife crazy at Christmas and my birthday because I really don’t want things. But when I was younger I wanted better speakers. I remember that perhaps the happiest purchase I ever made was the day I got my first pair of Boston Acoustics speakers. I sat in front of them all day listening to my favorite albums. I heard things in the music I had not noticed before with cheaper speakers. I was hooked as an audiophile.

But our music world started changing with technology. First came cassette tapes. To an audiophile cassette tapes were crap. After that came CDs which had the potential to be great for new music that was mixed directly for the CD format. But CDs did a lousy job of capturing older album music unless that music was mixed again from the original source (and thus the whole genre of re-mastered CDs).

Then along came the iPod and downloadable music files, and the music world was turned on its head. Every audiophile groaned, because MP3 files are generally of much lower quality than the original source. Lossy compression works by discarding the parts of a recording the encoder judges least audible, robbing it of the nuance and crispness of the original.
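The raw numbers make the gap concrete. A quick calculation using standard CD audio parameters (my arithmetic, for illustration):

```python
# Uncompressed CD audio: sample rate x bit depth x stereo channels.
cd_kbps = 44_100 * 16 * 2 / 1000
print(f"CD audio: ~{cd_kbps:.0f} kbps")   # ~1411 kbps

# Typical MP3 bit rates of the era ran from 128 to 320 kbps.
for mp3_kbps in (128, 192, 320):
    print(f"{mp3_kbps} kbps MP3 keeps ~{mp3_kbps / cd_kbps:.0%} of the CD bit rate")
```

Even a best-case 320 kbps MP3 carries less than a quarter of the bits on the CD; the encoder is betting you won’t miss the rest.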

Worse than that, the iPod came with crappy earbuds that further degraded the music experience. But let’s face it – an iPod let you listen to your own music collection in an airport or on an airplane, and so we adapted to accepting inferior music for the sake of convenience. Next, the industry took another huge left turn and everything went to online streaming services like Spotify. The sound is nowhere near as good as a CD, but who can resist the fact that there are millions of songs to listen to? And we listen to these songs on our computers, with bad earbuds, or through tiny Bluetooth speakers. The audiophile in me cries for the good old days.

The same thing has happened to video. There was a time in the 90s when we all went out and got the best large screen we could afford with the best resolution. Even for people like me who didn’t watch a lot of TV, seeing my first football game on an HD TV was impressive.

But now streaming video technology is luring us away from that quality and into watching video on our computer screens, on tiny cellphone screens, or on tablets and laptops. And we will suffer through some really crappy video and audio quality to watch the latest funny cat video on YouTube.

I just find it interesting how marketing has changed over the years. In the old days the expensive marketing lured us to upgrade – better stereos, better speakers, better TVs. But today we are lured to accept much lower quality content on Netflix, YouTube, and Spotify, and we do it because the range of content available makes us give up quality for quantity.

I expect that after we have all gotten used to this new world where we can watch anything, anywhere, at any time, eventually the desire for quality will creep back into the conversation. For now, we (and I include me) are happy enough with the huge selection of music and video available to us that we will tolerate a poor experience for the joy of watching things that used to be impossible to find. I notice that there are already young people who grew up with ubiquitous video and music who are rediscovering the beauty in the old stereo systems. So perhaps over time we will realize what has been lost and the move back towards quality will start all over again.

Too Many Boxes

Programmers who want to put web video on wireless devices have a fairly easy task because they can capture almost all of the app market by supporting just two platforms, Android and iOS. But developing for home TVs is a lot more complicated due to the proliferation of different devices used in the home today to watch web video.

Viewers of web TV have a huge array of potential devices that act as the interface between the web and their televisions. First there are the streaming devices like Roku, Apple TV, Amazon Fire, Google Chromecast, and Google Nexus. Then there are the game consoles that support TV, such as the Microsoft Xbox, the Sony PlayStation, and even older consoles like the Wii. Finally, there is a huge array of smart TVs from every major TV manufacturer, like Samsung, Vizio, LG, Sharp, and Sony.

The problem with this plethora of boxes is that there is no standard for the interface, so each one of them has come up with a different interface between the Internet and the big TV screen. There doesn’t seem to be any push in the industry for standardization, probably due to most of the manufacturers figuring they won’t be the big winner if all of the interfaces are made the same.

This is just as confusing for customers because there are nuances to each of these devices that are hard to understand before you buy and use them. Even comparative reviews aren’t helpful because they usually tell you very little about the day-to-day differences between each platform.

One might think that this is a simple issue and that there shouldn’t be much difference. After all, each of these devices is just emulating the role of the settop box in a traditional cable system. Each system contains what the industry calls ‘middleware’, which is the software that defines the viewer experience. In some of the devices the box plays the role of the tuner (channel changer), the channel guide that lets you decide what to watch, and the general navigation guide that lets you change settings and choose preferences.

There is a wide array of different software platforms for the various boxes, game consoles, and smart TVs. As you might expect, the Google boxes use Android and the Apple boxes use iOS. Samsung uses a Tizen platform that is based on Linux. Sony has developed a proprietary platform used for both their TVs and the Playstation. Panasonic uses Firefox OS. Amazon Fire uses a custom OS called Fire OS.

There has been some shakeout in the industry as boxes that were popular just a few years ago have fallen out of favor with the public. For instance, Boxee and Slingbox were the primary devices used just a few years ago (and many techies still love the Slingbox). But the proliferation of boxes and platforms is inviting a still larger shakeout.

The problem is that every one of these boxes sells enough units to make them profitable and to ensure that nobody controls a big enough slice of the industry to drive other companies to a common platform. The demand for watching web TV is exploding and all of these devices are selling a lot of units every year. Perhaps we are going to have to wait for the market to mature before we see any consolidation or shakeout.

While all these options can be confusing for consumers, the biggest issue with the plethora of boxes falls on programmers. Developers of web-based TV packages have to make sure that their product works with each of these different devices and operating systems. One would think that web TV is standard, but it is not. The whole process is software driven, so a web programmer must customize the interface for each of these platforms. That means a lot of lab time and a lot of integration, and worse, it’s never done, because each programmer then needs to keep up with software changes on each of the various platforms.
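To make that integration burden concrete, here is a hedged sketch of what the customization looks like in practice. The class and method names are purely illustrative – no real vendor SDK works exactly this way:

```python
from abc import ABC, abstractmethod

class DevicePlayer(ABC):
    """The minimum a web TV app needs from every platform it targets."""
    @abstractmethod
    def play(self, stream_url: str) -> None: ...

    @abstractmethod
    def show_guide(self, channels: list[str]) -> None: ...

class RokuPlayer(DevicePlayer):
    def play(self, stream_url: str) -> None:
        print(f"Roku: playing {stream_url}")
    def show_guide(self, channels: list[str]) -> None:
        print(f"Roku guide: {channels}")

class FireOSPlayer(DevicePlayer):
    def play(self, stream_url: str) -> None:
        print(f"Fire OS: playing {stream_url}")
    def show_guide(self, channels: list[str]) -> None:
        print(f"Fire OS guide: {channels}")
```

Every box, console, and smart TV OS means another adapter like these – and often another one again when a platform ships a breaking upgrade.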

Further, many of these boxes see major upgrades frequently and those upgrades are often not backwards compatible. For example, several CCG staff have Roku boxes, and these have undergone a major upgrade at least every six months, so programmers don’t just have to work on Roku, but they have to work on multiple generations of Roku.

This issue is not part of the investigation at the FCC on how to promote web television, and it probably shouldn’t be. But the issue is a real one and until the day comes when we have standards or until there are only a handful of market winners this is likely to stay a jumbled mess.

Premium Telephone Numbers

One of my industry colleagues sent me an article showing that there is a big market for premium telephone numbers. By that I mean that there are people willing to pay significant sums for numbers from ‘premium’ area codes, or for numbers that they think give them a marketing advantage for some other reason.

The article I read interviewed the owner of PhoneNumberGuy.com. Ed Mance, who operates the business, got into it years ago when he opened up a new business and was unable to obtain a contiguous block of numbers for all of his new telephones. He spent years looking into telephone numbers and in doing so found that there are people and companies willing to pay a premium for certain numbers.

As an example, the old area codes for major cities, like 212 for New York City, 202 for Washington DC, and 415 for San Francisco have been full for years and new subscribers in those areas are given different area codes today when they get a landline or a cellphone. When new people enter these markets they often have a reason to want the old, nationally-recognized numbers, as opposed to the new, lesser-known codes. If you are a new DC attorney, or a tech person moving to San Francisco, then I am sure there is some cachet and advantage to having the older area codes. And there has always been a market for telephone numbers that translate to words such as 800-FLOWERS (I have no idea if that is a real number).

It’s the amounts that some numbers go for that I find intriguing. A regular number from one of the ‘old’ city area codes ranges from $299 to $799. But specialty numbers can go for a lot more. For example, the biggest sale quoted was a sale where all ten digits had the same number, which sold for $95,000.

One of the things our consulting firm does is to help service providers obtain telephone numbers. In order to get numbers from NANPA/Neustar (the company that controls telephone numbers) you must be a certified telephone company, a wireless company, or a CLEC. The whole process is highly regulated and has been designed by the carriers, under FCC oversight, to make sure that numbers are available to all carriers.

Ed Mance says that he gets numbers by buying them from companies that no longer use them. If so, he is violating industry rules, because customers don’t ‘own’ their numbers, and so a company cannot really sell a number it is no longer using. A phone number is yours to use for as long as you keep paying for it, but when you stop using the number it is supposed to go back to the carrier that gave it to you originally. And ultimately all numbers belong to the folks who run the North American numbering plan. Unused or abandoned numbers are supposed to go back into the general pool of numbers that are available to all carriers.

I find it interesting that this underground economy for numbers exists. Certainly the idea of selling ‘vanity’ numbers started back with Ma Bell, which would charge you a few dollars extra per month if you wanted to specify your own number (and if it was available). But phone companies never tried to charge large fees of hundreds or thousands of dollars for a number, partially because they never thought of it, but more because they were willing partners in the numbering process.

It’s possible that the number guy is actually getting his numbers from some carrier, and not directly from customers who are abandoning them. While there is no specific rule that I can find against such a practice, it still flies in the face of regulatory rules to charge a premium price for a commodity that came into your possession through an extremely regulated process. If you recall, just a few years ago we were worried as a country about running out of telephone numbers, and with the explosion of cell phones we might get to that point again. It just doesn’t feel right to think that there are businesses that are hoarding numbers, which is why the whole numbering administration was created – to make sure we would always have numbers available.

This could get worse. There is a petition at the FCC for VoIP providers to be able to get telephone numbers directly instead of having to go through a certified telephone carrier. If the FCC grants this petition we are liable to see a proliferation of these kinds of practices because some of the VoIP providers are a lot looser about following industry rules than regulated companies. Many of them largely ignore regulatory rules unless forced to follow them.

I’ve had a 202 cellphone number forever and I can’t imagine ever giving it up. So I can see the lure of wanting a number that people recognize. But part of me has to ask: in today’s world of cellphones and dialer apps, do people really care much what number they are calling – or if they even know what number they are calling? But obviously if somebody is willing to pay $95,000 for a number, then there must be somebody who cares. It wouldn’t be hard to find the guy who bought that number to ask him if it was worth it. There are only 7 possible such numbers in North America (can’t do it with 0, 1 or 8).

Recovering Television Spectrum

Lately, the FCC finds itself in sales mode as it works to convince television station owners to sell their existing spectrum. For those not familiar with what the FCC is doing, this process is being referred to as an incentive auction for the 600 MHz band of spectrum, which is held today by UHF TV stations.

This is spectacular spectrum and probably has the best characteristics for delivering wireless data. The spectrum easily carries to the horizon and it blasts through just about anything. I remember as a kid watching TV in a basement from a transmitter that was on a mountain on the far horizon. There is no better spectrum for the cellular companies than these bands.

This is called an incentive auction because TV stations are not being mandated to leave this spectrum. So the FCC is now engaged in a series of regional meetings to try to convince the stations to sell their spectrum. The auctions are expected to be lucrative, and station owners and the FCC will share the auction revenues. The AWS auction last November was wildly successful for the FCC: it had set a minimum threshold of just over $10 billion, and the final auction raised over $34 billion – and AWS spectrum doesn’t come close to the coverage characteristics of the 600 MHz band. The TV spectrum should be far more lucrative since it is basically the holy grail of spectrum.

But many stations are hesitant to sell their spectrum, even at the billions they are likely to reap. The FCC has put together a complicated proposal to ‘repack’ the spectrum so that a station that sells its spectrum can stay on the air. But that is the part of the whole process that has stations nervous. It’s possible that a station could be given a slice of spectrum that is used by somebody else, such as sharing the space with wireless microphones. The repacked spectrum also doesn’t have as much of a cushion around each channel as exists today, which makes stations worried about out-of-band interference.

Having no interference is vital for television stations for several reasons. Historically, local stations got their revenues from advertising, and the rates they can charge are based upon how many theoretical eyeballs can watch them plus their rating in the local market. TV transmission is a tricky thing. For homes near the base transmitter, the power of most TV stations can overpower most interference. But, as you get to the further edges of the transmission path interference becomes a real issue. And in TV, interference is manifested by poor reception and pixelization. So TV stations are worried that their effective delivery circle will get smaller and that there will be significant interference in parts of that area.

The financial issue is further complicated by the fact that local stations (or their corporate owners) today make a lot of money from retransmission agreements – fees charged to cable providers that want to retransmit a station on their cable systems. The fear here is the same: stations worry that cable systems near the fringes of their service area might argue that they no longer owe retransmission fees due to degraded quality.

Unfortunately there is no way to pre-test the delivery in one of the repacked blocks. Spectrum engineering is really complicated stuff and the quality of a transmission will vary widely in different pockets of a spectrum delivery area based upon local conditions. The only way to test it is to send out the signal and see what kinds of complaints you get from viewers.

The FCC is putting everything they have into these meetings with Chairman Tom Wheeler attending most of these regional meetings to talk with television station owners. There are already a number of stations that have said that they are interested in joining the auction, but the FCC needs a significant number of them to join before the auction can proceed.

Big cellular companies won’t be the only ones to benefit from the spectrum; the FCC has promised that there will be slices of this spectrum set aside for WiFi and other public uses. So the whole country is on hold waiting to see if the FCC can convince enough stations to move. The billions that the stations can collect from the auction is certainly an incentive, but we are going to have to wait to see how many of them actually make the big leap. It ought to be an interesting summer.

The Gigasphere

If you haven’t already heard it, you will soon be hearing the term ‘gigasphere’. This is the marketing term that the large cable companies are adopting to describe their path towards faster data speeds on their cable systems. The phrase is obviously meant as a marketing counter to the commonly used term gigabit fiber.

The gigasphere term is being promoted by the National Cable & Telecommunications Association (NCTA) as the way to describe the new DOCSIS 3.1 technology. This is a technology that can theoretically support cable modem speeds of up to 10 Gbps download and 1 Gbps upload.

The large cable companies are all starting to feel consumer pressure from fiber, even in markets where fiber is not readily available. Google and other fiber providers have excited the public with the idea of gigabit speeds and I am sure cable companies are being asked about this frequently.

Right now the term gigasphere is largely marketing hype. If you have fiber to your home or business, then with the right electronics you can get gigabit speeds. But cable systems have a long way to go before they can offer gigabit speeds over coaxial cable. There is already talk of cable companies offering gigabit products, such as the recent announcements from Comcast. But those speeds are not being achieved using coaxial cable and DOCSIS 3.1; they are delivered over fiber – something Comcast doesn’t highlight in its marketing.

With enough upgrades and money, cable systems can eventually achieve gigabit speeds on their coaxial networks. But for now their speeds are significantly less than that. A cable company faces a long and complicated path to offering gigabit speeds over coaxial cable. The biggest hurdle is that the bandwidth on a cable system is mostly used by TV channels, and only empty channel slots can be used for data. DOCSIS 3.1 allows a cable system to join the spare channels on its network into one larger data pipe.
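The channel-bonding arithmetic shows why empty channel slots are the bottleneck. A rough sketch using textbook DOCSIS numbers – roughly 38 Mbps of payload per bonded 6 MHz QAM-256 downstream channel, not any specific operator’s plant:

```python
MBPS_PER_CHANNEL = 38   # approximate payload of one 6 MHz QAM-256 channel

for free_channels in (8, 16, 32):
    print(f"{free_channels:>2} bonded channels -> ~{free_channels * MBPS_PER_CHANNEL} Mbps")
# 8 -> ~304 Mbps, 16 -> ~608 Mbps, 32 -> ~1216 Mbps
```

DOCSIS 3.1’s OFDM channels change the details, but the underlying constraint is the same: gigabit speeds require dozens of channel slots that most systems are still using for TV.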

In order to get to gigabit speeds a cable company has to convert all of the channels on its network to digital, something most of them have already done. But beyond that, they will need to treat the channels the same way TV is treated on the web – transmitting them as raw data instead of as individual channels. Cable systems today use a broadcast technology, meaning they send every channel to every customer all the time. But if they convert to IPTV they can send each home just the channels being watched, which would massively condense the system bandwidth needed for television.
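A small worked example shows why IPTV frees up so much capacity. The numbers here are made up for the sketch, but the shape of the math is the point:

```python
# One neighborhood service group, illustrative numbers only:
channels_in_lineup = 300        # broadcast sends all of these all the time
homes_on_node = 100
unique_channels_watched = 40    # what those homes actually tune in at peak

print(f"broadcast occupies {channels_in_lineup} channel slots")
print(f"IPTV occupies about {unique_channels_watched} channel slots")
# The freed slots are exactly what DOCSIS can bond into a bigger data pipe.
```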

But this conversion is going to be costly and the equipment to do it is not yet readily available. CableLabs is working on this technology and it ought to be on the market in a few years. But that change alone is not the whole price of conversion. An IPTV system will require all new settop boxes, and in many systems a major reworking of the power taps and other components of the outside cable network. I don’t see many cable companies rushing towards this expensive conversion unless they are in a market where it is competitively necessary.

So for now, the gigasphere is mostly a marketing phrase. But it’s one that you are going to start hearing all of the time in relation to cable system data capabilities. This will obviously confuse the public who will assume that gigasphere means that they will be able to buy gigabit speeds from their cable companies, when they almost certainly cannot.

It’s not like cable companies don’t have fairly fast data capabilities. Most urban systems today are already capable of speeds in excess of 200 Mbps download. And there are systems working to get to 500 Mbps, which is probably about as fast as you can go without converting to IPTV. But it seems the marketing folks in the industry are counting on the fact that customers won’t know the difference between the various flavors of fast and will be happy with their gigasphere products. And they are probably right. Where’s my 500 Mbps cable modem?