The Industry

25 Years Since the First Web Site

Today I was reading and thinking about all of the different ways that various governments around the world are trying to somehow control and regulate content on the web. While doing so I saw an article noting that we just passed the 25th anniversary of the first web site. That honor belongs to the CERN research facility in Switzerland. Tim Berners-Lee, a scientist there, posted the first web site on December 20, 1990. That first web page explained how the web worked and provided a few links on how to use the new World Wide Web.

Before web sites there was already a very active online community. These were the days of dial-up ISPs and bulletin boards. It’s easy today to complain about the price of broadband, but most of us have forgotten when you had to pay a monthly fee plus rates of between $1 and $6 per hour to gain access to online dial-up services.

This was all before AOL got large. At the time the predominant online services were Prodigy and CompuServe. These were the first two services to offer a wide range of features, from email and chat rooms to gaming and news.

But the real fun online was to be had using the hundreds of different private servers throughout the country. Guys with a computer and an interest in something would start their own service and their own community. I remember joining an online baseball fantasy league in the early 80s and also playing trivia online at a private server.

What is most amazing is how far we have come in just 25 years. At that time the fastest dial-up modems ran at 14.4 kbps, and the number of people who were active online was a few million at most. Since then we have already gone through several major ages of the Internet. First was the AOL age, when AOL aggregated so much content that they killed most of their competitors and became the predominant way to get online.

Then came web pages and all of a sudden everybody was adding ‘content’ to the web. We all ‘surfed’ the web looking at the huge variety of pages that people would post. Both businesses and people created their own web pages. I can recall a period where I was amused by using a service that would show random web sites.

Web sites also unleashed e-commerce and all of a sudden shopping became a big thing. eBay was huge for a while until it was eclipsed by Amazon. This was followed by the age of social media and social interactions. Facebook obviously won the social media war but in the early days there were many smaller social media sites that were more specialized and fun. And things like dating online became a big phenomenon.

Today we have entered the age of web video and the hours spent watching video on the web eclipses everything else being done.

It’s easy to think of the web as some sort of fixed thing, but the fact is that there have been major changes every few years since 1990 and there was always something new being done or something new being trendy. Kids today would die if they had to endure the web experience of the mid-90s.

During each of these various phases of the web there have been major issues that everybody was concerned about. For every new innovation that came along something negative was also introduced. We’ve suffered through piracy, spam, hackers, trolls, and viruses at various stages of the web’s development.

This flood of memories brings me back to my original thought about regulators trying to control what happens on the web. Mostly I find the idea amusing, because whatever it is they think they are regulating will change faster than any laws they can formulate – their efforts are always going to be a few years behind web innovation. The Internet genie is out of the bottle and I can’t imagine any way to put it back inside.

The Industry

Dial-up is Still Around

Since most people in the country can get some form of broadband, it's easy to assume that dial-up is dead. We all remember those days of trying to get a connection to a modem and then listening for the beeps and boops. But I looked and found that there is still a significant dial-up business in this country.

At the end of 2014 AOL still claimed 2.3 million paying dial-up customers. That is obviously way down from their peak of over 26 million customers, but it's still a very impressive number. AOL said those customers account for $155 million in revenue, which still exceeds the company's next biggest revenue source, advertising, at $144 million.

AOL is not the only one still in the business. Some other big names from the past are still around, like EarthLink and NetZero. EarthLink advertises that it has the most dial-in numbers in major markets, like 50 in Miami and 45 in San Diego. And then there are dial-up companies that you probably have never heard of, including Basic ISP and Turbo USA. Finally, many telephone companies like AT&T still offer dial-up, and a surprising number of my smaller telco clients also still operate small pockets of dial-up customers.

It’s hard to get industry figures since most of these companies don’t publish their customer counts, but if AOL still has 2.3 million customers then nationwide there must be more than 4 million households still using dial-up. The FCC says that about 2% of households are still on dial-up, but AOL alone accounts for roughly 2% by itself.

Dial-up has gotten better than most of us remember due to the use of compression techniques, where the ISP compresses whatever is being sent to the dial-up customer. But it’s still agonizingly slow compared to any broadband, and the realized speed of dial-up is still capped at 56 kbps on good copper. And much of the copper that is left is not very good. With compression techniques dial-up can appear to be twice that base speed.
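To see why compression helps, here is a toy sketch (my own illustration, not any ISP's actual implementation) using Python's zlib. The artificially repetitive snippet below compresses far better than real mixed web content, which typically shrinks by only a factor of two or so:

```python
import zlib

# Text-heavy web content is redundant, so lossless compression shrinks it.
# This repeated snippet compresses extremely well; real pages with images
# and scripts compress far less.
html = ("<p>Dial-up tops out at 56 kbps, so ISPs compress "
        "pages before sending them down the line.</p>\n") * 200

compressed = zlib.compress(html.encode(), level=9)
ratio = len(html.encode()) / len(compressed)

print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
print(f"apparent speed on a 56 kbps line: ~{56 * ratio:.0f} kbps")
```

The same idea, applied transparently at the ISP's modem bank, is what makes a 56 kbps line feel faster than its raw bit rate.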

The low speeds keep dial-up customers relegated to very basic Internet functions such as email. Browsing the web can be incredibly slow since many websites now include advertising and video and take a long time to open. Since shopping on the web is now very image oriented, that can also be too slow for dial-up speeds. And obviously dial-up households can’t get streaming video of any kind, since it requires anywhere from a steady 1 Mbps at the lowest quality up to 5 – 8 Mbps for HD video, and even more for 4K.
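Some quick back-of-the-envelope arithmetic (my own numbers for a hypothetical page, not measurements) shows just how painful browsing is at dial-up speeds:

```python
def transfer_seconds(size_mb: float, speed_kbps: float) -> float:
    """Seconds to move a payload of size_mb megabytes at speed_kbps kilobits/s."""
    bits = size_mb * 1_000_000 * 8        # decimal megabytes to bits
    return bits / (speed_kbps * 1_000)    # kilobits/s to bits/s

# A hypothetical 2 MB web page, heavy with images and ad scripts:
for label, kbps in [("56 kbps dial-up", 56), ("50 Mbps cable", 50_000)]:
    print(f"{label}: {transfer_seconds(2, kbps):.1f} seconds")
```

A 2 MB page works out to nearly five minutes over dial-up versus a fraction of a second over a 50 Mbps connection, before compression helps at all.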

So who is still using dial-up? It appears that there are three distinct communities. First are people everywhere who barely use the Internet and want the cheapest connection possible; such people don’t do a lot more than check email and do basic tasks. Second are immigrant communities, where one would suppose that the low price is also important.

Finally, there are rural people who have no other alternative except maybe satellite. For those who have never used it, satellite broadband is not a great product. It’s very expensive, with base plans between $60 and $80 per month. It is faster than dial-up, but it has latency issues that make it hard to use for any real-time purposes such as web voice or streaming video. It also comes with low and strict ceilings on monthly data usage: WildBlue has a monthly cap of 17 GB in total downloads, HughesNet 20 GB, Exede 25 GB and Dish 30 GB.

One would think that if AT&T is really able to retire millions of rural copper lines like they want to, a lot of dial-up customers will disappear. All of those rural households that use dial-up today as their most affordable option will end up with either satellite or cellphone data plans.

Writing this blog made me pause to marvel at how fast our technologies change and grow. The heyday of dial-up was only twenty years ago, and we have come so far since then. We think of dial-up as something ancient and yet twenty years is nothing in terms of mankind’s history. But in that very short time we have grown from having over half of the country on dial-up to seeing some cities connected with gigabit speeds. I remember when I was on dial-up and I envied a few of my friends who were on a shared T1 at their office. I would have called somebody crazy if they said then that within twenty years that people would be able to get a gigabit at their house.

The Industry

Who Owns Internet Ad Space?

Google made a very interesting announcement a few weeks ago that led me to find out more about the ad space on web sites. Google announced that for $2 per month they would block all ads on web sites for a customer as long as they browse through the Chrome browser.

I find this fascinating because it means that Google thinks they have the ability to block an ad even when they are not the ones who placed it in the first place. Google sells a lot of ads, and so it makes sense that they can block ads that they have placed on a web page. But when they say they can block all ads, it also means that they think they have the ability to block ads placed by somebody else.

Just to be clear about what I mean by ads, look at this web page. At the top is a banner ad. At the top right of the story is an ad. And across the bottom of the article are four ads. After loading this web site multiple times I noticed that the ads changed.

It turns out that there are two kinds of ads on a web page: fixed ads and remnant ads. Fixed ads are placed there by the web site owner or somebody they partner with to advertise for them; they are embedded into the web page and can only be accessed by the website owner. The other kind of ads are called remnant ads. These are coded in such a way as to be available to outsiders, and anybody that has access to a website before it reaches a customer can change what is in the remnant ad space.

And as you would expect, these remnant ad spaces get changed all of the time. There are a lot of companies that sell advertising into the remnant ad space including Google (DoubleClick), Yahoo, Amazon, Facebook, AOL, AppNexus, Openx, Adroll, RightMedia and dECN. It was very easy for me to spot remnant ads in the recent election season, because I swear that every web page I looked at here in Florida had a political ad for Rick Scott who was running for reelection as Governor. So somebody was being paid in Florida to put those ads onto Florida computers.

The first question this raised for me is: who owns this ad space? The web page example is from the TechCrunch web site. TechCrunch chose to make the ads open to the public, and I assume they get revenues from at least some of the parties that use that space, which is their motivation for using remnant ad space. Google thinks they have a right to go in and block whatever is in the remnant ad space on that page, so they clearly believe it is theirs to grab. I know that some of the larger ISPs like cable companies are also in the advertising business, through partners, and I wouldn’t be surprised if it was Comcast that gave me all of the Rick Scott ads.

I was shown a recent legal opinion by one of the companies that advertises in the remnant space, who was gracious enough to share it with me as long as I don’t publish it. The opinion basically says that nobody owns the remnant ad space: the act of a web site owner making this space available to the public means just that, and it can be used by anybody who somehow has access to the website before it reaches a customer. That generally is going to mean some company who is part of the chain between a web site and the customer. Obviously the web site owner can hire somebody to place ads in the remnant space. If you reach the web site through a browser, then the browser owner can place ads there. If you get to a web site through a link on another web site like Yahoo News, then they can place ads there. And your ISP also has access to this ad space.

I really like the Google product that blocks ads. I think there are plenty of customers who would love to avoid all of those ads. Further, blocking ads means a faster Internet experience for a customer. There are web sites I go to that have multiple videos automatically running, which seems like an extravagant use of my bandwidth. I have a 50 Mbps Internet connection and there are still web sites that load very slowly due to all of the extra videos that have been layered into the ad spaces. I also learned that remnant ads are one of the most common sources today of adware and malware, and I will talk about that more in tomorrow’s blog.
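Google has not published how its blocking works, but mechanically an ad blocker boils down to matching each outgoing request against a blocklist before it loads. A minimal sketch, with a made-up blocklist (real blockers maintain published filter lists with thousands of patterns):

```python
from urllib.parse import urlparse

# Hypothetical blocklist for illustration; real blockers ship far larger lists.
BLOCKED_DOMAINS = {"doubleclick.net", "ads.example.com"}

def is_blocked(url: str) -> bool:
    """True if the request host is a blocked ad domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

print(is_blocked("https://ad.doubleclick.net/banner.js"))   # True
print(is_blocked("https://techcrunch.com/article"))         # False
```

Because the blocked requests never go out, the page also loads faster, which is exactly the side benefit customers notice.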

Current News Regulation - What is it Good For?

Should the FBI Be Able to Wiretap the Internet?

There is currently a government task force working on proposed legislation that would give the FBI the ability to ‘wiretap’ data. This is very different from what is available today. Today, ISPs are required to be able to turn over electronic records under a series of laws referred to as CALEA, the Communications Assistance for Law Enforcement Act. Under a CALEA request an ISP might turn over emails or a list of the web sites that a given customer has visited. ISPs generally retain such data for 60 – 90 days for all customers, after which it is automatically deleted unless law enforcement requests it. But CALEA requests are generally for historical data and are not ‘wiretaps’ when it comes to data usage. This new proposal would give law enforcement live access to a customer’s data in the same way that phones have historically been wiretapped. And this is a law with teeth: the proposal includes a $25,000 per day fine for companies that aren’t wiretap capable, with those fines doubling after 90 days of non-compliance. There are a number of issues with this idea.

It Goes Against the Direction of the Industry

The business world is rapidly moving its data to the cloud. There is a long list of benefits to using the cloud and businesses get it. But before a business will send sensitive data out of their control into the cloud, they generally encrypt it (or they should). Companies are not going to put sensitive financial data, trade secrets and things like legal correspondence into the cloud if there is any chance that other parties can somehow crack and read the data. The whole point of encryption is that only the parties involved can decrypt it.

It seems like the FBI law would forbid this kind of encryption, which would have a ton of ramifications for the industry. Businesses are going to refuse to put sensitive information into the cloud if it can’t be encrypted. This means that they will probably continue to use company-specific LAN storage rather than the more efficient cloud. Further, company lawyers are going to advise companies not to use the cloud if everything there can be wiretapped. Today a subpoena is required to get information that a company keeps on its own servers, but a wiretap at an ISP could be done without the knowledge of the person or company being investigated. No corporate attorney is going to let a company expose itself to being investigated through the back door just to gain the advantages of using a cloud service.

The FBI’s idea will also put all of the companies that supply encryption out of business. There are a number of businesses that sell encryption to cell phones such as Cryptocat, Silent Circle, Red Phone and Wickr. There are many software packages that can be used to encrypt data files such as Folder Lock, SensiGuard, Safehouse, SecureIT, Cryptoforge and many others. And almost every maker of carrier class transmission equipment, servers and related software has an encryption product.

It’s Costly

One of the biggest issues with the proposed bill is that it casts a far wider net of companies who must comply with a wiretap than who must comply today with CALEA. Today CALEA applies to the companies that supply a basic data pipe to a customer, to whoever is the physical ISP. This may be a telephone company, cable company, wireless ISP or cellular provider. But every firm who must meet CALEA today is a carrier of some sort. They have a physical hub where they perform ISP functions. These hubs are the sort of places where CALEA makes sense.

But the proposed law would impose a more complex obligation on other web-based platforms like Facebook, Google, Yahoo and AOL. Those are all big companies and one might assume that they can all afford to do this, and you might be right. But the same requirements would apply to much smaller firms and start-ups who store and/or process customer data. It’s going to be technically challenging for a web-based platform to give live access to data; they just are not configured that way. And designing a system to enable that is going to be costly and inefficient.

The cost of compliance will deter future small start-ups. And if you don’t think that is true, let me give you a real life example of when CALEA costs became an issue for a small carrier. It is very difficult for a small ISP to comply with CALEA on their own, so there are companies who sell CALEA compliance. If you get a CALEA request they overnight you a black box that rides next to your core servers and captures the data that law enforcement wants. This kind of service costs about $600 per month. I have a small City client who wanted to become an ISP just to serve themselves, some other local government agencies and some non-profits. Since they were facility-based, using their own servers, CALEA applied to them. They almost decided against doing this since the CALEA fees ate up most of the monthly savings they were trying to bring to their town. I know that is a very tiny dollar example, but I foresee the new requirement being much more costly than CALEA. Small firms will have a very difficult time building the ability to perform live data wiretaps, and this is going to stifle small web firms.

It Goes Against the Basic Premise of the Internet

The main premise of the Internet is that it is a decentralized network. The wiretap proposal relies on some sort of centralized hub in order to implement a wiretap. There has to be a place where you can guarantee that the data the government wants to see will flow. That is a whole lot harder than it sounds, and it would end up resulting in some fundamental changes in the way that Internet traffic flows. And that could be the costliest impact of all.

The traffic on the Internet keeps growing at nearly exponential rates. Carriers have been able to keep up with the bandwidth demands because they have upgraded the networks to be more and more efficient over time. This change would go in the opposite direction and would make the network more inefficient.

I fully understand and appreciate the needs of law enforcement. But this could be one of the biggest unfunded mandates ever if it ends up impeding the efficiency of the Internet. The Internet is now a fundamental part of everyday life and is a lifeline for most businesses.

It just seems like a colossally bad idea to me to impose a costly change on everybody in order to catch a few bad guys. Particularly when the smart criminals will avoid these wiretaps: they will find a black market way to self-encrypt their data or they will avoid the web altogether. So this is really just a proposal to catch the dumb criminals. It seems like too great a cost for such a paltry goal.

Improving Your Business Technology

Data Mining – It’s Not What Customers Think

I know that when the public hears that their ISP is engaging in data mining, they assume this means that the ISP is reading their emails and monitoring their website viewing. ISPs do have the ability to do those things, although I don’t know any who spy on their customers in that way.

I can certainly understand why data mining scares the average consumer. Supermarkets get you to sign up for their loyalty programs so that they know everything you buy from them. And I know I get a spooky feeling when I express an interest about some product in one place on the Internet and then see ads for that product pop up on Facebook or my Google search.

But data mining is a valuable tool and every ISP should be using it – just not in the same way that the supermarkets and Facebook do it. In fact, we probably need to come up with a better terminology for doing the things I am suggesting below.

There are a number of tools around that let you look at data about customer usage and these tools allow an ISP to do the following:

  • Spambots. There is a wide array of spambots and other malware on the web that can infect customers’ computers. The worst of these, from a network perspective, are spambots, which take over a customer’s computer and use it to send out spam. Most ISPs monitor email usage from their own domain and can spot when one of their users has been taken over by a spambot. But most customers these days do not use the email names and domains assigned by their ISP; instead they use web email addresses such as Gmail or even the older AOL. And some spambots create new email addresses that the customer doesn’t even know about. And so data mining can be used to look for customers with unusual upload traffic. No customer is going to be offended if you ask them whether they are uploading traffic 24 hours per day, if in the process you help eliminate Trojan horses and spambots from their computer.
  • Web servers. Most ISPs do not want a customer using a residential ISP account to run a commercial web server. A web server is a device used to run a website or service that drives a large amount of download traffic; such a website might be used for e-commerce, for example. But far too often web servers are used to run porn sites. ISPs are not against web servers, but they do expect people who operate them to buy the proper business-level service. A web server can run at full capacity 24 hours per day, and that is generally not the level of service intended for a shared residential product. Data mining can be used to identify web servers, and the customer can be directed to a more appropriate (and appropriately priced) service.
  • Data caps. Most ISPs have set some cap on the amount of data that a customer can download in a month. And these caps do not have to be small; I have one client that has a 2 terabyte monthly cap for residential downloads. But there is no sense in having a data cap if you can’t actually measure how much bandwidth each customer is using. Data mining tools are the way to measure customers’ usage.
  • File sharing. Most ISPs have terms of service that prohibit customers from sharing copyrighted materials with others. But realistically an ISP is not going to know what customers are sharing unless it gets a complaint from a copyright holder. Still, many ISPs like to get a handle on file sharing because such traffic can eat up a lot of system bandwidth. Data mining can help you identify customers who are probably running one of the common file sharing programs. An awful lot of file sharing is done by teenagers. I have clients who send out friendly reminders to customers they suspect of file sharing, saying something like: “We notice by your internet usage that you are probably running a file sharing program. We would like to remind you that it is illegal to share copyrighted material and that there have been cases where copyright owners have gotten significant settlements by suing people who were sharing their property.” Such notices cut down on a lot of file sharing traffic as parents pressure kids into doing the right thing.
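As a concrete illustration of the kind of analysis I mean, here is a small sketch (with made-up traffic numbers and thresholds, not any real tool) that flags customers whose upload pattern looks like a spambot, i.e. heavy uploads around the clock:

```python
from statistics import mean

# Illustrative hourly upload volumes in MB over 24 hours (made-up data).
usage = {
    "cust-001": [2, 1, 0, 0, 1, 3, 5, 4, 2, 1, 0, 1] * 2,              # normal
    "cust-002": [40, 42, 39, 41, 38, 40, 41, 39, 40, 42, 41, 40] * 2,  # nonstop
}

def flag_suspects(usage, min_avg_mb=20, min_busy_hours=20):
    """Flag customers uploading heavily around the clock - a spambot signature."""
    suspects = []
    for cust, hourly_mb in usage.items():
        busy_hours = sum(1 for mb in hourly_mb if mb > 10)
        if mean(hourly_mb) >= min_avg_mb and busy_hours >= min_busy_hours:
            suspects.append(cust)
    return suspects

print(flag_suspects(usage))  # ['cust-002']
```

The same per-customer usage totals feed directly into data cap measurement and web server detection; only the threshold logic differs.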

So you should be data mining. But perhaps the things I have described could all better be classified as network management, a term that would not dismay your customers.
