What if We Never Get Net Neutrality?

I’ve been reading a number of summaries of the million-plus comments filed in the net neutrality docket at the FCC. Since we are sitting today with basically no rules in place on the issue, I wondered what would happen if we never put any effective rules in place. I think we know the answer, based upon the behavior of various carriers in the past during times when there were rules in place. Following are some of the most publicized examples of carrier behavior over the past decade in the US and Canada that would be in violation of net neutrality rules:

2005. Madison River Communications, a small telco, blocks customers from using Vonage.

2005. Comcast blocks numerous peer-to-peer sites like BitTorrent.

2005. Telus, the second-largest ISP in Canada, blocks hundreds of websites associated with a union with which it was having a labor dispute.

2006. AOL blocks emails containing the words ‘DearAOL’, the name of a group that was protesting AOL’s spam policies.

2006. Clearwire Communications and Bell Canada block VoIP from competitors of Clearwire.

2007. T-Mobile blocks customers from using other vendors for VoIP or text messaging.

2007. Both Cisco and Nokia announce products that make it easy for ISPs to block specific web traffic. While this in itself is not a violation, I think it demonstrates that these companies found a market demand for blocking content. I suspect there are a lot of net neutrality violations that never made it to the press.

2008. Apple blocks Skype on iPhones under a secret contract with AT&T.

2010. Windstream hijacks search queries using Google Toolbar and Firefox and redirects the searches to its own search engine.

2011. MetroPCS announces it will block all streaming video except YouTube on its 4G platform.

2011. Verizon, AT&T and T-Mobile all block Google Wallet from phones in favor of their own payment system called Isis.

2011. Large ISPs like Comcast, AT&T and Verizon are accused of slowing down Netflix.

2012. The FCC catches Verizon Wireless blocking people from tethering on their cellphones after the company had promised in 2008 to stop the practice.

2014. Verizon Wireless says it will throttle customers with older unlimited data plans first whenever there is contention on its network.

2014. Virgin Mobile offers plans for access to popular web sites that don’t count against data caps.

Some of these events, like Comcast blocking peer-to-peer sites, feel like ancient history to me – but this all happened in the last decade. And this was a decade when the FCC was actively trying to police and stop net neutrality violations. It’s not hard to envision carriers going much farther in the next decade if there are no restrictions on them. I’ve read estimates that in Europe about 1 in 5 customers is affected in some way by the lack of network neutrality rules. I think it’s clear in this country that ISPs large and small will restrict customers’ choices if there is no consequence for doing so.

It’s a slippery slope. It doesn’t take much to imagine going from Virgin’s plan to give special access to Facebook to ISPs that only give access to certain sites if it makes them more money. The counter-argument to net neutrality is that the market will stop the worst abuses. But the vast majority of us only have access to the Internet through one or two major ISPs, which makes it nearly impossible for consumers to vote with their feet.

Getting the Biggest Bang from Your Web Presence

Search Engine Optimization (SEO) is the art of getting your web presence noticed by search engines. There are two ways to get noticed – buy your way to the top of a search, or else maximize SEO if you don’t pay. Google will move anybody to the head of the search results if they are willing to pay for the privilege, since that is one of their major sources of revenue. For example, if you start typing the word ‘American’ into Google, by the time you have typed the first three letters ‘American Airlines’ will be at the top of the search results.

But hopefully your company website, blog or other web pages don’t share a name similar to one of the big companies that are willing to spend the money to be first. SEO is about the steps you can take to make sure that non-paid web content gets noticed.

Every search engine is a bit different in how it ranks web content, and those rankings change all of the time as the search engines tweak their algorithms. But there are enough similarities that SEO can make some decent generalizations about the steps you can take to improve the ranking of a web page.

In writing this article I took a look at how my own company, CCG Consulting, is ranked in a Google search. As it turns out, there are a whole lot of companies with the name CCG and even a number of them who are also consultants. There is CCG in a number of different fields such as engineering, networking, banking and cardiac research – there is even another CCG Consulting in the telecom field that specializes in helping MDUs and similar properties with telecom issues.

Because there are a lot of similarly named firms, and none that have paid to be at the top of the list, it is hard for any firm named CCG to get noticed. This blog you are reading appears in the middle of the first page of the Google search but my web site doesn’t appear until the third page of the Google search results.

This makes sense to me when you consider the way search engines do their rankings. One of the best ways to understand SEO is through something clever called the Periodic Table of SEO Success Factors. This shows the various factors that influence a search result and also ranks those factors by influence, from strongly positive to strongly negative.

Things like the quality of content, the use of keywords, and the ability of a search engine to ‘crawl’ or read the page are all factors that strongly influence a Google ranking. And there are also things that lower a ranking, such as having hidden pages that only a search engine can see but humans cannot, or buying links to try to improve your ranking.
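As a concrete illustration of what a crawler actually reads, here is a minimal sketch using Python’s standard library that pulls a few basic on-page signals out of an HTML page. The sample page and the particular signals checked are my own illustrative choices, not Google’s actual ranking algorithm:

```python
from html.parser import HTMLParser

class SEOChecker(HTMLParser):
    """Collects a few basic on-page signals a crawler would look at."""
    def __init__(self):
        super().__init__()
        self.title = None
        self.meta_description = None
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

# A made-up sample page for illustration.
page = """<html><head><title>CCG Consulting - Telecom Advice</title>
<meta name="description" content="Consulting for small telcos and ISPs.">
</head><body><h1>CCG Consulting</h1></body></html>"""

checker = SEOChecker()
checker.feed(page)
print(checker.title)             # the page title a search engine indexes
print(checker.meta_description)  # the snippet text many engines display
print(checker.h1_count)          # exactly one <h1> is the usual advice
```

Real SEO tooling checks far more than this, but these three signals – a descriptive title, a meta description, and a single heading – are among the commonly cited on-page factors.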

I am not disappointed in the rankings for my web site because I don’t count on people finding me through a Google search. I maintain a web site mostly as a way for others to verify that our company exists, and I don’t count on it as much of a marketing tool. Also, if somebody already knows me and searches for CCG Consulting and also types in the word telecom, half of the first page of search results is about my firm. Since I work in a very specialized market, I know that a lot of my targeted potential clients already know about me from other sources such as this blog or LinkedIn.

But these search results matter a whole lot to any company selling products to people that they don’t know. Let’s consider the example of a competitive carrier that sells the triple play services in Akron, Ohio. One would hope that when somebody searches Google for ‘Internet’ and ‘Akron’ or ‘cable TV’ and ‘Akron’ that this company would appear high on the search results. This is important because this kind of search is how a potential customer moving to that market might find their new ISP. If this carrier wants to be considered by new customers then they need to understand the way that search engines work. Otherwise they will miss out on the opportunity to sell to new customers.

It’s easy to find ways to communicate with people who already know you or who are already your customers. But it’s important to also think about how people who don’t know you will find out about you. I know that companies use things like billboards and newspaper ads in the hope of getting noticed and building brand awareness. But I also recommend that you go to Google and Bing today and search for your company the way somebody who doesn’t know you would. Don’t search by your company name, but rather by your products and market.

If your company pops up near the top of the first page then you are in good shape. But if you don’t, then consider what this means. If you want customers to find you on search engines you have to learn more about SEO. You might even consider buying priority at the search engines. I have several clients who have told me that they never seem to get business from people who are just moving to town, and not having web pages that get noticed is probably one of the reasons why.

A Little Accounting Nostalgia

The FCC is looking to modify and simplify Part 32, its rules on how regulated telephone companies must keep their books. While every company must keep a set of books for tax purposes, the specialized accounting that is unique to the telephone industry is becoming less relevant over time.

There was a time when the way a telco accounted for things was of paramount importance. This was due to the fact that telcos were rate-of-return regulated, meaning that they were guaranteed a modest profit. And the rates they charged and the profits they were allowed were determined by how they spent money and by the way they kept their books. I spent the first twenty years of my career knee deep in these accounting issues, and looking back at the historic telco accounting practices is nostalgic for me.

This process of increasing or changing customer rates was referred to as rate-making, and the formal proceedings at commissions for doing so were known as rate cases. Every state commission, and even the FCC, had a slew of rate case experts on staff. This process is just about extinct since all of the big telcos have elected price-cap regulation. This means they have agreed not to raise residential telephone rates more than a certain amount, and in exchange they are mostly free to charge what they want for other products and services.

But until price-cap regulation came along, rate cases were the lifeblood of large telcos. Telcos scrutinized how they accounted for everything, because different accounting categories had a different impact when it came time to calculate the costs of their products. For example, a phone company might closely scrutinize the way that time spent by repair technicians was recorded, since there were often options to book that time to several different expense accounts, or even to capitalize it, meaning the expense could be treated as part of an asset. By constantly reviewing their accounting, telcos could maximize the amount of money they could justify charging in rates.

Accounting was even more critical for smaller telcos. In addition to rate-making for local rates, small telcos also used their accounting to drive the separations process, detailed in the FCC’s Part 67 and Part 69 rules. Part 67 determined how much of their costs were associated with long distance, and Part 69 defined how those long distance costs were allocated to the various access charges billed to long distance companies that wanted access to their networks.
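At its core, separations was cost allocation by formula. The sketch below is a deliberately simplified illustration, not the actual Part 67/69 mechanics – the single cost pool and the 25% interstate factor are hypothetical numbers I made up for the example:

```python
def separate_costs(total_cost, interstate_factor):
    """Split a cost pool between jurisdictions by an allocation factor.

    Real separations used many cost pools, each with its own factor;
    this one-pool version just shows the shape of the calculation.
    """
    interstate = total_cost * interstate_factor
    state = total_cost - interstate
    return interstate, state

# Hypothetical: a $1,000,000 switching cost pool where 25% of the
# minutes of use are interstate long distance.
interstate, state = separate_costs(1_000_000, 0.25)
print(interstate, state)
```

The interstate share is what fed the Part 69 access charges billed to the long distance carriers, which is why the choice of accounts and factors mattered so much to a small telco’s revenues.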

Accounting and separations often drove the behavior of telcos. I can remember many instances when separations rules would indicate doing something one way while good engineering practices would suggest doing it a different way. And this still happens today. Just recently NECA (the National Exchange Carrier Association), a voluntary pool where small telcos still pool their costs and access charges, told member telcos that the separations rules still strongly favor a company owning its own standalone voice switch. However, it has become far more economical in many cases to share switches in the cloud. And yet companies will forgo moving their switching to the cloud as long as the arcane accounting and separations rules pay them more to stick with the less efficient practice.

The separations process was a huge deal for small telcos. Before Part 67 came along in the 1960s, the small rural LECs were often operated on a shoestring by farmers or other rural folks who just wanted their neighbors to have telephone service. But Part 67 required AT&T, which had a monopoly on long distance, to chip in to cover the long distance share of the operating costs of these small companies. For the first time these small rural companies made enough money to be able to expand their networks and improve their technology.

And this was all driven by the way they kept their books. There were specialized consultants and accountants who advised small telcos on the best way to keep their books in order to make the most money. To some extent, the world of small telcos was a world of accountants.

To a large degree this has either gone away, as in the case of the large telcos, or has shrunk in importance for smaller telcos. The industry is in the midst of a multi-year phase-down of access charge rates with the expectation that access will probably eventually go to zero, and when those revenues are gone the need for separations will also be gone. But it was an interesting time. I helped hundreds of telcos navigate the accounting and separations rules and to file local rate cases. It is an interesting part of the history of the industry that helped to build a robust rural telephone environment, but those old practices will be completely gone in a few more years.

The First Transcontinental Call

Last week I highlighted a number of events in the history of telecom, and over the next few months I am going to look at a few of them in more detail. One of the best stories in the industry’s history is the completion of the first transcontinental phone call.

AT&T and a few other phone companies had begun building long copper routes between cities as early as 1885. For example, the route between New York and Chicago was completed in 1892. In the early days of the copper technology calls could be extended for some distance using load coils, a technology still in use today on rural copper networks. But that technology had a distance limitation of about 900 miles for a given call.

In 1908 Theodore Vail, the president of AT&T, made it a company priority to achieve transcontinental calling even though the technology to make it work didn’t exist. The stakes were raised the following year when John J. Carty, the chief engineer at AT&T, announced that AT&T would be able to make transcontinental calls in time for the 1915 exposition planned in San Francisco to celebrate the completion of the Panama Canal.

So the race was on. AT&T announced that it would pay well for any technology that would help extend calls farther. In 1912, the inventor Dr. Lee de Forest brought an audion to AT&T’s engineering department. This was a three-element vacuum tube that amplified the telephone signal enough to get AT&T’s attention, and AT&T bought the patent for the technology. By the following year, 1913, Dr. Harold Arnold, a physicist employed by AT&T, had perfected the invention by increasing the vacuum and making other tweaks, and the audion became a practical amplifier.

Now AT&T needed to build the copper routes to connect east and west, and construction began in earnest. On June 17, 1914 the company set the last connecting pole in Wendover, Utah, completing the east-west connection. That event is commemorated by the US stamp shown above. The connection was made six months before the opening of the San Francisco exposition, so AT&T waited until January 15, 1915 to make the first official call on the network. (Obviously they tested it first!)

The first official call was completed at the opening of the Exposition. A large loop was created from Jekyll Island, Georgia, the location of Theodore Vail, the president of AT&T, through Washington DC, New York and Boston, and on to San Francisco. Anybody at each of these sites could hear whoever was talking on the line. The first speakers on the call were Alexander Graham Bell in New York City and Thomas A. Watson, his former assistant, in San Francisco. Dr. Bell repeated the first words he had ever spoken over the telephone, “Mr. Watson, come here, I want you.” To which Mr. Watson replied, “It would take me a week now.” They were then joined on the line by Theodore Vail, President Woodrow Wilson, the mayors of New York and San Francisco, and others like J.P. Morgan.

The network involved in coast-to-coast calls was impressive. One call involved two physical circuits and one phantom circuit, and each circuit consisted of two wires, so it took six wires to complete and carry a call. And this was heavy-gauge wire that used 870 pounds of copper per circuit mile. In those days, before the invention of multiplexing, only one call could be handled at a time over a given set of circuits. The main line between the coasts used 130,000 poles. It was estimated at the time that one call tied up $2 million of network investment.
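To get a feel for those numbers: assuming a route length of roughly 3,400 miles between New York and San Francisco (my assumption – the article doesn’t state the route mileage), the copper per circuit works out as follows:

```python
LB_PER_CIRCUIT_MILE = 870   # figure cited for the 1915 line
ROUTE_MILES = 3_400         # assumed coast-to-coast route length

copper_lb = LB_PER_CIRCUIT_MILE * ROUTE_MILES   # pounds per circuit
copper_tons = copper_lb / 2_000                 # US short tons
print(copper_lb, copper_tons)
```

That is nearly 1,500 tons of copper for a single one-call-at-a-time circuit, which makes the $2 million-per-call investment estimate easier to believe.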

After the first call the service was opened to the public. But calling was not cheap: a long distance call from New York to San Francisco cost $20.70 for the first three minutes and $6.75 for each subsequent minute. To put that in perspective, in today’s dollars that is equivalent to $488 for the first three minutes and $159 per minute after. It generally took around ten minutes for operators to arrange a coast-to-coast call.
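The conversion to today’s dollars is a simple multiplication. The CPI multiplier below (about 23.6 for 1915 to the time of writing) is my assumption, chosen to match the article’s figures:

```python
CPI_MULTIPLIER = 23.6   # assumed rough CPI change, 1915 -> today

def in_todays_dollars(price_1915):
    """Scale a 1915 price by the assumed CPI multiplier."""
    return price_1915 * CPI_MULTIPLIER

first_three_minutes = in_todays_dollars(20.70)   # roughly $488
each_added_minute = in_todays_dollars(6.75)      # roughly $159
print(first_three_minutes, each_added_minute)
```

So a ten-minute coast-to-coast call in 1915 cost well over $1,500 in today’s money.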

The aggressive extensions of the backbone networks connected more and more parts of the country to each other over time. Long distance in those early days was quite expensive and there was a lot more demand for use of the lines than the technology could handle. It wasn’t until well into the 1920s when rudimentary telephone switches, rotary dial telephones and multiplexing dropped the cost of long distance calls from the stratosphere to just reasonably expensive. And we all know where it went from there.

Telcos and Taxes

Windstream recently got approval from the IRS to restructure the company and spin off its copper and fiber networks as a REIT. This stands for Real Estate Investment Trust, a form of investment that has been around since 1960. REITs were created by Congress as a way to bundle together income-producing real estate in such a way as to create a marketable security.

By IRS rules, REITs must invest 75% of the value of their company in real estate assets, cash and cash equivalents and government securities. What the IRS has done with this ruling is to declare that copper and fiber networks are real estate. This seems like an odd ruling since common sense would tell you that copper and fiber are not real estate in the traditional sense.
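The 75% asset test itself is simple arithmetic. The sketch below illustrates it with a made-up balance sheet; real REIT compliance testing is considerably more involved than this single ratio:

```python
def passes_75_percent_asset_test(real_estate, cash_and_gov_securities,
                                 other_assets):
    """Simplified version of the IRS REIT asset test: qualifying assets
    (real estate, cash and equivalents, government securities) must be
    at least 75% of total assets."""
    qualifying = real_estate + cash_and_gov_securities
    total = qualifying + other_assets
    return qualifying / total >= 0.75

# Hypothetical telco balance sheet in $ millions, with the copper and
# fiber networks now counted as real estate per the IRS ruling.
print(passes_75_percent_asset_test(real_estate=6_000,
                                   cash_and_gov_securities=500,
                                   other_assets=1_500))
```

The reason the ruling matters is visible in the numbers: without counting the networks as real estate, a telco would have almost no qualifying assets and could never clear the 75% bar.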

The whole purpose for Windstream to do this is to avoid taxes. They estimate that being treated as a REIT will save them about $100 million per year in income taxes. Wall Street immediately boosted Windstream’s stock and also boosted the stock of other telcos on the assumption that they will all follow suit. The stocks of telcos, CLECs and cable companies rose on the news.

The number of REITs has grown significantly in recent decades. In 1971 there were only 34 REITs, and by June of this year they had grown to 210. The classification of Windstream as a REIT is not without precedent, since American Tower and all of its cellphone tower assets were classified as a REIT a few years ago. The IRS recently clarified that the definition of real estate for purposes of REITs includes land, permanent structures and structural components. One can only assume that the IRS believes that wire networks are structural components.

Large corporate tax avoidance has been in the news lately with a number of corporations undertaking ‘inversion’ to become classified as foreign corporations to avoid paying US income taxes. In the last few weeks the Walgreens drugstore chain had announced that they were going to undergo inversion but then changed their mind after they got a lot of public pressure. Congress has been looking to close the inversion loophole.

And now, all of a sudden it is the telcos that are going to be avoiding taxes. I must say that this dismays me personally. For several decades I have tried to do my best to buy local and to buy American whenever I can. In my mind, an American company that refuses to pay its fair share of income taxes might as well be a foreign company and I try to vote with my pocketbook and boycott such companies whenever I can.

I’ve always felt that American corporations ought to pay their full share of income taxes just like the rest of us. The corporations that are taking advantage of these tax loopholes all became successful due to being in the US. It’s our American laws and the American business environment that helped these companies get started and thrive. It feels un-American when a corporation suddenly turns their back on all of us to save from paying their fair share of taxes. Obviously corporations are in the business of maximizing their return to shareholders, but at the same time I think corporations have a moral obligation to be good citizens and do their fair share to support the country that supports them.

I understand that there are multinational corporations that do business all over the world. It’s not entirely clear where a company like Apple ought to pay taxes since they manufacture their phones overseas and sell many of them overseas. But there is no ambiguity with a company like Windstream. All of their poles and copper and fiber are sitting in the United States and deriving revenues that are all from the United States.

I understand that Windstream has not taken the inversion option and declared itself a foreign corporation. But it might as well have done so, since it is instead taking advantage of a very questionable loophole that has the same effect as being a foreign corporation. The percentage of federal tax revenues that the government receives from corporations has dropped precipitously over time, from a high of nearly 40% during the 1940s to a projected 13.5% for 2015.

I say shame on Windstream. This might be good for their shareholders as witnessed by the boost in their stock price after the announcement. But in the long run this is bad for the country and bad for all of us. If Walgreens had decided to take the inversion and declare themselves to be a foreign corporation I was prepared to move my pharmacy business to CVS. I’ve always voted with my pocketbook and I wish more people would do so. But it’s going to be hard to vote with your pocketbook if all of the major telcos declare themselves as REITs. Unfortunately the vast majority of us don’t have any options for telecom services other than these giant companies.

What’s Next?

I had the opportunity this week to visit CableLabs. CableLabs is a non-profit research laboratory founded in 1988 that is funded by the largest cable companies in the US and Europe. CableLabs works on practical applications for cable networks while also looking ahead into the future to see what is coming next. CableLabs developed the DOCSIS standards that are now the basis for cable modems on coaxial networks. They hold numerous patents and have developed such things as orthogonal frequency-division multiplexing and VoIP.

I have also had the opportunity over the years to visit Bell Labs a few times. Bell Labs has a storied history. It was founded by Alexander Graham Bell as Volta Laboratories and eventually became part of AT&T, becoming known as Bell Labs. It is credited with developing some of the innovations that have shaped our electronic world, such as the transistor, the laser and radio astronomy. It developed information theory, which led to the ability to encode and send data and is the basis for the Internet. It also developed a lot of software, including UNIX, C and C++. Bell Labs employed scientists who went on to win seven Nobel Prizes for their inventions.

Both of these organizations are full of really bright, really innovative people. In visiting both places you can feel the energy of the places, which I think comes from the fact that the scientists and engineers that work there are free to follow good ideas.

When you visit places like these labs it makes you think about what is coming in the future. It’s a natural human tendency to get wrapped up in what is happening today and to not look into the future, but these places are tasked with looking both five years and twenty years into the future and trying to develop the networking technologies that are going to be needed then.

Some of the work done in these labs is practical. For example, both labs are working today on ways to distribute fast Internet throughout existing homes and businesses using the existing wires. Google has helped push the world toward delivering a gigabit of bandwidth to homes, businesses and schools, and yet the wiring in those places is not capable, with today’s technology, of delivering that much bandwidth short of expensive rewiring with category 5 cable. So both labs are looking at technologies that will allow the existing wires to carry more data.

It’s easy sometimes to take for granted the way that new technologies work. What the general public probably doesn’t realize is the hard work that goes into solving the problems associated with any new technology. The process of electronic innovation is two-fold. First, scientists develop new ideas and work in the lab to create a working demonstration. But the hard work comes when the engineers get involved and are tasked with turning a good lab idea into practical products. This means finding ways to solve all the little bugs and challenges that are part of every complicated electronic medium. There are always interference issues, unexpected harmonics and all sorts of issues that must be tweaked and fixed before a new technology is ready to hit the street.

And then there are the practical issues associated with making new technology affordable. It’s generally much easier to make something work when there are no constraints of size or materials. But in the world of electronics we always want to make things smaller, faster, cheaper to manufacture and more reliable. And so engineers work on turning good ideas into workable products that can be profitable in the real world.

There are several big trends that we know will affect our industry over the next decade, and these labs are knee-deep in looking at them. Yesterday I talked about how the low price of the cloud is bringing much of our industry to a tipping point where functions that were done locally will all move to the cloud. Everyone also predicts a revolution in the interface between people and technology due to the Internet of Things. And as mentioned earlier, we are on the cusp of bringing really fast Internet speeds to most people. Each of these three changes is transformational, and collectively they are almost overwhelming. Almost everything that we have taken for granted in the electronic world is going to change over the next decade. I for one am glad that there are some smart scientists and engineers who are going to help make sure that everything still works.

A Tipping Point for the Telecom Industry

The new ‘law’ in the industry to go along with Moore’s Law has been dubbed Bezos’ Law, which measures the cost of cloud computing over time. The law says that over the history of the cloud, the cost of a unit of computing power has dropped by 50% every three years.

Bezos’ Law is based upon the Total Cost of Infrastructure Ownership (TCIO), a calculation of the cost of owning and operating a data center. There are a lot of cost components in TCIO, including hardware, software, power, labor and overheads. Most companies that operate data centers don’t publish their costs in enough detail for an outsider to calculate them accurately, so there is no easy way to measure the current cost metric precisely.
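Stated as a formula, the law is just exponential decay with a three-year half-life:

```python
def unit_cost(initial_cost, years, halving_period=3.0):
    """Cost of a unit of computing power after `years`, assuming it
    halves every `halving_period` years (the Bezos' Law claim)."""
    return initial_cost * 0.5 ** (years / halving_period)

print(unit_cost(100.0, 3))   # one halving period
print(unit_cost(100.0, 9))   # three halvings over nine years
```

A corollary: whatever a unit of computing costs today, plan on it costing about an eighth as much in nine years.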

But it is obvious that the overall costs of operating data centers continues to drop. For example:

  • There has been a lot of emphasis on lowering the power consumption of data center equipment, and power is a significant component of data center operating costs.
  • The cost of routers and switches has continued to drop, and companies like Amazon and Google have developed their own hardware, which has supposedly cut their costs below those of companies buying commercial equipment.
  • Software costs are getting cheaper through the advancement of software-defined networking techniques. And this same software should lower labor costs over time.

Bezos’ Law is lately in the news because we are now reaching the point where it is probably cheaper to buy cloud services from one of the large providers like Amazon than it is to build your own data center. Amazon has essentially achieved an economy of scale that a smaller provider can’t match by building from scratch. This is not a unique economic phenomenon. Economic theory predicts that industries that benefit from economy of scale will eventually be dominated by the most efficient firms.

There are a lot of major implications for the telecom industry if it’s now cheaper to buy cloud services than it is to operate your own data center. I would expect to see all of the following within a few years:

  • One can expect the smaller and less efficient data center providers to fade away over the next few years as their margins get squeezed and finally killed by more efficient operators. One would expect to only find a small number of data center providers within a decade.
  • It no longer makes sense for corporations and governments to have their own data centers. Eventually cost savings will become compelling enough that you can expect a faster and faster migration of corporate IT functions to the cloud. This also implies dire consequences for employment for IT people who have specialized in providing services that can be replaced in the cloud.
  • Makers of routers and switches are at risk because as the number of companies left in the data center business decrease, the only vendors that survive will be those that sell to those handful of companies. This also implies a squeeze on the margins of IT equipment since the large cloud companies will have the bargaining power to insist on low prices.
  • ISPs will have a hard time justifying operating their own routers and servers and one can expect ISP functions like DNS, email and storage to move to the big cloud providers.
  • One would expect a proliferation of specialized cloud software companies. The big cloud providers are likely to offer generic software. While the hardware will all be owned by a few large companies like Amazon and Google, the specialized needs of different industries are going to be met by specialized software that works with the cloud.
  • This will accelerate the shift to software defined networking and one can expect to see things like cable TV headends, cellular base stations, voice switches and other hardware platforms all migrate to the cloud as well. Makers of those kinds of hardware also face a bleak future.
  • This will put even more pressure on the need for very fast Internet connections. We will soon have not just a digital divide, but a cloud divide between those who can benefit from the cheap cloud and those who can’t.

This shift is gigantic for the telecom industry and we have reached a significant tipping point. In the future, when we show a timeline of the history of the telecom industry (like I did yesterday), we will see an entry saying ‘2014 – When the Cloud Won’. One can debate whether we have already reached the tipping point or whether it comes next year. But what can’t be debated is that cheap cloud resources are going to change our industry in major ways. A lot of vendors we are used to working with will disappear, and companies like Amazon and Google will become entrenched in telecom (and in myriad other industries).

A Look Back

We take our communications networks for granted today and it’s easy to forget the history that brought us here. I thought today I would highlight a few of the key dates in the history of our industry, and in future blogs I am going to write more about a few of these important events. I am amazed at how much of this happened during my lifetime. It’s very easy to forget how recently cell phones appeared, for example.

1915 – First Transcontinental Phone Call. Alexander Graham Bell placed the call from New York to Thomas Watson in San Francisco.

1919 – Telephone Switches and Rotary Dial. Rotary dial phones and switches took the operators out of the business of completing local calls. This took many years to implement in some rural areas.

1920 – Frequency Multiplexing. Frequency multiplexing allowed different calls to be carried at different frequencies, meaning that telephone lines could now carry more than one call at the same time.

1947 – North American Numbering Plan. AT&T and Bell Labs came up with the 10-digit numbering that we still use today in the US, Canada and much of the Caribbean.

1948 – ‘A Mathematical Theory of Communication’. Claude Shannon of Bell Labs published the paper of this name, which founded information theory and outlined how the copper telephone network could be used to transmit data as well as voice.

1951 – Direct Long Distance. Customers could now dial 1+ and make long distance calls without an operator.

1956 – First Transatlantic Call. A call was placed on TAT-1, the first transatlantic undersea telephone cable, which ran from Scotland to Newfoundland.

1962 – First Digital Transmission. The first customer call was transmitted over a T1 line.

1963 – First Touch-tone Telephone. The familiar keypad began replacing the rotary dial phone.

1968 – First 911 Call. The first 911 call was placed in Haleyville, Alabama.

1973 – First Portable Cell Phone Call. Martin Cooper of Motorola placed the first handheld cell phone call. I think this date will surprise younger people.

1975 – First Use of Fiber Optics. The US Navy installed the first fiber optics link aboard the USS Little Rock.

1978 – First Public Test of Cell Phones. 2,000 customers in Chicago got the first trial cellphones. This was followed by another trial in Baltimore in 1980 with commercial service launched nationwide in 1982.

Mid-1990s – Voice over IP. Commercial providers began offering telephone calls that could be completed over Internet connections.

2000 – 100 Million cell phone subscribers in the US, up from 25,000 in 1984.

How Do You Hire?

CBS News did an interview with Warren Buffett a few years ago in which he talked about how he hires new employees. He said he finds ways to check on their intelligence, energy and integrity. He said that when he is looking for somebody who can help him grow his business he wants a problem solver, and that he wouldn’t hire somebody who lacks any one of these three traits.

Buffett tests intelligence by asking applicants to solve tests or puzzles of various types. For energy he finds out about the candidate’s personal habits regarding eating, exercise, meditation, etc. He also gives them an interesting test: he asks candidates to prepare a ten-minute presentation describing some business topic they should be familiar with. He then gives them two minutes to chop it down to a five-minute presentation. After that he gives them two more minutes to chop it down again to a one-minute presentation. Buffett says that integrity is impossible to assess in an interview, so for any finalist candidate for an important position in his company he does a full background check.

The whole premise of this hiring process, according to Buffett, is that you can’t believe resumes. They are obviously only going to pick out the highlights of a career and will not tell you about the negatives. Numerous studies have shown that a significant percentage of resumes include half-truths or outright lies. And he thinks asking questions about resumes is a waste of time because it focuses on what people did in the past instead of on what they might be able to do for you in the future.

All businesses rely on good people to make them operate, and it can be a huge setback to your business if you hire the wrong person for a key role. Most companies have made a bad hire at some point and know how traumatic that can be. So it is vital that you find the right people during the interview process. I think almost anybody will agree that the normal way we hire often doesn’t uncover everything you want to know about a person. We typically sift through resumes and then interview the top few candidates for an hour or two. We don’t often dig very deep past the resume.

I’m not saying that we should all change to Buffett’s method, because you can find many other non-traditional hiring methods that other people swear work as well as Buffett’s. But you really should consider changing your hiring process if it is not finding you the people you need. Finding something that works for you will take some effort on your part. The traits Buffett lists as most important for his company might not be the traits you think are most important. And certainly you have different needs to meet if you are hiring a new CFO, a help desk technician or an installer. You must determine for each job what you most want out of that position and then find a way to test for those traits.

For example, if you are hiring somebody who says they are an expert in something you need, then grill them hard about what they know. If somebody is supposed to have physical or technical skills, then get out of the interview room and into the central office or the field and have them demonstrate what they know. If you need a good writer, have them write something on the spot. One of my own favorite tools is to ask candidates to solve a real-life problem. Every company has real-life examples of problems it has recently encountered – asking candidates how they would have solved one will tell you a lot about how they think.

There are a number of companies around that offer tools for non-traditional hiring. There are online tools that offer the kinds of tests and puzzles that Buffett administers and many other kinds of tests. I have one client who makes everybody take a test that provides a detailed profile of their personality traits. They think it’s important to know whether somebody is an introvert or an extrovert, is likely to work better alone or in teams, and similar traits.

But I would caution against administering any test if you don’t feel qualified to interpret the results. I know I would not feel comfortable trying to understand a personality profile since I don’t know how different personality traits affect job performance. As an example, I recently read a university study that found that high-energy introverts often make better salespeople than extroverts. The authors conjectured that it’s because introverts have to try harder to communicate, and they tend to stick to the basics instead of filling quiet time with a lot of empty talk. That sounds reasonable but is counterintuitive to the way most people hire salespeople. If I were hiring a salesperson I would have a hard time trying to do so using a personality profile, and I think I might quickly find myself second-guessing my own judgment.

To some degree, identifying and hiring the right person is itself a talent; some people are good at it and others are not. I have one friend in the industry who has made numerous poor hires, and my advice to him was to find somebody else to do his hiring for him. So perhaps the first place to look for better hiring is at yourself. I suspect that many people are uncomfortable being the sole decision maker in the hiring process, and this is why many companies use teams to interview people.

I don’t have any generic advice, because this is one area where everybody has different ideas, and I have seen many different approaches be effective. But I also know that just reading resumes and judging people by what they tell you about their resumes is often ineffective and can lead to some terrible hires. So I strongly recommend that you find ways to test people on the traits you think are most important for the job you want to fill. If you take some time to think about that before you leap into the hiring process you are probably going to do a better job of finding the right fit for your company.

Should an ISP Offer Fast Upload Speeds?

One question I am often asked is whether clients should offer symmetrical data speeds to residential customers. I’ve noticed lately a number of fiber networks advertising symmetrical speeds, so this option is gaining some market traction. This is not an easy decision to make and there are a lot of different factors to consider:

The Competition. Most fiber networks are competing against cable networks, and the HFC technology on those networks does not allow for very fast uploading. The number one complaint that cable companies get about upload speeds is from gamers who want fast low-latency upload paths. But they say that they get very few other complaints from residential customers about this issue.

So this leads me to ask whether residential customers care as much about upload speeds as they do about download speeds. I know that today households use the bulk of their download capacity to view video, and very few households want to upload video in the same manner or volume. One of the questions I ask clients is whether they are just trying to prove that their network is faster, because heavily promoting something most customers don’t care about feels somewhat gimmicky.

Practical. At the residential level there are not many users who have enough legal content to justify a fast upload. There are a few legitimate uses for uploading, but not nearly as many as there are for downloading. Some of the normal uses for uploading include gaming, sending large files, sharing videos and pictures with friends and family, and doing data backups and other related activities into the cloud. But these uses normally do not generate as much traffic as the download bandwidth most households use to watch video. And so one must ask the practical question of whether offering symmetrical bandwidth is just a marketing ploy, since customers are not expected to use the upload nearly as much as the download.

Cost. Another consideration is cost, or the lack of it. A lot of ISPs buy symmetrical data pipes for their connection to the Internet. To the extent that they download a lot more data than they upload, one can almost look at the excess headroom on the upload side as free. They are already paying for that bandwidth, and often there is no incremental cost to an ISP for customers to upload more, except at the point where upload becomes greater than download.
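The headroom argument can be sketched with a bit of arithmetic. All of the numbers below are hypothetical illustrations, not real transit figures for any ISP:

```python
# Hypothetical illustration of the 'free upload headroom' argument.
# A symmetrical transit pipe is assumed; all figures are invented.

transit_gbps = 10.0          # purchased capacity, each direction
peak_download_gbps = 8.0     # measured peak download utilization
peak_upload_gbps = 1.5       # measured peak upload utilization

# The download side sets the size of the pipe the ISP must buy.
# Upload traffic up to that same level rides on capacity already paid for.
free_upload_headroom = transit_gbps - peak_upload_gbps
print(f"Upload headroom already paid for: {free_upload_headroom:.1f} Gbps")

# Incremental cost only appears if upload were to exceed download,
# forcing the ISP to buy a bigger symmetrical pipe.
extra_transit_needed = peak_upload_gbps > peak_download_gbps
print(f"Extra transit needed for upload today: {extra_transit_needed}")
```

With these illustrative numbers the ISP could absorb more than five times its current upload traffic before upload costs it anything extra.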

Technical. One must ask whether offering symmetrical bandwidth will increase demand for uploading over time. We know that offering faster download speeds induces homes to watch more video, but it’s not clear if the same is true in the upload direction. If uploading is stimulated over time, then there are network issues to consider. It requires a more robust distribution network to support significant traffic in both directions. For example, most fiber networks are built in nodes of some sort, and the fiber connection to those nodes needs to be larger to support two-way traffic than it would if the traffic were almost entirely in the download direction.
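A back-of-the-envelope sketch shows why the node feed matters. The homes-per-node count and per-home demand figures below are invented for illustration, not engineering guidance:

```python
# Rough sizing of the upstream direction of a node's fiber feed.
# All figures are illustrative assumptions.

homes_per_node = 200
peak_download_per_home_mbps = 5.0   # average demand per home at peak
peak_upload_per_home_mbps = 0.5     # typical asymmetric usage today

# Each direction of a duplex link is engineered for its own peak load.
upstream_needed_today_mbps = homes_per_node * peak_upload_per_home_mbps

# If symmetrical offerings stimulate upload toward parity with download,
# the upstream direction must carry ten times the traffic in this example.
upstream_needed_symmetric_mbps = homes_per_node * peak_download_per_home_mbps

print(upstream_needed_today_mbps, upstream_needed_symmetric_mbps)
```

In this sketch a node feed engineered for 100 Mbps of upstream traffic would need to handle 1,000 Mbps if upload usage grew to match download.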

Bad Behavior. One of the main arguments against offering fast upload speeds is that it can promote bad behavior or can draw attention from those with malicious intents. For example, fast upload speeds might promote more use of file sharing, and most of the content shared on file sharing sites is copyrighted and being illegally shared.

There has always been the concern that customers might also set up servers on fast connections that can upload things quickly. And one of the few uses that requires a fast upload connection is porn. So I’ve always found it likely that fast upload connections are going to attract people who want to operate porn servers.

But the real concern is that fast networks can become targets for those with malicious intent. Historically, hackers took over computers to generate spam. That still happens today, but there are other, more malicious reasons for hackers to take over computers. For instance, hackers who launch denial-of-service attacks do so by taking over many computers and directing them to send messages to a target simultaneously. Computers are also being hijacked to do things like mine bitcoins, which requires frequent outbound communication.

One would think that a hacker would find a computer sitting on a network that allows 100 Mbps or 1 Gbps uploads to be worth a whole lot more than a computer on a slower network. And so they might well be targeting customers on these networks.

What this all means to me is that if you offer fast upload connections you ought to be prepared to monitor customers to know which ones upload a lot. If such customers are operating server businesses they can be directed to business products. Or you can help them find and remove malware if their computers have been hacked. But I find the idea of allowing fast uploads without monitoring dangerous for both the ISP and its customers.
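The kind of monitoring I have in mind can be sketched as a simple report over monthly usage records. The customer IDs, volumes, and the 100 GB threshold below are all hypothetical; a real system would pull from flow or billing data:

```python
# A minimal sketch of flagging heavy uploaders from usage records.
# All customer data and the threshold are invented for illustration.

monthly_usage_gb = {
    # customer_id: (download_gb, upload_gb)
    "cust-001": (350, 12),
    "cust-002": (90, 480),   # uploads far more than it downloads
    "cust-003": (200, 95),
}

UPLOAD_THRESHOLD_GB = 100

def flag_heavy_uploaders(usage, threshold):
    """Return customers whose upload exceeds both the threshold and
    their own download - a pattern suggesting a server or a
    compromised machine rather than ordinary residential use."""
    return [cid for cid, (down, up) in usage.items()
            if up > threshold and up > down]

print(flag_heavy_uploaders(monthly_usage_gb, UPLOAD_THRESHOLD_GB))
# → ['cust-002']
```

A flagged customer is not necessarily doing anything wrong; the flag is just the trigger for a conversation about a business product or a malware check.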