Faster Internet for Schools and Libraries

On July 23 the FCC released a Notice of Proposed Rulemaking in WC Docket No. 13-184 that asks questions about modernizing the E-rate program for schools and libraries. The E-rate program has been around since the late 1990s and has been used to bring broadband to schools and libraries.

But last month President Obama announced the ConnectED initiative, which has the stated goal of bringing a minimum of 100 Mbps, and a target of 1 Gbps, to 99% of students within five years. This NPRM is a response to that initiative.

A 2010 FCC survey showed that only 10% of schools had speeds of 100 Mbps or greater, and 48% of schools had speeds of less than 10 Mbps. 39% of schools cited the cost of service as the barrier to better speeds, while 27% cited the cost of installation. And the situation is worse in our libraries. In a 2011 survey by the American Library Association, only 9% of libraries had speeds of 100 Mbps or faster, while 25% still had speeds of 1.5 Mbps or less.

There is clearly a need for revised E-rate funding. In the most recent year, schools made funding requests of over $4.9 billion against a fund that is capped at $2.25 billion annually. The E-rate program is funded today as part of the Universal Service Fund, which is funded by a surcharge on a wide variety of telecommunications end-user bills.

The FCC has laid out new goals for the E-rate program and also suggested a number of specific changes. The new goals are: (1) that schools and libraries have affordable access to broadband in order to meet the goals of ConnectED; (2) that the effectiveness of E-rate funding is maximized; and (3) that the administration of the program is streamlined.

The FCC seeks comments on the specific speed requirements needed for schools and libraries. They offer the target established by the State Education Technology Directors Association (SETDA) which suggests that K-12 schools should have at least 100 Mbps per 1,000 students by 2015 and 1 Gbps for every 1,000 students by 2018. For libraries they offer the State Library of Kansas recommendation that all libraries should have 1 Gbps connectivity by 2020.
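Since the SETDA targets are expressed per 1,000 students, it is easy to work out what they imply for a school of any size. A quick sketch (the 600-student school is just an illustrative example, not a figure from the NPRM):

```python
# SETDA targets cited above, expressed as Mbps per 1,000 students
SETDA_2015_TARGET = 100    # at least 100 Mbps per 1,000 students by 2015
SETDA_2018_TARGET = 1000   # 1 Gbps per 1,000 students by 2018

def school_target_mbps(enrollment, mbps_per_1000_students):
    """Bandwidth target implied by a per-1,000-student ratio."""
    return enrollment * mbps_per_1000_students / 1000

# An illustrative 600-student school:
print(school_target_mbps(600, SETDA_2015_TARGET))  # 60.0 Mbps by 2015
print(school_target_mbps(600, SETDA_2018_TARGET))  # 600.0 Mbps by 2018
```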

One of the issues that the NPRM looks at is how to distribute the bandwidth around a school once it's delivered to the building. This is a significant issue because today's wiring and wireless technologies have a steep drop-off in data speeds over even short distances. So the NPRM seeks comments on how best to get the bandwidth to classrooms. The State E-Rate Coordinators Association (SECA) has suggested that this issue is important enough that it ought to be at the top of the priority list for E-rate funding.

The NPRM asks questions about increasing the efficiency of buying broadband. This includes consortium purchasing and other bulk-buying opportunities. The larger school districts are able to negotiate better rates today than small districts because they serve a significant number of schools. There must be ways for neighboring districts to band together for efficiency (although local politics is often a barrier to this process).

The NPRM also asks what the funding should be used for. It suggests that funding be transitioned to support only broadband. The funding is currently used for a number of other purposes that were allowable under the old rules. For example, in the most recent funding year there were requests for $260 million to subsidize telephone lines.

Finally, the NPRM looks at who is eligible for the E-rate program. Today the program pays for some portion of eligible costs based upon the percentage of student enrollment that is eligible for a free or reduced price lunch in a given school. The school gets a discount based upon that factor and must then match between 10% and 80% of the cost. The NPRM looks at alternate eligibility requirements including (1) revising the discount matrix to increase certain applicants’ matching requirements; (2) providing support on a district-wide basis; (3) revising the approach to supporting rural schools and libraries; (4) incorporating a per-student or per-building cap on funding into the discount matrix; (5) providing more equitable access to priority two funding; and (6) allocating funds to all eligible schools and libraries up front.
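The matching mechanics are simple to express. In the sketch below, the discount percentage would come from the FCC's matrix based on the school's free-and-reduced-lunch percentage; the 80% discount and $10,000 project cost are just illustrative values:

```python
def school_share(eligible_cost, erate_discount):
    """What a school pays out of pocket after its E-rate discount.
    erate_discount is the fraction E-rate covers; since schools must
    match between 10% and 80% of the cost, the discount itself runs
    from 20% to 90% in the matrix."""
    return eligible_cost * (1 - erate_discount)

# An illustrative high-poverty school at an 80% discount
# pays a 20% match on a $10,000 project:
print(round(school_share(10_000, 0.80), 2))  # 2000.0
```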

Comments on the NPRM are due to the FCC by September 16, 2013. CCG Consulting will probably be making some comments in the Docket, so if you have anything you want to say let me know and it can be included in our filing.

Is the Internet Changing the Way We Think?


Nicholas Carr published a headline piece in The Atlantic in 2008 that asked ‘Is Google Making Us Stupid?’. He expanded that article into a book, The Shallows: What the Internet is Doing to Our Brains. The basic hypothesis of the book is that the nature of the way we use the Internet is changing the way we think.

Carr looks at how we use the Internet and compares it to the way we learned before the Internet. Everybody who loves books knows that feeling where you sink so deeply into a book that your mind is fully immersed in the book’s world. To quote Carr:

Even the earliest silent readers recognized the striking change in their consciousness that took place as they immersed themselves in the pages of a book. The medieval bishop Isaac of Syria described how, whenever he read to himself, “as in a dream, I enter a state when my sense and thoughts are concentrated. Then, when with prolonging of this silence the turmoil of my memories is stilled in my heart, ceaseless waves of joy are sent me by inner thoughts, beyond expectation suddenly arising to delight my heart.” Reading a book was a meditative act, but it didn’t involve a clearing of the mind. It involved a filling, or replenishing, of the mind. Readers disengaged their attention from the outward flow of passing stimuli in order to engage it more deeply with an inward flow of words, ideas, and emotions. That was—and is—the essence of the unique mental process of deep reading.

By contrast, using the Internet is the opposite of reading a book; the experience is a million miles wide and an inch deep. The Internet purposefully interrupts you, distracts you, and gives you constant reasons not to delve deeply and think hard. Instead it is easy to flit from topic to topic, from distraction to distraction. As Carr describes it:

What the Net seems to be doing is chipping away my capacity for concentration and contemplation. Whether I’m online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

As somebody who has probably read a few thousand books in my lifetime I doubt that I am ever going to want to give up the experience of reading books. But I also now spend a lot of time on the Internet and I acknowledge that Carr is right about how I use my brain during that experience. I know when reading web news stories that I rarely even read a complete article, but rather quickly skim for the gist of it and move on.

This book makes me wonder about two things: whether there is anything harmful in the way that most people use the Internet, and in particular whether it is bad for our kids.

Carr suggests that there is something wrong with using the Internet the way we do. He says we are essentially using the Internet as our long-term memory and that it doesn’t force us to undergo the mental processes necessary to place new ideas into our own long-term memories. Reading and thinking about the ideas in a book establishes a series of long-term memories in a way that skimming news stories on the web does not. Think back on your favorite book and you will see that you have retained a lot of its details, but you will also have retained the thoughts the book invoked in you. Observing myself, I see that the same is not true of web browsing. But this ignores the huge benefit of the web, which is that it puts the information of the human race at our fingertips, meaning we can find things out faster and more accurately than ever before.

Reading books and talking to people about ideas lets your brain take the time to process ideas, form conclusions and gain a deeper understanding of your life. Says Carr, “We become, neurologically, what we think.” I have benefited from my lifelong love of reading books and I now look at the Internet as an enhancement to the way my brain already works.

But my real concern with the Internet is the effect on our kids. We are now creating the first generation of humans 2.0 who are being raised with the Internet as a constant background. Are we raising a generation of kids who cannot or will not be deep thinkers because they are not being forced to think deeply? Like any other human act, the very act of thinking deeply trains our mind to think even more deeply in the future. Are we creating a generation of kids whose brains will mimic the shallowness of the Internet and who will constantly flit from one topic to another, always ready for the next distraction? I really don’t know, but it is certainly possible and it is a bit scary.

Telecommunications Enters a New Marketing Era!

Today’s guest blog is written by Mindy Jeffries the President of Stealth Marketing. She will be writing a series of blogs that will appear here on Fridays for a while. If you want to contact Mindy you can call her at 314 880-5570. Tell her you saw her here!

In this blog post, my intent is to examine the history of telecommunications marketing so we can all have an appreciation of the work we do today, the products, and the marketing solutions in the fast-paced environment we find ourselves in. From the day I started in 1978 until today, one thing has been certain, and that is change. So this post will provide solutions and ideas on how to make that change fun and manageable.

Cable started as a technical product that solved a problem for people in places that could not receive the new invention called ‘television’. The cable industry solved a need. Today such needs are rarely present in the products that telecommunications companies market. So what started as a technical, needs-based product became more of an everyday consumer product, and a story had to be told in an effective and compelling manner to help new consumers choose which product fit their needs best. This is when it got a lot more fun for marketers. But wait, telecom companies had no marketers!

Telecom began to get more competitive and a need emerged to tell the ‘how are we different?’ story in an increasingly compelling way. Competitors came in on the television side, the phone side, and the Internet side. All of a sudden, telecom companies had competitors emerging at every door.

In the early days of cable television we told the story through products. HBO, ESPN, and other similar companies would help pay for the marketing. Our competitors started marketing with those same brand names. Cruel. Products became ubiquitous, available through all competitors. Those premium product offerings were no longer a differentiator.

Of course, a few other things happened in the world of marketing over the last 30 years. A truckload of marketing options became available to us. The marketing industry was introduced to new technology, new research entities, new methods, new philosophies, etc. In the end that yielded options, more than one way to skin a cat. More marketing options mean more places to spend your money, with a lot of variation in response rates across different audiences and different marketing methods. Sophisticated, targeted, analytical marketing became very important.

The problem became: how do we effectively differentiate in a quickly emerging telecom world? How do we tell our story, what is the target market, who is the target demo, and what is the best way to place that communication? How do we utilize all of these marketing innovations? Those are the questions we will answer over the next few weeks. Hopefully these blogs will pull back the curtain and show you the processes and strategies behind effective marketing.

The Future of TV


Laura Martin and Dan Medina of Needham & Company, an investment banking and asset management firm, have issued an analysis on the Future of TV. There has been a lot of other reporting on this analysis, most of which zeroed in on the fact that ESPN would need to charge $30 in an a la carte environment. I’ve written several other blogs about the a la carte issue and instead want to highlight some of the interesting facts from the report.

They say that TV is a bargain and that the average family spends 30 cents per hour to watch TV. This is based upon an average cost of $75 for a cable subscription and a family watching TV eight hours per day. I think they miss two points. The price of cable has grown much faster than inflation and there are now more and more homes that feel they can’t afford the subscription. If cable rates keep climbing 6% per year, in only five years this same subscription is going to cost over $100 per month. Also, there are many households that do not watch TV eight hours per day. It is these two groups that are leaving cable, the first reluctantly and the second because it no longer feels like a bargain.
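Both of those claims are straightforward arithmetic; here is a quick sketch (it assumes a 30-day month, which is why the per-hour figure comes out a touch over 30 cents):

```python
def cost_per_hour(monthly_bill, viewing_hours_per_day, days_per_month=30):
    """Effective cost per viewing hour for a household."""
    return monthly_bill / (viewing_hours_per_day * days_per_month)

def future_bill(monthly_bill, annual_growth, years):
    """Compound a bill forward at a fixed annual growth rate."""
    return monthly_bill * (1 + annual_growth) ** years

print(round(cost_per_hour(75, 8), 2))      # 0.31 -> roughly the 30 cents cited
print(round(cost_per_hour(75, 2), 2))      # 1.25 -> light viewers pay far more per hour
print(round(future_bill(75, 0.06, 5), 2))  # 100.37 -> over $100 in five years
```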

TV content is expensive to produce. The four main broadcast networks (ABC, CBS, FOX and NBC) spend an average of $2.5 million to create a prime-time hour of programming. By contrast, the 130 or so cable networks spend an average of about $100,000 per hour. But there are new rivals now producing programming. A number of companies are producing content for the web and this is expected to grow rapidly. For example, YouTube is spending about $100 million on content, Netflix $200 million and Hulu $500 million. And both AOL and Yahoo have created web ‘channels’.

They say that about 80% of content never pays for itself. The TV world is driven by hits, since hits draw the bulk of the advertising revenue. But hits are ephemeral and unpredictable. The broadcast networks have been geared for decades to produce hits, and it’s obvious that even with the money they spend it’s very hard to do. But the top shows garner the lion’s share of ad revenues. To show the power of hits, the top 1% of movie hits account for 18% of movie rentals and views.

They recognize that TV viewing is shifting in a digital age. They cite the following statistics:

  • 72% of viewers watch content only on a TV set.
  • 11% watch content only on some digital medium such as computer, pad or smartphone.
  • 17% of viewers watch some content in both ways.
  • 61% of TV watchers now use the Internet while watching TV and 10 – 25% of those viewers go to the website of the show being watched (depends upon the network being watched).
  • 29% of the viewers who use the web while watching TV are on Facebook.

The report estimates that over 1 million jobs are dependent upon the TV sector. These are mostly middle-class jobs and include cable TV installers, customer service reps, and people who work in various roles at the networks. Comcast alone has 126,000 employees. By contrast the new companies trying to make money from web content have very few employees. Hulu has 420 employees, YouTube has 650 and Netflix has 2,348. The report thinks that most of the traditional cable TV jobs are at risk if we move to an a la carte system.

The public companies in the TV sector have about $400 billion in market cap (investable securities). The report estimates that at least half of that market cap would disappear under a la carte programming. They warn that even having the government looking at a la carte programming puts these investments at risk.

These are just a few of the many facts cited in the report, which is why I have included a link to the full report for anybody who wants to read more. Oh, and at the end of the report they recommend buying CBS and AOL stock. If you buy them and it doesn’t work out, you didn’t hear it here.

Do You Understand Your Chokepoints?

Almost every network has chokepoints. A chokepoint is some place in the network that restricts data flow and degrades the performance of the network beyond that point. In today’s environment, where everybody is trying to coax more speed out of their network, these chokepoints are becoming more obvious. Let me look at the chokepoints throughout the network, starting at the customer premises.

Many don’t think of the premises as a chokepoint, but if you are trying to deliver a large amount of data, the wiring and other infrastructure at the location will restrict it. We hear constantly today about gigabit networks, but there are actually very few wiring schemes available that will deliver a gigabit of data for more than a very short distance. Even category 5 and 6 cabling is only good for short runs at that speed. There is no WiFi on the market today that can operate at a gigabit. And technologies like HPNA and MoCA are not fast enough to carry a gigabit.

But premises wiring and customer electronics can create a chokepoint even at slower speeds. It is a very difficult challenge to bring speeds of 100 Mbps to large premises like schools and hospitals. One can deliver fast data to the building, but once the data is put onto wires of any kind the performance decays with distance, and generally a lot faster than you would think. I look at the recently announced federal goal of bringing a gigabit to every school in the country and I wonder how they plan to move that gigabit around the school. The answer, mostly, is that with today’s wiring and electronics they won’t. They will be able to deliver a decent percentage of the gigabit to classrooms, but the chokepoint of wiring is going to eat up a lot of the bandwidth.

The next chokepoint for most technologies is the neighborhood node. Cable TV HFC networks, fiber PON networks, cellular data networks and DSL networks all rely on creating neighborhood nodes of some kind, a node being the place where the network hands off the data signal to the last mile. And these nodes are often chokepoints in the network due to what is called oversubscription. In the ideal network there would be enough bandwidth for every customer to use all of the bandwidth they have been sold simultaneously. But very few network operators want to build that network because of the cost, and so carriers oversell bandwidth to customers.

Oversubscription is the practice of selling the same bandwidth to multiple customers, since we know statistically that only a few customers in a given node will be making heavy use of the network at the same time. Effectively a network owner can sell the same bandwidth many times over, knowing that the vast majority of the time it will be available to whoever wants to use it.

We are all familiar with the chokepoints that occur in oversubscribed networks. Cable modem networks have been infamous for years for bogging down each evening when everybody uses the network at the same time. And we are also aware of how cell phone and other networks get clogged and unavailable in times of emergencies. These are all due to the chokepoints caused by oversubscription at the node. Oversubscription is not a bad thing when done well, but many networks end up, through success, with more customers per node than they had originally designed for.
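A toy simulation makes the oversubscription math concrete. All of the numbers here are hypothetical (the node size, the per-customer speed, and the chance a customer is busy at peak are illustrative, not figures from any real network), but the pattern holds: the worst simultaneous demand observed stays well below the total bandwidth sold:

```python
import random

def worst_peak_mbps(customers, speed_mbps, p_active, trials=10_000, seed=1):
    """Worst simultaneous demand seen across many simulated peak hours,
    assuming each customer is independently active with probability p_active."""
    random.seed(seed)
    worst = 0
    for _ in range(trials):
        active = sum(random.random() < p_active for _ in range(customers))
        worst = max(worst, active * speed_mbps)
    return worst

# A node of 200 customers each sold 50 Mbps adds up to 10,000 Mbps 'sold',
# but if only ~10% of customers are busy at once, the observed peak
# is a small fraction of that.
sold = 200 * 50
print(sold, worst_peak_mbps(200, 50, 0.10))
```

The trouble the post describes starts when a successful node adds customers (or p_active creeps up each evening) faster than the shared pipe is upgraded.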

The next chokepoint in many networks is the backbone fiber electronics that deliver bandwidth from the hub to the nodes. Data usage has grown at a very rapid pace over the last decade and it is not unusual to find backbone data feeds where today’s usage exceeds the original design parameters. Upgrading the electronics is often costly because in some networks you have to replace the electronics at all of the nodes in order to fix the ones that are full.

Another chokepoint in the network can be hub electronics. It’s possible to have routers and data switches that are unable to smoothly handle all of the data flow and routing needs at the peak times.

Finally, there can be a chokepoint in the data pipe that leaves a network and connects to the Internet. It is not unusual to find Internet pipes that hit capacity at peak usage times of the day which then slows down data usage for everybody on the network.

I have seen networks that have almost all of these chokepoints and I’ve seen other networks that have almost no chokepoints. Keeping a network ahead of the constantly growing demand for data usage is not cheap. But network operators have to realize that customers recognize when they are getting shortchanged and they don’t like it. The customer who wants to download a movie at 8:00 PM doesn’t care why your network is going slow because they believe they have paid you for the right to get that movie when they want it.

The Internet of Things is Here Today

Consider the following pricing chart from Vivint, one of the nationwide leaders in home security. This particular pricing chart happens to come from Fort Wayne, Indiana.

 Comparison Chart

This may not look like it, but this is the beginning of the Internet of Things and I think the way that Vivint has packaged this is brilliant. Just a few years ago this company and every company in the security business would have been selling only the features in the first column. But now they have added on energy management and home automation which are the first steps into the Internet of Things. To make this work they will install a gateway in the home that is capable of monitoring or communicating with the devices in the home and also communicating back with the cloud.

This is just the beginning. As more home-ready services are created, Vivint will certainly add some of them as enhancements to the packages listed or will create new packages. The next big field is already hinted at in the last item, the medical pendant. We are not too far away from the time when sensors will be able to monitor your health and keep a constant record of your heartbeat, blood pressure and other vital signs. And a few years after that, micro sensors will be in your blood looking at your blood chemistry, looking for cancer, etc.

A company like Vivint will have to decide which things it will support, because the scope of the Internet of Things will become immense. It’s been predicted that much of the Internet of Things will be handled by apps. But households will still need the gateway and will want an expert to make sure things like security and smoke alarms are connected properly. I see a prominent role for businesses willing to go into the home to make sure that everything works well together.

Since there will be so many options in the Internet of Things, it’s likely that a carrier will choose a few standardized packages that fit a large percentage of the population and will leave customized packages to somebody else. For example, even today there are a ton of options available in the energy management field and Vivint has chosen a few common ones. Today a household can also control blinds to allow or block sunlight, coordinate ceiling fans, change the hot water heater settings dynamically during the day, and interface with external solar panels.

I believe a lot of homes are going to want these services. I also know that customers will choose somebody they know and trust if given a choice of vendors. The Internet of Things is going to grow over time while traditional services like voice and cable TV wane. If you are going to survive as a carrier selling to households, then selling the Internet of Things needs to be in your portfolio.

Remember the White Pages?


Earlier this year the Virginia State Corporation Commission granted an interim waiver allowing Verizon to stop distributing residential white pages. This makes Virginia one of the last states to do this. The waiver came with the same kinds of requirements that we’ve seen in other states. Verizon must make sure that the information that was in the white pages is available on its website and on the website of SuperMedia. Consumers who still want the white pages must be able to order them in either paper or CD format.

Most of the states have allowed the larger LECs like Verizon and AT&T to stop delivering white page directories with the same sorts of caveats. AT&T has reported that in the states where it has been able to get out of the white pages business, only about 2% of customers still ask for a paper copy of the book. All of the phone companies are still publishing business white pages, and they report that there is good demand for those listings.

The push to end the white pages was driven by both the phone companies and consumer groups. Thinking about it made me remember a funny YouTube video from 2008.

We are certainly only a few years away from a time when white pages will be a memory shared only by us old-timers. Back in 2008 a Harris poll showed that only 11% of households had any interest in the white pages in paper or even online format. One has to imagine that the growth of cell phones since then has nearly eliminated that need, since our cell phones now act as our personal directories of the people we want to remember.

Consumer groups have now turned their attention to the yellow pages. Since the yellow pages industry makes a huge profit, the telcos don’t agree with any push to ban them. The Local Search Association (formerly the Yellow Pages Association) is the national trade group representing the publishers of yellow pages. It has created a system in most places where customers can opt out of receiving yellow pages. Consumers can go to https://www.yellowpagesoptout.com/ and opt out of yellow pages for three years at a time.

Unless a telco seeks permission from the state commission to get out of the white pages business, or shares white pages with a larger LEC, it is still required to publish them. I still have a lot of clients that publish their own directories that include residential white pages. But most of these directories are not the giant doorstops published in metropolitan areas. Instead they are small local books that combine the white and yellow pages and are mostly still well received by customers.

A lot of the yellow pages business has moved online and the industry is now embroiled in the same kinds of issues that affect other companies that live on advertising, like Google. A big push this year is for Do Not Track legislation that would give consumers the ability to opt out of being tracked by web advertisers. One thing about the paper yellow pages was that they didn’t track who you were and what you searched for.

Is There any Life Left in Copper?


Copper is still a very relevant technology today; looked at on a global scale, nearly two-thirds of all broadband subscribers are still served by copper. That percentage is smaller in the US, but this country has a far more widely deployed cable TV system than most of the rest of the world.

The most widely deployed DSL technologies today are ADSL2 and VDSL. In theory these technologies can reach speeds of up to about 40 Mbps. But depending upon the gauge, age and condition of the copper, many actual deployments are closer to 20 Mbps than the theoretical 40 Mbps.

ADSL2 and VDSL have been widely deployed by AT&T in its U-verse product, which serves over 7 million data customers and over 4.5 million cable TV customers. AT&T has made the product available to over 24 million homes. AT&T can support the product up to about 3,500 feet on a good single copper pair and up to 5,500 feet using two bonded copper pairs.

And ADSL2 is a pretty decent product. It can deliver IPTV and still support an okay data pipe. However, as the cable companies are finding ways to get more bandwidth out of their coaxial cable and as new companies are deploying fiber, these DSL technologies are going to again fall behind the competition.

So what is out there that might resurrect copper and deliver speeds faster than ADSL2? Not too long ago I wrote a blog about G.Fast, which is Alcatel-Lucent’s attempt to find a way to get more speed out of legacy copper networks. In recent field tests ALU achieved a maximum speed of 1.1 Gbps over 70 meters and 800 Mbps over 100 meters on brand-new copper. On older copper the speed dropped to 500 Mbps over 100 meters.

However, G.Fast’s distance limitations are far shorter than ADSL2’s, and G.Fast is really more of a drop technology than a last-mile technology; it would require a telco like AT&T to build a lot more fiber to get even closer to houses. You have to wonder if it makes any sense to rebuild the copper network to get up to 500 Mbps out of copper when fiber could deliver many gigabits.

There are other technologies that have been announced for copper. Late last year Genesis Technical Systems announced a scheme to get 400 Mbps out of copper using a technology they are calling DSL Rings. This technology would somehow tie 2 to 15 homes into a ring and bridge them with copper. Details of how the technology works are still a little sketchy.

In 2011 the Chinese vendor Huawei announced a new technology that will push up to 1 gigabit for 100 meters. This sounds very similar to G.Fast and sounds like a way to use existing copper within a home rather than rewiring.

There is one new technology that is finally getting wider use: bonded VDSL2 pairs that use vectoring. Vectoring is a noise-cancellation technology that works much the way noise-cancelling headphones eliminate sound interference. Vectoring eliminates most of the noise between bonded pairs of copper. Alcatel-Lucent hit the market with bonded-pair VDSL2 in late 2011 that can deliver up to 100 Mbps. However, in real deployments speeds are reported to be 50 Mbps to 60 Mbps on older copper. That is probably enough speed to give DSL another decade, although doing so requires a full replacement of older DSL technology with VDSL2. One has to wonder how many times the telcos will keep upgrading their copper electronics to get a little more speed rather than taking the leap to fiber as Verizon did.
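The speed and reach numbers quoted in this post line up into a clear pattern: each jump in copper speed comes with a steep cut in distance. Here are the approximate figures cited above gathered in one place (these are the rough numbers from this post, not a spec sheet):

```python
# Approximate figures cited in this post: technology -> (speed, reach/notes)
copper_speeds = {
    "ADSL2/VDSL (theoretical)":   ("40 Mbps",  "3,500-5,500 ft per AT&T U-verse"),
    "VDSL2 bonded + vectoring":   ("100 Mbps", "50-60 Mbps real-world on older copper"),
    "Huawei (2011 announcement)": ("1 Gbps",   "100 meters"),
    "G.Fast, new copper":         ("1.1 Gbps", "70 meters"),
    "G.Fast, older copper":       ("500 Mbps", "100 meters"),
}

for tech, (speed, reach) in copper_speeds.items():
    print(f"{tech:28} {speed:>9}  {reach}")
```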

One only has to look at the growth rate of data used in homes and ask how long copper can remain relevant. Within a few short decades we have moved from homes getting by on dial-up to homes finding a 20 Mbps connection too slow. Looking just a few years forward we see the continued growth of video sharing and a lot of new traffic from cellular femtocells and the Internet of Things. It won’t be long before people are bemoaning the inadequacy of their 50 Mbps connections. That day is coming and probably is not more than a decade away.

Home Automation as a Carrier Product


As a gadget guy I am interested in home automation. I stayed at the Hyatt at the Dallas airport last year, which has automated rooms. I spent a great hour playing with the blinds, lighting and temperature from my bed. For a gadget nut this hotel earned a wow.

And a lot of people are interested in automating their homes to some degree. The problem they run into is that once they start investigating home automation they find a ton of different devices on the market, almost all from brands they have never heard of. And so they have no idea how to get started.

And this is why home automation is a product opportunity for carriers. As you probably know from reading this blog, I think that if you are a full-service provider you need to take every opportunity to get into your customers' homes. Meeting and talking with your customers benefits you in many ways. First, every visit is an opportunity to upsell them. Second, they are able to put a face with your company, so that you are not just another company they send monthly checks to. And this leads to loyalty from customers and less churn.

I have one client who has already seen the wisdom of installing home automation systems. He did his research and picked a platform that can handle a number of devices and that looks expandable into the future. But these are the early days of home automation, and he is not wedded to that system; he will consider a better one if it comes along.

And here is how he sells it. He will sell the equipment directly to a customer, but he would prefer that they lease it over time by signing a term contract. He makes more money on the lease and customers find it easier to pay over time. He then charges a fee to install the system to cover his technician's time. Finally, he offers a monthly fee that covers the labor cost of adding additional devices onto the system later. That fee basically has the customer paying you to come sell them more hardware in the future.
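
The arithmetic behind preferring the lease is easy to sketch. All of the numbers below are hypothetical, not my client's actual pricing:

```python
def lease_vs_sale(equipment_cost, sale_markup, monthly_lease,
                  term_months, install_fee, monthly_support):
    """Compare total revenue from an outright sale versus a
    term lease, with the same install fee and support plan."""
    sale_total = (equipment_cost * (1 + sale_markup) + install_fee
                  + monthly_support * term_months)
    lease_total = (monthly_lease * term_months + install_fee
                   + monthly_support * term_months)
    return sale_total, lease_total

# Hypothetical: a $600 system sold at a 30% markup, or leased at
# $35/month for 36 months; $150 install, $10/month support plan.
sale, lease = lease_vs_sale(600, 0.30, 35, 36, 150, 10)
print(round(sale, 2))   # 1290.0
print(round(lease, 2))  # 1770
```

Under those made-up numbers the 36-month lease brings in roughly a third more revenue than the outright sale, before even counting the future hardware add-ons the support fee sets up.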

So what does he automate? There are a few obvious things. You connect the system to the thermostat so that customers can easily change the temperature by time of day, both for comfort and to save money. And there is the old standby of putting light switches on the system so that lights can be set to turn on and off when you wish.

But with a good home automation system you can also tie into security systems, irrigation systems, audio-visual systems, and a host of other devices like alarm clocks, smart door locks, blinds, coffee pots, you name it. With the advent of the Internet of Things, more and more devices in your house are going to have a WiFi or Bluetooth interface.

A home automation system can save customers money. For instance, along with controlling the thermostat a customer can tie the system into smart blinds. The blinds can raise and lower at pre-set times to welcome the day, but more importantly to save energy by selectively blocking or letting in the sun depending upon the time of day and time of year.

You can also use motion detectors in the system so that a room responds when you enter by turning on the lights and playing your streaming Frank Sinatra. The number of options is almost unlimited, and that is what homeowners find both intriguing and daunting. There are a ton of home automation systems on the market that will easily do things like automate the lights, but it takes programming to do the more complicated (and fun!) stuff. It's a little more complicated if you want your house to remind you that tomorrow is your anniversary.
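
That "programming" is mostly event-driven rules: a sensor fires an event and the controller runs whatever actions are bound to it. Here is a minimal sketch of the pattern, with made-up device names and events standing in for a real controller's API:

```python
# Minimal event-driven rule engine of the kind a home automation
# controller uses: named events trigger their bound actions.
rules = {}   # event name -> list of actions to run
log = []     # records what "ran", standing in for real devices

def on(event, action):
    """Bind an action to an event."""
    rules.setdefault(event, []).append(action)

def fire(event):
    """Run every action bound to the event."""
    for action in rules.get(event, []):
        log.append(action())

# Hypothetical bindings: walking into the den turns on the
# lights and starts the music; the front door sends a text.
on("motion:den", lambda: "lights:den on")
on("motion:den", lambda: "audio:den play Sinatra")
on("door:front", lambda: "sms: someone at the front door")

fire("motion:den")
print(log)  # both den actions ran; the door rule did not
```

Real systems dress this up with schedules, conditions, and device drivers, but the event-to-actions mapping is the core of nearly every automation platform.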

And the systems can all be accessed from the customer's smart phone. The beauty of this is that you can also pre-set alarms. For instance, a customer can have the house tell them if the temperature rises above or falls below a pre-set range. They can have the house send them a text every time somebody comes to the front door. They can check in to see that the pets or the kids aren't killing each other just yet.

Not all home automation is serious. There are silly devices available that can be tied into these systems. Just last week I saw an egg tray that will tell you how many eggs you have left in the fridge. Doesn’t make sense to me, but if a customer wants that, then let’s make it work!

Is Wireless Really Better than Wires?

A rural area west of Route 41 and Lowell, Indiana. (Photo credit: Wikipedia)

It is clear that the FCC prefers wireless as the broadband solution for rural areas. It seems like they badly want every rural household in the country to get some kind of broadband just so they can take this issue off their plate. Just about every bit of policy decided in the last few years has had a bias towards wireless.

For instance, the historic Universal Service Fund, which was used to promote rural telephony over copper, has been transitioned into the new Connect America Fund (CAF) that will instead promote high-speed data in rural areas. There are several aspects of the CAF that will clearly ensure the funds go mostly to wireless carriers. The bulk of the funding will eventually be distributed by reverse auction: an auction where the broadband providers in a given area compete for the funding, and the one who bids the lowest amount of subsidy per customer receives the funds.

The first time I read the reverse auction rules, my first thought was that this money is all going to the wireless companies. The rules strongly favor companies who can provide data over large areas, so any smaller company that wants CAF funds to help pay for a rural wired network can be undercut by the largest wireless companies. AT&T Wireless and Verizon Wireless are two of the richest and most successful companies in the country. They pay many billions of dollars in dividends annually and can afford to underbid any rural landline company for subsidy, simply because they do not need it. But of course they will bid in the reverse auctions and take the subsidies, because the rules allow them to.
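
Mechanically, a reverse auction like this is trivial: for each area, the provider asking for the least subsidy per customer wins, which is exactly how a deep-pocketed bidder who doesn't need the money can always take the funding. A toy sketch with invented bids:

```python
def reverse_auction(bids):
    """bids: {provider: requested subsidy per customer}.
    The lowest ask wins the funding for the area."""
    return min(bids, key=bids.get)

# Hypothetical bids for one rural area. A large wireless carrier
# that doesn't actually need the subsidy can simply lowball the
# landline company's true cost of a wired build.
bids = {
    "rural_telco_fiber": 48.00,  # real per-customer cost of fiber
    "big_wireless": 12.00,       # can afford to underbid
}
print(reverse_auction(bids))  # big_wireless
```

The auction optimizes for the smallest subsidy, not the best network, so the wired bidder loses even when its network would serve the area better.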

There are also parts of the CAF that can be used to build new broadband infrastructure, and these funds also favor wireless companies. The funds get distributed by complicated rules that are biased toward getting broadband to customers at the lowest cost per subscriber. And of course, there is no cheaper way to cover a large rural footprint than with wireless, so wireless companies are also going to get a lot of this infrastructure funding.

Meanwhile, AT&T recently told the FCC that they are going to introduce a plan to drop copper service for 'millions' of rural subscribers. If they are successful, those rural subscribers can expect to be told to get cell phones rather than landlines. For voice telephony this might not be such a bad thing. But do we really want to relegate a large swath of the US geography to having only cellular data?

Today there is clearly a broadband gap with some rural areas still stuck with dial-up Internet access. And so getting them some kind of faster data seems like a reasonable plan. The FCC has set the definition of broadband to be the capability of receiving 4 Mbps download. And it’s obvious that they set that limit with rural areas in mind.

And so over the next decade more and more of rural America will be getting cellular data that will meet, or come close to meeting the FCC’s definition of broadband. But meanwhile, the cities have already far surpassed those speeds. There are very few cities left where the average home can’t get speeds of between 10 Mbps and 20 Mbps. There are usually cheaper alternatives in the range of 5 Mbps to 7 Mbps, but the faster speeds are widely available. And many places have much faster speeds available.

The FCC itself has promoted the availability of gigabit bandwidth and companies are responding. Google is bringing this speed to Kansas City, Austin and Provo and AT&T has promised to match them in Austin. CenturyLink is bringing a gigabit to Omaha. And a number of smaller municipal and commercial providers have brought gigabit speeds to other towns and cities scattered across the country. And one can expect the gigabit movement to grow rapidly.

It’s common knowledge that household use of bandwidth has continued to grow and there is no end in sight for that growth. As networks provide more capacity, households find ways to use it. Video has been the recent reason for the explosion in data usage, and now we can see that the Internet of Things will probably be the next big bandwidth driver.

Have we really solved the rural bandwidth gap if people in those areas will have 4 Mbps data speeds while urban areas have a gigabit? Obviously the rural areas will continue to be left behind, and they will fall even further behind than they are today. Just a few years ago the rural areas had dial-up while the cities had maybe 5 Mbps. But a gap between a rural world at single-digit megabit speeds and cities at gigabit speeds is a much larger gap, and the rural areas will not be able to share in the benefits that bandwidth will bring.
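
The widening of the gap is easy to quantify with round numbers (treating dial-up as roughly 0.05 Mbps, and using the speeds cited above):

```python
def gap_ratio(urban_mbps, rural_mbps):
    """How many times faster the urban connection is."""
    return urban_mbps / rural_mbps

# A few years ago: ~5 Mbps cities vs ~0.05 Mbps rural dial-up.
print(round(gap_ratio(5, 0.05)))   # 100
# Ahead of us: gigabit cities vs 4 Mbps rural 'broadband'.
print(round(gap_ratio(1000, 4)))   # 250
```

So even after rural areas reach the FCC's 4 Mbps definition of broadband, the relative gap is larger than it was in the dial-up era.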

The only long-term solution is to build fiber to rural America. Obviously nobody is going to build fiber to single homes at the top of mountains or at the end of ten-mile dirt roads, but I have been working on business plans that show that fiber can make sense in the average rural county. But it is really hard to get rural fiber funded, since such projects tend to just pay for themselves and are not wildly profitable.

It’s possible that the FCC’s universal service plans will work and that a lot of the 19 million rural people without broadband will get some sort of rudimentary broadband. But meanwhile, the rest of the country will be getting faster and faster bandwidth. And so, before the FCC declares 'mission accomplished', I think we need to have more of a debate about the definition of broadband and what is acceptable. I hate to tell the FCC, but the rural broadband issue is not going to go away even after rural areas all have cellular data.