Regulatory Alert: Rural Call Completion


The FCC took action on October 28 to address a growing problem of calls that are not completed to rural areas, adopting new rules aimed at remedying the situation.

The FCC noted that the situation was “serious and unacceptable” and that every call that is placed should be completed. The Commission wrote: “Whatever the reason, the consequences of failed calls can be life-threatening, costly, and frustrating. Rural businesses have reported losing customers who couldn’t call in orders, while families attempting to contact elderly relatives have worried when they hear a ring – but no one picks up on the other end because the call never actually went through.”

The FCC offers several likely reasons for uncompleted calls:

  • They think that some providers are not routing calls to rural areas in order to avoid higher-than-average terminating access charge rates. Access rates in rural areas are still much higher than rates in major metropolitan areas, which reflects the higher cost of doing business there. Terminating rates can still be as much as two cents per minute higher. However, the FCC has always insisted that every call must go through, and if it ever got evidence of a specific carrier boycotting an area due to high rates I suspect it would levy heavy fines.
  • They think that much of the problem is due to the fact that calls can be routed through multiple carriers. They note that the best industry practice is to limit the number of intermediate carriers involved in routing a call to two. There are a lot of new carriers in the market today, such as the many companies marketing voice services like IP Centrex, that search for the lowest-cost way to route calls. One has to suspect that the long-distance carriers underlying some of these providers have gotten very creative in routing calls to save costs.
  • Some carriers have been sending a ring tone to the calling party before the call has actually been completed. One has to suspect this is done so that the caller can’t hear all of the intermediate switching going on. The problem is that the caller will often hang up after a few unanswered rings, before the call has ever been completed.

The FCC took several concrete steps to fix the problem. These new rules will be effective in a few weeks once the final rules are published. The new rules are:

  • False audible ringing is prohibited, meaning that a telephone provider cannot send a ringtone to the caller until the call has actually been answered.
  • Carriers with over 100,000 voice lines that are the carrier determining how calls are routed must collect and retain calling data for a six-month period.
  • Carriers who can certify that they follow best industry practices, such as not routing calls through more than two intermediate carriers, will be able to get a waiver for some or all of the storage and reporting requirements.
  • Carriers who can demonstrate that they have all of the mechanisms in place to complete rural calls can also ask for a waiver from the storage and reporting requirements.

Do the Cloud Guys Get It?


I just read an article this week that cites five reasons why cloud computing isn’t taking off as fast as the companies selling the solution had hoped. The reasons unfortunately make me feel like the cloud industry folks are out of touch with the real world. This is not an uncommon phenomenon, in that high-tech industries are run by innovators, and innovators often don’t understand why the rest of the world doesn’t see things with the same clarity that they do.

Following are the five reasons cited in the article about why cloud computing is not selling as fast as hoped, with my observations after each point.

The Organization. Organizations often are structured in a way that does not make the kind of shift to cloud easy. For instance, IT shops are often organized into separate groups for compute, network and storage.

Changes that affect people are never easy for companies. Going to the cloud is supposed to save a lot of labor costs for larger companies, but that is not necessarily the case for smaller companies.  But even larger companies are going to take a while to make sure they are not walking off a cliff. Every old-timer like me remembers a few examples of where major technology conversions went poorly, and nobody wants to be the one blamed if a big conversion goes wrong.

Security. Companies are afraid that the cloud is not going to be as safe as keeping all of their data in-house.

Everything I have read says that, done right, the cloud can be very secure. However, the fear is that not every conversion is going to be done right. You can place your bets with me now: sometime in the next year or two there is going to be a major ugly headline about a company that converted to the cloud poorly, leading to a major breach of customer records. The problem is that everybody is human, and not every cloud company is going to do every conversion perfectly.

Legacy Applications. Cloud companies want you to get rid of legacy systems and upgrade to applications made for the cloud.

This is where cloud companies just don’t get it. First, almost every company uses a few legacy systems that are not upgradable and for which there is no cloud equivalent. Every industry has some quirky homegrown programs and applications that are important for their core business. When you tell a company to kill every legacy application most of them are going to rightfully be scared this is going to create more problems than it solves.

Second, nobody wants to be automatically upgraded with the latest and greatest software. It’s a company nightmare to come in on a Monday and find out that the cloud provider has upgraded everybody to some new Microsoft version of Office that is full of bugs and that everybody hates and that brings productivity to a halt. Companies keep legacy systems because they work. I recently wrote about the huge number of computers still running on Windows XP. That is how the real world works.

Legacy Processes. In addition to legacy software, companies have many legacy processes that they don’t want to change.

Honestly this is arrogant. Companies buy software to make what they do easier. To think that you need to change all of your processes to match the software is really amazingly out of touch with what most companies are looking for. Where a cloud salesman sees ‘legacy system’ most companies see something that works well and that they took years to get the way they want it.

Regulatory Compliance. Companies are worried that the cloud is going to violate regulatory requirements. This is especially true for industries such as financial, health and the power industries.  

This is obviously a case-by-case issue, but if you are in one of the heavily regulated industries then this has to be a significant concern.

I hope this doesn’t make me sound anti-cloud, because I am not. But I completely understand why many companies are going to take their time considering this kind of huge change. No product ever made should fail to take its customers into consideration. When I see articles like this I feel annoyed, because the gist of the article is, “Why won’t these dumb customers see that what I have is good for them?” That is never a good way to get people to buy what you are selling.

Converting to IPv6

By now most of you know that there is a new version of Internet addressing that has been introduced known as IP version 6 (IPv6). The process to integrate the new protocol into the network has already begun and it’s now time for smaller ISPs like my clients to begin looking at how they are going to make the transition. I call it a transition because the planned process is for the old IPv4 and IPv6 to coexist side-by-side until the old protocol is eventually phased out of existence. Some experts predict that the last vestiges of IPv4 addressing will survive until 2030, but between now and then every part of the Internet will begin the transition and will begin using the new address scheme.

The IPv6 specification makes major changes to internet addressing. Not only has the IP address length been extended to 128 bits but also the IP header format and the way header information is processed have been modified. Thus, transitioning from IPv4 to IPv6 is not going to be straightforward and it is going to take some work to go from old to new.
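To put the 128-bit change in scale, a quick back-of-the-envelope comparison (a sketch in Python; the numbers follow directly from the address lengths):

```python
# Compare the IPv4 and IPv6 address spaces by size.
ipv4_space = 2 ** 32     # 32-bit addresses
ipv6_space = 2 ** 128    # 128-bit addresses

print(f"IPv4: {ipv4_space:,} addresses")
print(f"IPv6: {ipv6_space:,} addresses")
print(f"IPv6 is {ipv6_space // ipv4_space:,} times larger")
```

That ratio, 2^96, is why the transition is worth the trouble even though it touches every router and host.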

I think it is time to start thinking about how you are going to make the transition to enable both kinds of routing. Any small ISP will want to do this in a controlled and leisurely manner rather than waiting until there is an urgent need on the network. There are already some new kinds of hardware and software systems that prefer to use the new protocol, so small ISPs ought to get ready to make the change before getting a frantic call from a large customer asking why something doesn’t work on the network.

The basic process of getting ready to migrate to IPv6 is to make certain that the core routers and other host systems in your network can handle IPv6 routing. There are three different transition techniques being used around the country.

Dual-stack Network. This approach has hosts and routers implement both the IPv4 and IPv6 protocols, which lets your network support both during the transition period. This is currently the most common technique for introducing IPv6 into legacy networks. The biggest downside of the approach is that you must maintain a mirror-image IPv4 address for every new IPv6 address, and the whole point of moving to IPv6 was the scarcity of IPv4 addresses.
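On the host side, dual-stack support can be as simple as opening one IPv6 socket that also accepts IPv4 clients. Here is a minimal sketch in Python (whether one socket can serve both protocols depends on the operating system, though Linux, macOS and Windows all support it; router and addressing work is of course the larger job):

```python
import socket

# One listening socket that serves both IPv4 and IPv6 clients.
sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)

# Clear IPV6_V6ONLY so IPv4 clients can reach this socket too; they
# show up as IPv4-mapped IPv6 addresses (::ffff:a.b.c.d).
sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)

sock.bind(("::", 0))   # "::" = all interfaces; port 0 = any free port
sock.listen(5)
print("listening on", sock.getsockname()[0], "port", sock.getsockname()[1])
sock.close()
```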

Tunnelling. This technique essentially hands off all new IPv6 routing to somebody else in the cloud. To make this work your network would encapsulate IPv6 packets while they are crossing your existing IPv4 network and decapsulate the packets at the border to the external cloud. This is somewhat complex to establish but reports are that it can work well when configured correctly.

Use a Translation Mechanism. This method is necessary when an IPv6-only host has to communicate with an IPv4 host. At a minimum this requires translation of the IP header packets, but it can get a lot more complicated.
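The best-known translation scheme today is NAT64, which embeds each IPv4 address in the low 32 bits of a reserved /96 IPv6 prefix (64:ff9b::/96 is the well-known prefix from RFC 6052). The address-mapping half of the job can be sketched in a few lines of Python; a real translator also rewrites headers and tracks connection state, which is where the complexity lives:

```python
import ipaddress

# Well-known NAT64 prefix (RFC 6052): the IPv4 address occupies the
# low 32 bits of this /96 IPv6 prefix.
PREFIX = ipaddress.IPv6Network("64:ff9b::/96")

def embed(v4: str) -> ipaddress.IPv6Address:
    """Map an IPv4 address into the NAT64 prefix."""
    return ipaddress.IPv6Address(
        int(PREFIX.network_address) | int(ipaddress.IPv4Address(v4))
    )

def extract(v6: ipaddress.IPv6Address) -> ipaddress.IPv4Address:
    """Recover the embedded IPv4 address from the low 32 bits."""
    return ipaddress.IPv4Address(int(v6) & 0xFFFFFFFF)

mapped = embed("198.51.100.9")   # a documentation-range address
print(mapped)                    # 64:ff9b::c633:6409
print(extract(mapped))           # 198.51.100.9
```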

And, as one would suspect, you can mix and match these various techniques as needed. It’s obvious to me that this could become very complex and there appears to be a lot of chances to mess up your network routing if you don’t do it right. Because of this we think it makes sense to start planning early on how you are going to make the transition. You do not want to wait until one or more of your largest customers are breathing down your neck demanding a transition, so you should start early and make a plan. Contact us at CCG and we can help you make a plan for an orderly transition to IPv6.

Make Being Local Work for You

market 1

market 1 (Photo credit: tim caynes)

Today’s guest blog is written by Mindy Jeffries of Stealth Marketing. She will be writing a series of blogs that will appear here occasionally. If you want to contact Mindy you can call her at (314) 880-5570. Tell her you saw her here!

I look at small telephone companies and as a marketer I see tremendous marketing potential due to their advantage of being local. I would have a blast with marketing in these markets. Here are the questions I would ask myself and my team:

  • What is going on local in my community?
  • Can I create something that would be a resource to my community?
  • What could I do to bring my community together in the new virtual world? What could you do that is useful from the customer’s perspective?

I would find something the community needs, such as a listing of local events, and get it on the web. Then, using social media, you advertise the local application you have created and the content within. This will start bringing people to your site to check out the latest news on what to do around town, or the weather, or whatever you choose.

Once you get your current and potential customers coming to your website or social media site for useful information, then the next step is to ask them for their email addresses. At this point, you don’t care if these people are customers or not, just provide each person with useful information. As you create value, your prospects and potential prospects will give you their information including email addresses because they want to interact with you.

Then you start housing this information in a database application that can automate, score and deliver very customer-specific news and offers to your prospects and current customers.  Your prospects get offers for new services, and your current customers get retention offers, or news on programing or movies or VOD coupons.

Start simple, but there are lots of ways to take a program like this to the next level.  You can incorporate your advertising clients and distribute their offers as well. This could be your retention program!  Some examples might be coupons for the pizza provider in town or coupons for the local theater.

Your imagination can run wild, but today’s digital environment exists to help you organize and filter messages and marketing. Social media have changed the world. Instead of always talking about yourself and your services, you need to look at the world through a customer/potential customer lens and ask the question from their perspective: what can I do, through the resources I have, to make myself useful to them?

It’s an exciting time to be a marketer!


The End of Special Access?


For those not familiar with the term, special access refers to selling traditional data pipes on the TDM telecom networks. These are circuits like T1s and DS3s. While one might think the world had transitioned to ethernet circuits there are still huge numbers of these traditional circuits being sold in the world.

In many cases the traditional circuits, especially T1s, are being sold because of a lack of fiber in the distribution plant. TDM data circuits can still be delivered over copper in many cases and are often the only way for a business stuck on copper to get faster data speeds.

AT&T recently announced that they were going to do away with all of their long-term discounts on these traditional TDM circuits. Customers and other carriers have been used to buying these products with a significant discount for signing up for long periods of time. There have been discounts offered for agreements to buy for up to seven years. And these discounts have teeth since there are significant penalties for breaking the contracts. As of November 9 AT&T will not be signing any contracts with terms longer than three years.

AT&T says the reason they are doing away with the discounts is that they are going to discontinue TDM special access by 2020. However, that rings untrue, since somebody could still sign a 5-year or 7-year contract today and have that contract finish on or before 2020.

Several of AT&T’s competitors filed a letter of complaint with the FCC this month about the cessation of the term discounts, including Sprint, tw telecom, CBeyond, EarthLink, Level3 and Megapath. These carriers say that eliminating the discounts is anticompetitive, since they are in direct competition with AT&T and are the primary purchasers of special access circuits.

Sprint says that eliminating the term discounts will increase the prices they pay and ultimately affect what customers pay. They say that in the worst cases their costs will rise 24%.

If you have been following this blog you know I have reported that AT&T has been positioning itself to get out of the TDM business. They want to convert all data circuits to ethernet as part of their ‘Project VIP’ initiative. But they also want to get homes and small businesses off of copper, and in many cases replace them with cell phones. The FCC has not given AT&T permission to do this anywhere, yet they keep moving toward that goal.

The biggest problem I see with trying to eliminate TDM data circuits, particularly T1s, is that the customers who use them often are in parts of the network that don’t have fiber alternatives. It’s nice for AT&T to be able to talk about offering only ethernet, but in many cases this is going to result in customers losing what little data they are able to buy today.

There are still huge numbers of T1s that are used to support PBXs and small company WANs for functions like data back-up. It’s hard to picture what a customer will do if the copper goes away and they are expected to somehow perform those functions using cellular data – with data plans that are likely to be capped. We tend to think of a T1 these days as a small data pipe. But if you are using it for data backup, a T1 can transmit a lot of data during a month’s time.
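The arithmetic backs that up. A T1 runs at 1.544 Mbps, and at full utilization the monthly total is surprisingly large (a quick sketch, assuming a 30-day month and round-the-clock use):

```python
# How much data a T1 can move in a month at full utilization.
t1_bps = 1_544_000                 # T1 line rate: 1.544 Mbps
seconds_per_month = 30 * 24 * 3600

total_bytes = t1_bps / 8 * seconds_per_month
print(f"about {total_bytes / 1e9:.0f} GB per month")  # about 500 GB
```

Even a backup job that runs only overnight moves far more data per month than a capped cellular plan would allow.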

The FCC is in the middle right now of looking at special access issues. They have issued a request for data from the industry that will hopefully help them understand the state of the current TDM data market. I think they are going to find that the market is still a lot larger than AT&T wants them to think.

The Future of Interconnection


AT&T and Public Knowledge both testified yesterday at a House Communications Subcommittee hearing about the transition of today’s PSTN to an all-IP network.

Both parties agreed that there were five areas that must be addressed to maintain a functional telephone network:

  • Service for everybody
  • Interconnection and competition
  • Consumer protection
  • Reliability
  • Public Safety

I want to look a little more at the issue of interconnection and competition. Today a large percentage of my clients have interconnection agreements with the incumbent telephone companies. Most of my clients are CLECs but a few are wireless carriers, and each negotiates interconnection under a different set of FCC rules.

Interconnection is vital to maintain competition. Interconnection basically covers the rules that define how voice traffic gets from one network to another. The agreements are very specific and each agreement defines precisely how the carriers will interconnect their networks and who will pay for each part of the network.

For the most part, the interconnection rules adopted as part of the Telecommunications Act of 1996 work well, and there are probably over 2,000 companies using these agreements to interconnect with each other.

There is a real danger that changing the interconnection rules could harm competitive companies or force them out of the market. Let me revisit a little history to show what I mean. Long ago the FCC decided that interconnection for local calls between incumbents should be free, and so incumbent telephone companies don’t charge each other to exchange local minutes. However, I can think of at least five times during my career when the RBOCs like AT&T tried to put in reciprocal charges for this traffic, meaning that both parties would pay each other the same rate for terminating local calls from the other. That sounds okay until you recall that AT&T basically serves all of the metro areas in the country while smaller telcos serve the rural areas. Still today there is a lot more calling from rural areas into metros than in the other direction, and if such a change were made the rural companies would be sending big checks to the RBOCs for ‘free’ calls.

And the RBOCs have tried to do similar things to competitive carriers with interconnection. The FCC’s interconnection rules say that a competitive carrier can choose to interconnect with a larger company at ‘any technically feasible point’, and yet every few years the RBOCs try to change interconnection agreements to force carriers to carry the traffic to the RBOC hubs. Again, this is a matter of money and the RBOCs want the competitive carriers to pay for everything.

Changing to an all-IP network is likely to open up the same battles. Rather than maintaining today’s system of many tandem offices in a state, it is not impossible that the RBOCs will have only one hub in each state, or even one hub for a region of many states. And if they make that kind of change you can expect that they will then expect competitive carriers to pay to carry all of their traffic to and from such hubs. I can tell you that such a change would devastate the business plan of many competitive carriers and would greatly reduce competition in the country.

The FCC has to be diligent in making the changes to IP. Everybody agrees that the technological change needs to be made. It’s more efficient. But we can’t let a technology change be grounds for a land-grab by AT&T and Verizon in an attempt to quash competition. They will, of course, claim that they are not trying to do that, but during my 35-year career I have seen them try exactly that kind of change a whole lot of times. And there is no doubt in my mind they will try to do it again.

The (Business) Field of Dreams


Today’s guest blog is written by Mindy Jeffries of Stealth Marketing. She will be writing a series of blogs that will appear here occasionally. If you want to contact Mindy you can call her at (314) 880-5570. Tell her you saw her here!

I love the St. Louis Cardinals and they are going to the World Series!  That’s great news for a lot of us in the central United States, especially for our office just down the road from Busch Stadium.

All this World Series hype has me asking: Why am I such a fan? What is it about baseball in October that makes it so different? I realized one of the major reasons is the Cardinals’ approach to teamwork.

If you look up the definition of teamwork it’s this: the combined action of a group of people, esp. when effective and efficient.

I think a lot of people give lip service to the word “teamwork”, but few achieve it. If you Google “How can I get my team to work more efficiently and effectively together”, this is what you get:

  • Pick the right people
  • Understand their strengths and weaknesses
  • Align goals
  • Don’t allow anyone in that is subversive
  • And get everyone to communicate

But if you look at the Cardinals it’s so much more than that.  Here are some of my favorite examples of how they work as a team:

  • Trust – Not only do they trust that everyone will do their job and that no one will fail, they back each other up to make sure no one fails.
    • Example: The opposing team hits a fly ball to center field and the pitcher begins his walk to the dugout.  He knows Jon Jay will make the catch and if he doesn’t, Holliday will cover from left.
    • Business Translation:  Make sure everyone has a back up and is cross trained.  We have partnerships at Stealth, with everyone double checking everyone else’s work.
  • Mentoring- Senior Birds mentor the younger players.
    • Example:  Senior players are matched with more junior players to teach them the ropes, invest in their success.
    • Business translation:  Match new employees to more senior players for learning and sheep dogging.  Works every time.
  • In it for the team!
    • Example:  Teamwork trumps high payrolls as evidenced in the Cardinals /Dodgers series.
    • Business translation: Teamwork makes your junior players play like high-profile business executives—without the grandstanding.    Everyone is in it for the team.
  • Keeping Score and Analyzing
    • Baseball is a very statistical game, with the flurry of numbers affecting the most important thing—the score.  But baseball is a slow game with plenty of time to reflect and analyze what just happened.  Somebody once told me that baseball is a cerebral sport because of all the statistics, to which my football cronies say, “That’s (insert your favorite expletive) incredulous!”
    • Business translation:  Take time to reflect.  Think about what’s going well so you can do more of it. More importantly, be honest with yourself about what’s not going so well and start moving the curve in a more positive direction.

That’s why I love baseball and especially my beloved St. Louis Cardinals.  They have the teamwork I idolize and seek to emulate as a business owner.

The Internet of Things Today


I’ve written a number of blogs about the future potential of the Internet of Things. But there are a number of devices on the market now that make the first steps of the IoT a reality today. I look at these devices, and the approaches they are taking to the world of connected things, as the precursor to where we are headed over the next few decades.

SmartThings. SmartThings sells a box of smart sensors that you can use today for functions like home security and energy management. But they also provide a developer kit for those who want to program custom applications, and there is a huge range of possible functions for the system. One would think that custom apps will soon begin appearing on the web, allowing you to do cool things without having to be a coder.

MobiPlug. MobiPlug is a start-up that promises to integrate all of your wireless devices regardless of the platform or wireless protocol used. Its most distinctive feature is that it makes your smartphone the hub device. Most other platforms supply a separate box as a hub, and I am picturing a closet full of old hubs one day, in the same way that I gathered a pile of WiFi routers. Most IoT systems already let your smartphone control settings, so why not just make it the hub too?

FitBit. By now you have probably seen your Facebook friends with the annoying posts showing how fast and where they ran today, brought to you by FitBit. But FitBit has its sights set on becoming a lot more than a training aid and monitor; they are hoping to evolve their system into everything fitness- and health-related in your life. FitBit is already storing data on you that can become the future basis for a health monitoring system.

AllJoyn. AllJoyn is not a device, but rather a software platform being created by Qualcomm. They are taking a very different approach to the IoT, developing a platform that works independently of the Internet. This has some basic merit, in that many of the other platforms store at least some of their central core in the cloud and would be non-functional during an Internet outage. But it is also a bold step in separating our IoT data from the general Internet for privacy reasons. Do you really want your medical monitor data or security system to be hackable?

Evrythng. This company is looking at a totally different aspect of the IoT: how you interact with your devices and with the outside world. Evrythng is a software platform that lets you interface more dynamically with your IoT devices in a Facebook-type platform. However, one aspect of this system is that your devices can ‘suggest’ additional purchases to you. This platform brings advertising into your life, and your smart fridge might suggest what you should purchase to complete a recipe with what you already have stored inside.

Electric Imp. And let us not forget the geeks among us who want a fully customized IoT. Electric Imp has developed an SD card-sized WiFi node that can be used with any device, and a user can program it to do anything they wish. The cards are swappable because the programming is stored in the cloud. Think of this as the never-ending coding party that lets you program your toaster to perform amazing feats.

Freescale. This is still under development, but Freescale is looking at swallowable monitors for inside the body. Nobody is entirely sure yet just what these will monitor, but the guess is that they will be partnered with another system such as FitBit as additional health monitors. Probably one of the most promising long-term uses of the IoT is in-blood monitors that head off sickness at the first signs of an infection and stop pre-cancerous cells before they get started. This technology has to start somewhere, and hopefully this is the first step.

Europe Has the Right Goals

The European Commission issued a press release yesterday that announced that 100% of the households in Europe now have access to broadband.

Most households have some sort of wired access with 96.1% of homes having access to copper, coax or fiber. Wireless coverage with 2G, 3G or 4G covers 99.4% of houses. And all remote homes are now covered by satellite broadband using a network of 148 satellites.

Before anybody argues that we have the same thing here in the US due to satellite, we need to distinguish between the satellite broadband that is available here and what is available in Europe. Basic satellite service in Europe is only $13 per month. I can’t find the speed for that product, but I assume it delivers a few Mbps of download speed. But customers can get a 20 Mbps download from satellite for $33 per month.

In the US there are two major satellite providers:

  • ViaSat Exede offers a 12 Mbps download service, with the price based on the usage cap you choose: $50 per month for 10 GB, $80 for 15 GB and $130 for 25 GB.
  • Hughesnet offers 5 Mbps down / 1 Mbps up for $50 per month, 10 Mbps / 1 Mbps for $60, 10 Mbps / 2 Mbps for $80 and 15 Mbps / 2 Mbps for $130. The four Hughesnet products carry data caps of 10 GB, 20 GB, 30 GB and 40 GB respectively.

Speed isn’t everything and the caps matter. Just to put those data caps into perspective, a 2-hour HD movie will range between 3 and 4.5 GB. So homes in the US using satellite are very limited in using their satellite connection to view video.
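Using the ViaSat Exede caps as an example and a 4 GB movie as a middle estimate, the math is stark (a quick sketch; actual movie sizes vary with length and encoding):

```python
# Movies per month under each satellite data cap.
caps_gb = [10, 15, 25]   # ViaSat Exede plan caps
movie_gb = 4             # middle of the 3-4.5 GB range for a 2-hour HD movie

for cap in caps_gb:
    print(f"{cap} GB cap: about {cap // movie_gb} movies per month")
```

And that assumes the connection is used for nothing else: no email, no web browsing, no software updates.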

The US satellite companies are also limited since they only have a few satellites capable of delivering the above products. If those satellites get oversubscribed then actual speeds will be slower than advertised in the same way that a cable modem system can bog down in the evening hours. But with more satellites in Europe the speeds can be faster and there is a lot less chance of congestion and oversubscription.

The Europeans also have goals to speed up Internet access. They have the goal by 2020 of getting all citizens the ability to have 30 Mbps download speeds, with at least half of them having access to 100 Mbps.

This is pretty easy to contrast with the US, where the current national definition of terrestrial broadband is 4 Mbps down and 1 Mbps up. Both stimulus grants and borrowing from the RUS have recently financed networks that are able to deliver only those speeds.

If we don’t set high goals in the US, and if we are content to finance rural broadband that delivers slow speeds when it is brand new, we are relegating the rural areas to having slow broadband for decades to come.

In the US we are more given to grand announcements that don’t come with any funding or mandates. For example, earlier this year the FCC set a goal of having a Gigabit City in every state of the country. That means a network that is capable of delivering a gigabit of download speeds to customers.

Don’t get me wrong, I would love to live in one of those few places where you can get a gigabit. But this is a completely voluntary system, and a Gigabit City might only need to sell that much speed to a few customers to be given the designation. Rather than trying to get one city in each state to provide a few customers with a gigabit download, we ought to concentrate on making our basic broadband a lot faster than 4 Mbps. When that lowly speed is our national goal, we are telling rural America not to expect anything better.

The Europeans have it right and we have it wrong. And a decade from now, when we are far behind them in productivity, we can look back on the crappy national goals we set for ourselves.

Hello Siri . . .


Gartner, a leading research firm, issued a list of the ten top strategic technology trends for 2014. By strategic they mean developments that are getting a lot of attention and development work in the industry, not necessarily ones that will come to full fruition in 2014. One of the items on the list was ‘smart machines’, a category that includes self-driving cars, smart advisors like IBM’s Watson, and advanced global industrial systems, meaning automated factories.

But I want to look at the other item on their list, which is contextually aware intelligent personal assistants. This will essentially be Apple’s Siri on steroids. At first this is expected to run mostly on cell phones and other mobile devices. Eventually one would think it will migrate toward something like Google Glass, a smartphone, a bracelet or some other way to have it always with you.

Probably the key word in that phrase is contextual. To be useful, a personal assistant has to learn and understand the way its owner talks and lives in order to become completely personalized to them. Being contextual means the current Siri needs to grow to learn things by observation. To be the life-changing assistant envisioned by Gartner is going to require software that can anticipate what you want. For example, as you are talking to someone, your assistant ought to be able to pick out of the conversation the bits and pieces you will want it to remember. If somebody mentions their favorite restaurant or favorite beer, you would want your assistant to remember that without being told to do so.

Both Apple’s and Microsoft’s current personal assistants have already taken the first big step in this process in that they can converse to some degree in conversational language. Compare what today’s assistants can already do to Google’s search engine, which makes you type in awkward phrases. Any assistant is going to have to be completely fluent in a person’s language.

One can easily envision a personal assistant that helps you learn when you are young and then sticks with you for life. Such an assistant will literally become the most important ‘person’ in somebody’s life. An effective assistant can free a person from many of the mundane tasks of life. You will never get lost, have to make an appointment, remember somebody’s birthday or do many of the routine things that are part of life today. But it still won’t take out the trash, although it can have your house-bot do that.

In the future you can envision this assistant tied into the Internet of Things, so that it becomes the one device you give orders to. It would then translate those orders and talk to all of your other systems: your smart house, your self-driving car, the system that is monitoring your health, and so on.
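That hub-and-spoke idea can be sketched in a few lines of code. This is purely an illustration, not a real system: every class name and device name below is invented, and the crude keyword matching stands in for the genuine language understanding such an assistant would need.

```python
# Illustrative sketch of a personal assistant as the single command hub
# for a home full of connected devices. All names here are invented for
# illustration; no real IoT API or product is implied.

class Device:
    """A stand-in for any connected system (house, car, health monitor)."""
    def __init__(self, name: str):
        self.name = name

    def handle(self, command: str) -> str:
        return f"{self.name}: executing '{command}'"

class Assistant:
    """Routes a spoken order to whichever device should carry it out."""
    def __init__(self):
        self.devices = {}

    def register(self, keyword: str, device: Device):
        self.devices[keyword] = device

    def tell(self, order: str) -> str:
        # Naive keyword routing in place of real language understanding.
        for keyword, device in self.devices.items():
            if keyword in order.lower():
                return device.handle(order)
        return "assistant: no device found for that order"

assistant = Assistant()
assistant.register("thermostat", Device("smart house"))
assistant.register("car", Device("self-driving car"))
print(assistant.tell("Warm up the car for my 8am meeting"))
```

The point of the sketch is the architecture: you talk to one assistant, and it decides which of your systems the order is really for.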

The biggest issue with this kind of personal assistant is going to be privacy. A true life-assistant is going to know every good and bad thing about you, including your health problems and every one of your ugly bad habits. It is going to be essential that this kind of system stay completely private and be somehow immune to hacking. Nobody can trust an assistant in their life that others can hack or peer into.

One might think that this is something on the distant horizon, but there are many industry experts who think this is probably the first thing on the smart machine list that will come to pass, and that there will be pretty decent versions of this within the next decade. Siri is already a great first step, although often completely maddening. But as this kind of software improves it is not hard to picture this becoming something that you can’t live without. It will be a big transition for older people, but our children will take to this intuitively.