What is the difference between mobility and portability? 

From 1970 to 2010 we saw computing power move from large, expensive back-office infrastructure to smaller, less expensive consumer devices that we all use every day.  This made the content locked in our back-office systems (e.g. medical records) available on progressively smaller screens.  This was the era of information portability.

Since 2010, we have entered the era of true mobility, where the mobile device is complemented by external data sources that enrich a process or user experience.

Wearables, Nearables, Hearables & Invisibles 

An example of the first category is a ‘Wearable’ device that tracks our heart rate and exercise and sends that information to a fitness app.

We call the second category “Nearables”: the collection of beacons, RFID, NFC and other proximity devices that provide very precise data in specific situations.  An example might be a prompt to a nurse to scrub up as she walks through the hallway into theatre.

Nearables will be around us in many ways, both in our work lives and in our homes.  In our work lives, these devices will be used to detect our presence, simplify log-in procedures, capture inventory details, provide reminders and prompt specific actions based on what we are doing.   
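A nearables workflow like the ones above is, at its core, a set of rules that map a detected presence to a context-specific action. The sketch below illustrates the idea in Python; the zones, roles and prompts are invented for illustration, and a real deployment would receive sightings from a BLE, RFID or NFC SDK rather than plain function calls.

```python
# Hypothetical sketch of a nearables trigger: map a beacon sighting
# (which zone, which person's role) to a prompt for that situation.
# Zone names, roles and prompts are illustrative assumptions only.

# Rules: (zone, role) -> prompt to deliver when presence is detected.
# A role of "any" matches everyone in that zone.
PROMPT_RULES = {
    ("theatre_entrance", "nurse"): "Please scrub up before entering theatre.",
    ("stock_room", "any"): "Scan shelf tags to update inventory.",
}

def on_beacon_sighting(zone, role):
    """Return the prompt for this zone/role, or None if no rule matches."""
    for (rule_zone, rule_role), prompt in PROMPT_RULES.items():
        if rule_zone == zone and rule_role in (role, "any"):
            return prompt
    return None
```

The design choice here is that the device itself stays dumb: it only reports presence, and the rules live in software where they can be updated centrally.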

In our private lives, these little devices will be in our household goods, our sports equipment, our vehicles and perhaps even in our clothing.  They will be embedded in almost every product we buy, silently monitoring how those products are used (with consent, of course) and notifying us, and maybe the manufacturer, when there is a fault or a software update is required.  Collectively, they will help with comfort, safety and personal well-being.

An exciting new category is emerging that we now call “Hearables”.  Apple’s AirPods are the best example, and together with other wireless in-ear devices, we believe this category has enormous potential for a highly personal mobility experience.  We expect the evolution of this category to draw inspiration from both the music and healthcare industries.

For years, professional musicians have received a highly curated feed to their in-ear monitors, providing layers of information ranging from simple click-tracks to subtle prompts on when to change key.  Add a few more miniature sensors to an in-ear device and it will soon be able to monitor heart rate, blood pressure, body temperature, oxygen saturation and possibly even blood sugar.  Integrate this with your favourite health app and soon you will have notifications and advice whispered in your ear with a level of personalisation that was previously unimaginable.
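The "whispered advice" idea amounts to comparing sensor readings against expected ranges and turning any outliers into spoken messages. A minimal sketch in Python, with the caveat that the sensor names and thresholds below are illustrative assumptions, not clinical guidance:

```python
# Hypothetical sketch: turn in-ear sensor readings into notification
# messages a health app could whisper to the wearer. The "normal"
# ranges below are invented for illustration, not medical advice.

NORMAL_RANGES = {
    "heart rate": (50, 100),       # beats per minute
    "body temperature": (36.1, 37.8),  # degrees Celsius
    "oxygen saturation": (95, 100),    # percent
}

def vitals_alerts(readings):
    """Return one message per reading that falls outside its range."""
    alerts = []
    for vital, value in readings.items():
        low, high = NORMAL_RANGES[vital]
        if value < low:
            alerts.append(f"Your {vital} is low: {value}")
        elif value > high:
            alerts.append(f"Your {vital} is high: {value}")
    return alerts
```

In practice the personalisation would come from learning each wearer's own baseline rather than using fixed population ranges.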

We predict that the number of these devices will grow exponentially, and that the cost of connected sensors will decline to approximately $1 per device by 2020.

Collectively, we refer to these sensors as “Invisibles”: they will be so tiny and inexpensive that they simply become part of the product, part of the mobile experience and part of us as we become more mobile.

Context makes content relevant and timely

All these external data sources provide context that makes content relevant and timely.  Innovation flourishes when context meets content, and this is what we have seen happening with the explosion of apps and digital services over the past few years.

We are living through the final phase of this 50-year journey from portability to mobility.  It is exciting to see the rate of change, the explosion in smart gadgets and the innovative apps and services that people can now create at the intersection of content and context.