Healthy mobility

Mobility is an interesting term. Here in the UK, I've grown up seeing mobility as something to do with getting old and grey, when you need mobility aids around the home, or even a mobility scooter. Which is why I was curious about Audi (who make cars) hosting an innovation summit at their global headquarters in Germany to explore the Mobility Quotient. I'd never even heard of that term before. The fact that the opening keynotes were set to be given by Audi's CEO, Rupert Stadler, and Steve Wozniak (who co-founded Apple) made me think that this would be an unusual event. I applied for a ticket, was accepted, and what follows are my thoughts on the event, which took place a few weeks ago. In this post, I'll be looking at the summit through the lens of what it might mean for our health. 

[Disclosure: I have no commercial ties with the individuals or organisations mentioned in this post]

It turns out that 400 people attended, from 15 countries. This was the first time that Audi had hosted this type of event, and I didn't know what to expect from it, and neither did any of the attendees I talked to on the shuttle bus from the airport. I think that's fun, because everyone I met during the 2 days seemed to be there purely out of curiosity. If you want another perspective on the entire 2 days, I recommend Yannick Willemin's post. A fellow attendee, he was one of the first people I met at the event. There is one small thing that spoiled the event for me: the 15 minute breaks between sessions were too short. I appreciate that every conference organiser wants to squeeze lots of content in, but the magic at these events happens in between the sessions, when your mind has been stimulated by a speaker and you have conversations that open new doors in your life. It's a problem that afflicts virtually every conference I attend. I wish organisers would schedule less content and longer breaks. 

On Day 1, there were external speakers from around the world, getting us to think about social, spatial, temporal and sustainable mobility. Rupert Stadler made a big impression on me with his vision of the future, as he cited technologies such as Artificial Intelligence (AI) and the Internet of Things (IoT) and how they might enable this very different future. He also mentioned how he believes the car of the future will change its role in our lives, maybe being a secretary, a butler, a courier, or even an empathic companion in our day. And throughout, we were asked to think deeply about how mobility could be measured, and what we will do with the 25th hour, the extra time gained because machines will eventually turn the drivers of today into the passengers of tomorrow. He spoke of a future where cars will be online and connected to each other, sharing data to reduce traffic jams and more. He urged us to never stop questioning. Steve Wozniak described the mobility quotient as "a level of freedom, you can be anywhere, anytime, but it also means freedom, like not having cords." 

We heard about Hyperloop transportation technologies cutting down on travel time between places, and then about the different things we might do in an autonomous vehicle, which briefly cited 'healthcare' as one option. Sacha Vrazic, who spoke about his work on self-driving cars, gave a great hype-free talk and highlighted just how far away we are from the utopia of cars that drive themselves. We heard about technology, happiness and temporal mobility. It was such a diverse mix of topics. For example, we heard from Anna Nixon, who is just 17 years old, is already making a name for herself in robotics, and inspired us to think differently. 

What's weird, but in a good way, is that Audi, a car firm, was hosting a conversation about access to education and improving social mobility. I found it wonderful to see Fatima Bhutto, a journalist from Pakistan, give one of the closing keynotes on Day 2, where she reminded us of the challenges with respect to human rights and access to sanitation for many living in poorer countries, and how advances in mobility might address these challenges. It was surprising because Audi sells premium vehicles, and it made me think that mobility isn't just about selling more premium vehicles. What's clear is that Audi (like many large organisations) is trying to figure out how to stay relevant in our lives during this century. Instead of being able to sell more cars in the future, maybe they will be selling us mobility solutions & services which may not even always involve a car. Perhaps they will end up becoming a software company that licenses the algorithms used by autonomous vehicles in decades to come? It reminds me of the pharmaceutical industry wanting to move to a world of 'beyond the pill' by adapting their strategy to offer new products and services, enabled by new technologies. When you're faced with having to rip up the business model that allowed your organisation to survive the 20th century, and develop one that will maximise your chances of longevity in the 21st, it's a scary but also exciting place to be. 

On Day 2, attendees were able to choose 3 of the 12 workspaces where we could discuss how to make an impact on each of the 4 types of mobility. I chose these 3 workspaces:

  • Spatial mobility - which obstacles are still in the way of autonomous driving?

  • Social mobility - what makes me trust my digital assistant?

  • Sustainable mobility - what will future mobility ecosystems look like? 

The first workspace made me realise the range of challenges facing autonomous cars: legal, technical, cultural and infrastructure challenges. We had to discuss and think about topics that I rarely think about when just reading news articles on autonomous cars. The fact that attendees were from a range of backgrounds made the conversations really stimulating. None of that 'groupthink' that I encounter at so many 'innovation' events these days, which was so refreshing. BTW, Audi's new A8 is the first production vehicle with Level 3 automation, and the feature is called Traffic Jam Pilot. Subject to legal regulations, on selected roads, the driver would be able to take their hands off the wheel and do something else, like watch a video. The car would be able to drive itself. However, the driver would have to be ready to take back control of the car at any time, should conditions change. I found two very interesting real world tests of the technology here and here. Also, isn't it fascinating that a survey found only 26% of Germans would want to ride in autonomous cars? What about a self-driving wheelchair in a hospital or an airport? Sounds like science fiction, but they are being tested in Singapore and Japan. Today, few of us will be able to access these technologies because they are only available to those with very deep pockets. However, this will change. Just look at airbags, introduced as an option by Mercedes-Benz on their flagship S-Class in 1981. Now, 36 years later, even the smallest of cars often comes fitted with multiple airbags. 

In the second workspace, with other attendees, I formed a team, and our challenge was to discuss transparency around the collection and use of personal data by a digital assistant in the car of the future, almost like a virtual co-driver. Our team had a Google Home device to get us thinking about the personal data that Google collects, and we had to pitch our ideas at the end of the workspace on how we envisaged getting drivers and passengers to trust these digital assistants in the car. How could Audi make it possible for consumers to control how their personal data is used? It's encouraging to see a large corporate like Audi thinking this way. Furthermore, given that these digital assistants may one day be able to recognise our emotional state and respond accordingly, how would you feel if the assistant in your car noticed you were feeling angry, and instead of letting you start the engine, asked if you wanted to have a quick psychotherapy session with a chatbot to help you deal with the anger? Earlier this year, I tested Alexa vs Siri in my car with mixed results. You can see my 360 video below. 
 

In the third workspace on sustainable mobility, we had to choose one of 3 cities (Beijing, Mumbai and San Francisco) and come up with new ideas to address challenges in sustainable mobility given each city's unique traits. This session was truly mind expanding, as I joked about the increasing levels of congestion in Mumbai, and how maybe they need flying cars. It turned out that one of the attendees sitting next to me was working on urban vehicles that can fly! None of the discussions and pitches in the workspaces were full of easy answers, but they did remind me of the power of bringing together people who don't normally work together to come up with fresh ideas for very complex challenges. Furthermore, these new solutions we generate can't just be for the privileged few; we have to think globally from the beginning. It's our shared responsibility to find a way of including everyone on this new journey. Maybe instead of owning, leasing or even renting a car the traditional way, we'd like to be able to rent a car by the hour using an app on our phones? In fact, Audi have trialled on-demand car hire in San Francisco, just launched it in China and plan to launch in other countries too, perhaps even providing you with a chauffeur. Only time will tell if they succeed, as others have already tried and not been that successful. 

Taking part in this summit was very useful for me; I left feeling challenged, inspired and motivated. There was an energy during the event that I rarely see in Europe. I experienced a feeling that I only tend to get when I'm out in California, where people attending events are so open to new ideas and fresh thinking that you walk away feeling that you truly can build a better tomorrow. My brain has been buzzing with new ideas since then. 

For example, whether we believe that consumers will have access to autonomous vehicles in 5 years or 50 years, we can see more funds being invested in this area. I was watching a documentary in which Sebastian Thrun, who lost his best friend in a car accident at the age of 18 and helped build Google's driverless car, says he believes that a world with driverless vehicles will save the lives of the 1 million people who currently die on the roads every year around the globe. Think about that for a moment. If that vision is realised this century, even partially, what does that mean for the resources in healthcare that are currently spent on dealing with road traffic accidents? He has now turned his attention to flying cars.

Thinking about chronic disease for a second, would you laugh at the thought of a car that could monitor your health during your commute to the office?

Audi outlined a concept called Audi Fit Driver in 2016, describing it as follows: "The Audi Fit Driver project focuses on the well-being and health of the driver. A wearable (fitness wristband or smartwatch) monitors important vital parameters such as heart rate and skin temperature. Vehicle sensors supplement this data with information on driving style, breathing rate and relevant environmental data such as weather or traffic conditions. The current state of the driver, such as elevated stress or fatigue, is deduced from the collected data. As a result, various vehicle systems act to relax, vitalize, or even protect the driver."

Another car manufacturer, Toyota, has filed a patent suggesting a future where the car would know your health and fitness goals and offer suggestions to help you meet them, such as parking further away from your planned destination so you can get some more steps in towards your daily goal. My friend, Bart Collet, has penned his thoughts about "healthcartech", which makes for a useful read. One year ago, I also made a 360 video with Dr Keith Grimes discussing whether cars in the future will track our health. 

Consider how employers may be interested in tracking the health of employees who drive as part of their job. However, it's not plain sailing. A European Union advisory panel recently said that "Employers should be banned from issuing workers with wearable fitness monitors, such as Fitbit, or other health tracking devices, even with the employees’ permission." So at least in Europe, who knows if we'll ever be allowed to have cars that can monitor our health? On top of that, in this bold new era, in order for these new connected services to really provide value, all the different organisations collecting data will have to find a way to share it. Does blockchain technology have a role to play in mobility? I recently came across Dovu, which talks about the world's first mobility cryptocurrency: "Imagine seamless payment across mobility services: one secure global token for riding a bus or train, renting a bike or car or even enabling you to share your own vehicle or vehicle data." Sounds like an interesting idea. 

Thinking about some of the driver assist technologies available today, what do they mean for mobility? Could they help older people remain at the helm of a car even if their reflexes have slowed down? In Japan, the National Police Agency "calls on the government to create a new driver’s license that limits seniors to vehicles with advanced safety systems that can automatically brake or mitigate unintended accelerations." Apparently, one of the most common accidents in Japan happens when drivers mistake the accelerator for the brake pedal. Today, some new cars come with Autonomous Emergency Braking (AEB), where the car's sensors detect if you are about to hit another vehicle or a pedestrian and perform an emergency stop if the car detects that the driver is not braking quickly enough. So by relinquishing more control to the car, we can have safer roads. My own car has AEB, and on one occasion when I faced multiple hazards on the road ahead, it actually took over the braking, as the sensors thought I wasn't going to stop in time. It was a very strange feeling. Many seem to react with extreme fear when hearing about these new driver assist technologies, yet if you currently drive a car with an automatic transmission or airbags, you are perfectly happy to let the car decide when to change gears or when to inflate the airbag. So on the spectrum of control, we already let our cars make decisions for us. As they get smarter, they will be making more and more decisions for us. If someone over 65 doesn't feel like driving even if the car can step in, then maybe autonomous shuttles like the ones being tested in rural areas in Japan are one solution to increasing the mobility of an ageing community.

When we pause to think of how big a problem isolation and loneliness are in our communities, could these new products and services go beyond being simply a mobility solution and actually reduce loneliness? That could have far reaching implications for our health. What if new technology could help those with limited mobility cross the road safely at traffic lights? It's fascinating to read the latest guidance consultation from the UK's National Institute for Health and Care Excellence on the topic of Physical Activity and the Environment. Amongst many items, it mentions modifying traffic lights so those with limited mobility can cross the road safely. Simply extending the default red phase for vehicles by a few extra seconds to make this possible might just cause more traffic jams. So in a more connected future, imagine traffic lights with AI that can detect who is waiting to cross the road, work out whether they will need an extended crossing time, and adjust the duration of the red light for vehicles accordingly. This was one of the ideas I brought up at the conference during the autonomous vehicle workspace.
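To make the idea concrete, here's a minimal sketch in Python, with entirely made-up timings and a hypothetical pedestrian detector, of how a signal controller might grant a longer crossing phase only when someone who needs extra time is actually detected, rather than lengthening every cycle by default:

```python
# Hypothetical sketch: extend the pedestrian crossing phase only when a
# detector reports someone who needs extra time. All numbers are illustrative.

BASE_CROSSING_SECONDS = 8        # standard crossing duration (illustrative)
EXTENDED_CROSSING_SECONDS = 16   # granted when someone needs extra time
WALKING_SPEED_THRESHOLD = 0.8    # metres/second; below this, grant extra time


def crossing_duration(detected_walking_speeds):
    """Return the crossing time for this cycle.

    `detected_walking_speeds` is a list of estimated walking speeds (m/s)
    supplied by a camera or kerbside sensor - a hypothetical input.
    """
    if not detected_walking_speeds:
        return BASE_CROSSING_SECONDS
    if min(detected_walking_speeds) < WALKING_SPEED_THRESHOLD:
        return EXTENDED_CROSSING_SECONDS
    return BASE_CROSSING_SECONDS


# Example: one pedestrian walking at 0.6 m/s triggers the longer phase.
print(crossing_duration([1.3, 0.6]))  # -> 16
print(crossing_duration([1.4]))       # -> 8
```

The point of the sketch is simply that the extension only happens when it's needed, so the average impact on traffic flow stays small.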

If more people in cities use ride hailing services like Uber and fewer people own a car, does this mean our streets will have fewer parked cars, allowing residents to reclaim the streets for themselves? If this shift continues, in the long term it might lead to city dwellers of all ages becoming more physically active. This could be good news for improving our health and reducing demand on healthcare systems. One thing is clear to me: these new mobility solutions will require many different groups across society to collaborate. It can't just be a few car manufacturers rolling out technology without involving other stakeholders; these solutions need to be available to all and work in an integrated manner. The consumer will be king though, according to views aired at New Mobility World in Germany this week: “With his smartphone, he can pick the optimal way to get from A to B,” said Randolph Wörl from moovel. “Does optimal mean the shortest way, the cheapest way or the most comfortable way? It’s the user’s choice.” It's early days, but we already have a part of the NHS in the UK looking to use Uber to transfer patients to/from hospital. 

Urban mobility isn't just about cars, it's also about bicycles. I use the Santander bike sharing scheme in London on a daily basis, which I find to be an extremely valuable service. I don't want to own a bicycle since in my small home, I don't really have room to store it. Additionally, I don't want the hassle of maintaining a bike. Using this bike sharing scheme has helped me to lose 15kg this summer, which I feel has improved my own health and wellbeing. If we really want to think about health, rather than just about healthcare, it's critical we think beyond those traditional institutions that we associate with health, and include others. Incidentally, Chinese bike sharing firms are now entering the London market.

In the UK, some have called for cycling to be 'prescribed' to the population, helping people to stay healthier and, again, to reduce demand on the healthcare system. Which is why I find it interesting that Ford of Germany is getting involved with a new bike sharing scheme. Through the app, people will be able to use Ford's car sharing and bike sharing schemes. It's an example of Mobility as a Service, and of another car manufacturer seeking a path to staying relevant during this century. Nissan of Japan are excitedly talking about Intelligent Mobility for their new Nissan Leaf, including Intelligent Driving, where "Soon, you can have a car that takes the stress out of driving and leaves only the joy. It can pick you up, navigate heavy traffic, and find parking all on its own." A Chinese electric car startup, Future Mobility Corp, who have launched their Byton brand, have said their "models are a combination of three things: a smart internet communicator, a spacious luxury living room and a fully electric car." Interestingly, they also want to "turn driving into living." I wonder if in 10-15 years' time we'll spend more time in cars because the experience will be a more connected one. Where will meetings take place in future? Ever used Skype for Business from work or home to join an online meeting? BMW & Microsoft are working to bring that capability to some of BMW's vehicles. Samsung have announced they are setting up a £300m investment fund focusing on connected technologies for cars. It appears that considerable sums of money are being invested in this new arena of connected cars that fit into our digital lifestyles. Are the right people spending the right money on the right things? 

I feel that those developing products which involve AI are often so wrapped up in their vision that it comes across as if they don't care what the social impact of their ideas will be. In an article about Vivek Wadhwa's book, The Driver in the Driverless Car, the journalist points out that the book talks about the possibility of up to 5m American jobs in trucking, delivery driving, taxis and related activities being lost, but no suggestions are mentioned for handling the social implications of this shift. Toby Walsh, a professor of AI, believes that Elon Musk, founder of Tesla, is scaremongering when tweeting about AI starting World War 3. He says, "So, Elon, stop worrying about World War III and start worrying about what Tesla’s autonomous cars will do to the livelihood of taxi drivers." Personally, I think we need some more balance and perspective in this conversation. The last thing we need is a widening of social inequalities. How fascinating to read that India is considering banning self-driving cars in order to protect jobs. 

This summit has really made me think hard about mobility and health. Perhaps car manufacturers will end up being part of solutions that bring significant improvements in our health in years to come? We have to keep an open mind about what might be possible. Maybe it's because I'm fit and reasonably healthy, live in a well connected city like London and can afford a car of my own that I never really thought about the impact of impaired mobility on our health. In the Transport Research Laboratory's latest Quarterly Research Review, I noticed a focus on mental health and ageing drivers, and it's clear they want transport planners to put health and wellbeing higher up the priority list, with the statement, "With transport evolving, it’s vital that we don’t lose sight of the implications it can have on the health of the population, and strive to create a network that encourages healthy mobility.” At a minimum, mobility might just mean being able to walk somewhere in your locality, but what if you don't feel safe walking in your neighbourhood due to high rates of crime? Or what if you can't walk because there is literally nowhere to walk? I remember visiting Atlanta in the USA several years ago, and I took a walk from a friend's house in the suburbs. A few minutes into my walk, the sidewalk just ended, just like that, with no warning. The only way I could have walked further would have been to walk through a car dealership. Ironic. The push towards electrification of vehicles is interesting to witness, with Scotland wanting to phase out sales of new petrol and diesel cars by 2032. India is even more ambitious, hoping to move towards electric vehicles by 2030. The pollution in London is so high that I avoid walking down certain roads because I don't want to breathe in those fumes. So a future with zero emission electric cars gives me hope. 

It's obvious that we can't just think about health as building bigger hospitals and hiring more doctors. If we really want societies where we can prevent more people from living with chronic diseases like heart disease and diabetes, we have to design with health in mind from the beginning. There is an experiment in the UK looking to build 10 Healthy New Towns. Something to keep an eye on.


The technology that will underpin this new era of connectivity seems to be the easy part. The hard part is getting people, policy and process to connect and move at the same pace as the technology, or at least not lag too far behind. During one of my recent sunrise bike rides in London, I came across a phone box. I remember using them as a teenager, before the introduction of mobile phones. At the time, I never imagined a future where we didn't have to locate a box on the street, walk inside, insert coins and press buttons in order to make a call whilst 'mobile'. In such a short space of time, everything has changed in terms of how we communicate and connect. These phone boxes scattered around London remind me that change is constant, and that even though many of us struggle to imagine a future that's radically different from today, there is every chance that healthy mobility in 20 years' time will look very different from today's.

Who should be driving our quest for healthy mobility? Do we rest our hopes on car manufacturers collaborating with technology companies? As cities grow, how do we want our cities to be shaped?

What's your definition of The Mobility Quotient?


Managing our health: One conversation at a time

If you've watched movies like Iron Man, featuring virtual assistants like JARVIS that you can simply have a conversation with to control your home, you probably think that such virtual assistants belong in the realm of science fiction. Earlier this year, Mark Zuckerberg, who runs Facebook, set a personal challenge to create a JARVIS-style assistant for his own home: "My personal challenge for 2016 is to build a simple AI to run my home and help me with my work. You can think of it kind of like Jarvis in Iron Man." He may be closer to his goal, as he may be giving a demo of it this month. For those who don't have an army of engineers to help them, what can be done today? Well, one interesting piece of technology is Amazon's Echo. So what is it? Amazon describes it as, "Amazon Echo is a hands-free speaker you control with your voice. Echo connects to the Alexa Voice Service (AVS) to play music, provide information, news, sports scores, weather, and more—instantly. All you have to do is ask." Designed for your home, it is plugged into the mains and connected to your wifi. It's been on sale to the general public in the USA since last summer, and was originally available in 2014 for select customers.

This week, it's just been launched in the UK and Germany as well. However, I bought one from America 6 months ago, and I've been using it here in the UK every day since then. My US-spec Echo does work here in the UK, although some of the features don't work, since they were designed for the US market. I've also got the other devices that are powered by AVS: the Amazon Tap, the Dot, and also the Triby, which was the first 3rd party device to use AVS. To clarify, the Echo is the largest, has a full size speaker and is the most expensive from Amazon ($179.99 US/£149.99 UK/179.99 Euros). The Tap is cheaper ($129.99, only in the USA) and is battery powered, so you can charge it and take it to the beach with you, but it requires that you push a button to speak to Alexa; it's not always listening like the other products. The Dot is even cheaper (now $49.99 US/£49.99 UK/59.99 Euros) and does everything the original Echo can do, except the built-in speaker is only good enough for hearing it respond to your voice commands. If you want to use it for playing music, Amazon expect you to connect the Dot to external speakers. A useful guide comparing the differences between the Echo, Dot and Tap is here. The Triby ($199.99) is designed to be stuck on your fridge door in the kitchen. It's sold in the UK too, but only the US version comes with AVS. Amazon expect you'd have the Echo in the living room, and you'd place extra Dots in other rooms. Using this range of products has not only given me an insight into what the future looks like, but has also helped me see the potential for devices like the Echo (and the underlying service, AVS) to play a role in our health. In addition, I want to share my research on the experiences of other consumers who have tried this product. There are a couple of new developments announced this week which might improve the utility of the device, which I'll cover towards the end of the post. 

Triby, Amazon's Tap, Dot & Echo

You can see in this 3 min video some of the things I use my Echo for in the morning, such as reading tweets, checking the news, the weather and my Google calendar, adding new events to my calendar, or turning my lights on. For a list of Alexa commands, this is a really useful guide. If you're curious about how it works, you don't have to buy one; you can test it out in your web browser using Echosim (you will need an Amazon account, though). 

What's really fun is experimenting with the new skills [i.e. apps] that get added by 3rd parties; one of these enables my Echo to control devices in my smart home, such as my LifX lights. I tend to browse the Alexa website for skills and add them to my Echo that way. You can also enable skills just by speaking to your device. At the moment, every skill is free of charge. I suspect that won't always be the case. 

Some of the skills are now part of my daily routine, as they offer high quality content and have been well designed. Amazon boast that there are now over 3,000 skills. However, the quality of the skills varies tremendously, just like app stores for other devices we already use. For example, in the Health & Fitness section, sorted by relevance, the 3rd skill listed is one called Bowel Movement Facts. 

The Echo is always on; its 7 microphones can detect your voice even if the device itself is playing music and you're speaking from across the room. It's always listening for someone to say 'Alexa' as the wake word, but you have a button to mute the Echo so it won't listen. I use Siri, but I was really impressed when I started to use my Echo; it felt quicker than Siri in answering my questions. Anna Attkisson did a 300 question test, comparing her Echo vs Siri, and found that overall, Amazon's product was better. Not only does the Echo understand my London accent, but it also has no problem understanding me when I used some fake accents to ask it for my activity & sleep data from my Fitbit. I think it's really interesting that I can simply speak to a device in my home and obtain information that has been recorded by the Fitbit activity tracker I've been wearing on my wrist. It makes me wonder how we will access our health data in the future. Whilst at the moment the Echo doesn't speak unless you communicate with it, that may change in the future, if push notifications are enabled. I can see it now: having spent all day sitting in meetings, and now sat on my smart sofa watching my smart TV, my inactivity as recorded by my Fitbit triggers my Echo to spontaneously switch off my smart TV, switch my living room lights to maximum brightness, announce at maximum volume that I should venture outside for a 5,000 step walk, and instruct my smart sofa to adjust the recline so I'm forced to stand up. That's an extreme example, but maybe a more realistic one is that you have walked much less today than you normally do, and you end up having a conversation with your Echo because it says, "I noticed you haven't walked much today. Is everything ok?"
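Just to illustrate the sort of logic such a proactive prompt might rely on, here is a rough Python sketch. The Fitbit endpoint shown is my understanding of their daily activity summary API (check the current documentation before relying on it), and the prompt itself is hypothetical; Alexa offered no such push notification feature at the time of writing:

```python
# Illustrative sketch only: compare today's step count against a personal
# baseline and decide whether a gentle prompt is warranted.
import requests

# Assumed Fitbit Web API endpoint for today's activity summary (OAuth token required).
FITBIT_DAILY_SUMMARY = "https://api.fitbit.com/1/user/-/activities/date/today.json"


def steps_today(access_token):
    """Fetch today's step count from the Fitbit Web API."""
    resp = requests.get(
        FITBIT_DAILY_SUMMARY,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["summary"]["steps"]


def inactivity_prompt(today, usual_daily_average, threshold=0.5):
    """Return a prompt if today's steps fall well below the usual average."""
    if usual_daily_average and today < usual_daily_average * threshold:
        return "I noticed you haven't walked much today. Is everything ok?"
    return None


# Example with made-up numbers: 2,100 steps against a 9,000-step average.
print(inactivity_prompt(2100, 9000))
```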

We still don't know about the impact on society as our homes become smarter and more connected. For example, in the USA, those with GE appliances will be able to control some of them with the Echo. You'll be able to preheat the oven without even getting off your sofa. That could have immense benefits for those with limited mobility, but what about our children? If they grow up in a world where so much can be done without even having to lift a finger, let alone walk a few steps from the sofa to the kitchen, is this technology a welcome advance? If you have a Hyundai Genesis car, you can now use Alexa to control certain aspects of your car. When I read this part of the Hyundai Genesis article, "Being able to order basic functions by voice remotely will keep owners from having to run outside to do it themselves", it made me think about a future where we just live an even more sedentary lifestyle, with implications for an already overburdened healthcare system. Perhaps having a home that is connected makes more sense in countries like the USA and Australia, which on average have quite large houses. Given how small the rooms in my London home are, it's far quicker for me to reach for the light switch than to issue a verbal command to my Echo (and wait for it to process the command).

Naturally, some of us would be concerned about privacy. Right now, anyone could walk into the room and, assuming they knew the right commands, could quiz my Echo about my activity and sleep data. One of the things you can do in the US (and now in Europe) is order items from Amazon by speaking to your Echo, and Alex Cranz wrote a post saying, "And today it let my roommate order forty-eight Cadbury Creme Eggs on my account. Despite me not being home. Despite us having very different voices. Alexa is burrowing itself deeper and deeper into owners’ lives, giving them quick and easy access not just to Spotify and the Amazon store, but to bank accounts and to do lists. And that expanded usability also means expanded vulnerability." He goes on to say, "In the pursuit of convenience we have to sacrifice privacy." Note that Amazon do offer the ability to modify your voice purchasing settings, so that the device would ask you for a 4 digit confirmation code before placing the order. The code would NOT be stored in your voice history. You can also turn off voice purchasing completely if you wish.

Matt Novak filed a FOI request to ask if the FBI had ever wiretapped an Amazon Echo. The response he got, "we can neither confirm nor deny."

If you don't have an Echo at home, how would you feel about having one? How would you feel about your children using it? One thing I've noticed is that the Echo seems to work better over time, in terms of responding to my voice commands. The way the Echo works is that it records your voice commands in the cloud, and by analysing the history of your voice commands, it refines its ability to serve your needs. You can delete your voice recordings, although doing so may make the Echo less accurate in future. Some Echo users whose children also use the device say their kids love it, and in fact got to grips with the device and its capabilities faster than the parents. However, according to this Guardian article, if a child under 13 uses an Echo, it is likely to contravene the US Children’s Online Privacy Protection Act (COPPA). This doesn't appear to have put off households installing an Echo in the USA, as research suggests Amazon have managed to sell 3 million devices. Another estimate puts the installed user base significantly lower, at 1.6 million. Either way, in the realm of home based virtual assistants, Amazon are ahead, and probably want to extend that lead, with reports that in 2017 they want to sell 10 million of these speakers. 

Can the Echo help your child's health? Well, a skill called KidsMD was released in March that allows parents to seek advice provided by Boston Children's Hospital. After the launch, their Chief Innovation Officer, John Brownstein, said, "We’re trying to extend the know-how of the hospital beyond the walls of the hospital, through digital, and this is one of a few steps we’ve made in that space." So I tested KidsMD back in April, and you can see in this 3 minute video what it's like to use. What I find fascinating is that I'm getting access to validated health information, tailored to my situation, simply by having a conversation with an internet connected speaker in my home. Of course, the conversation is fairly basic for now, but the pace of change means it won't be rudimentary forever. 

I was thinking about the news last week here in the UK, where it was announced that the NHS will launch a new website for patients in 2017. My first thought was: what if you're a patient who doesn't want to use a website, or for whatever reason can't use one? Now that the Echo (and others like it) has launched in the UK, why couldn't this device be one of the digital channels that you use to interface with the NHS? Some of us at a grassroots level are already thinking of what could be done, and I wonder if anyone in the NHS has been formally testing an Echo to see how it might be of use in the future? 

The average consumer is already innovating with the Echo themselves; they aren't waiting years for the 'system' to innovate. They are conducting their own experiments, buying these new products with their own money. One man in the USA has used the Echo to help him care for his aging mother, who lives in a different location from him. 

In this post, a volunteer at a hospice asks the Reddit community for input on how the Echo could be useful with patients. 

How about Rick Phelps, diagnosed back in 2010 at the age of 57 with Early Onset Alzheimer's Disease, and now an advocate for dementia awareness? Back in February, he wrote about his experience of using the Echo for a week. What does he use it for? To find out what day it is, because dementia means he often doesn't know.

For many of us, consumer grade technology such as the Echo will be perceived as a gimmick, a toy, or as something of limited or no value with respect to our health. I was struck by what Rick wrote in his post, "To many, the Amazon Echo is a cool thing to have. Some what of a just another electronic gadget. But to a dementia patient it is much, much more than that. It has afforded me something that I have lost. Memory. I can ask Alexia anything and I get the answer instantly. And I can ask it what day it is twenty times a day and I will still get the same correct answer." Rick also highlights how he used the Echo to set medication reminders.

I have to admit, the Echo is still quite clunky, but the original iPhone was clunky too, and the 1st generation of every new type of technology is usually clunky. For people like Rick, it's good enough to make a difference to the outcomes that matter to him in his daily life, even if others are more skeptical. 

Speaking of medication reminders, there was a 10 day PYMNTS/Alexa challenge this year, using Alexa "to reimagine how consumers interact with their payments and financial services solutions providers." What I find fascinating is that the winner was DaVincian Healthcare, who created something called DaVincianRX, an “interactive prescription, communication, and coordination companion designed to improve medication adherence while keeping family caregivers in the loop." You can read more and watch their video of it in action here. People and organisations constantly ask me where to look for innovation and new ideas; I always remind them to look outside of healthcare. From a health perspective, most of the use cases I've seen so far involving the Echo are for older members of society or those that care for them. 

I came across a skill called Marvee, which is described as "a voice-initiated concierge application integrated with the Alexa Voice Service and any Alexa-enabled device, like the Amazon Echo, Dot or Tap." Most of the reviews seem to be positive. It's actually refreshing to see a skill that is purpose built to help those with challenges that are often ignored by the technology sector. 

In the shift towards self-care, when you retire or get diagnosed with a long term condition for the first time, will you be getting a prescription for an Amazon Echo (or equivalent)? Who is going to pay for the Echo and related services? Whilst we have real world evidence showing that the Echo is making a positive impact on people's lives, I haven't been able to find any published studies testing the Echo within the context of health. That's a gap in knowledge, and I hope there are researchers out there conducting that research. Like any product, there will be risks as well as benefits, and we need to be able to quantify those risks and benefits now, not in 5 years' time. Earlier I cited how Rick, who lives with Alzheimer's Disease, finds the Echo to be of benefit, but for other people like Rick, using the Echo might lead to harm rather than benefit. We don't know yet. However, not every application of the Echo will require a double blinded randomised clinical trial to be undertaken. If I can already use my Echo to order an Uber, or check my bank balance, why can't I use it to book an appointment with my doctor?

In the earlier use case, a son looked through the data from his mother's usage of her Echo to spot signs that something was wrong. Surely Amazon could parse through that data for you and automatically alert you (or any interested person) that there could be an issue? Allegedly, Amazon is working on improvements to the service whereby Alexa could one day recognise our emotions and respond accordingly. I believe our voice data is going to play an increasing role in improving our health. It's going to be a new source of value. At an event in San Francisco recently, I met Beyond Verbal, an emotions analytics company. They are doing some really pioneering work. We have already seen the emergence of the Parkinson's Voice Initiative, looking to test for symptoms using voice recordings.

How might a device like the Echo contribute to drug safety? Imagine it reminds you to take your medication, and in the conversation with your Echo, you reply that you're skipping this dose, and it asks you why. In that conversation, you have the opportunity to say, in your own words, why you have skipped that dose. Throw in the ability to analyse your emotions during that conversation, and you have a whole new world of insights on the horizon. Some of us might be under the impression that real world data is limited to data posted on social media or online forums, but our voice recordings are also real world data. When we reach a point where we can weave all this real-world data together to get a deeper understanding of our health, we will be able to do things we never thought possible. Naturally, there are immense practical challenges on that pathway, but progress is being made every day. Having all of this data from all of these sources is great, but even if it's freely available, it needs to be linked together to truly make a difference. Researchers in the UK have demonstrated that it's feasible to use consumer grade technology such as the Apple Watch to accurately monitor brain health. How about linking the data from my Apple Watch with the voice data from my Amazon Echo to my electronic health record?

An Israeli startup, Cordio Medical, has come up with a smartphone app for patients with Congestive Heart Failure (CHF) that captures voice data, analyses it in real-time, and "detects early build-up of fluids in the patient’s lung before the appearance of physical symptoms"; deviations found in the voice data would trigger an alert, where "These alerts permit home- or clinic-based medical intervention that could prevent hospitalisation." For those CHF patients without smartphones, could they simply use an Echo at home with a Cordio skill? Or could Amazon offer the voice data directly to organisations like Cordio for remote monitoring (with the patient's consent)? With devices like the Echo, if Amazon (or their rivals) continue to grow their user base over the next 10 years, they could have an extremely valuable source of unique voice based health data that covers the entire population. 

At present, Amazon has made surprisingly good progress with the Echo as a virtual assistant. However, other tech giants are looking to launch their own products and services, for example Google Home, which is due to arrive later this year. This short video shows what it will be able to do. Now for me, Google plays a much larger role in my daily life than Amazon in terms of core services. I use Google for email, for search, for my calendar, and maps for navigation. So Google Home might be vastly superior to the Echo, simply because of that integration with the core services that I already use. We'll have to wait and see. The battle to be a fundamental part of your home is just beginning, it seems. 

The battle to be embedded in every aspect of our lives will extend beyond the home, perhaps into our cars. I tested the Amazon Dot in my car, and I reckon it's only a matter of time before we see new cars on sale with these virtual assistants built into the car's systems, instead of being an add-on. We already have new cars coming with 4G internet connectivity, offering wifi for your devices, from brands like Chevrolet in the USA. 

For when we are on the move, and not in our car or home, maybe we'll all have earphones like the new Apple AirPods, through which we can discreetly ask our virtual assistants to control the objects and devices around us. Perhaps Sony's product, the Xperia Ear, which launches in November and is powered by something called Sony's Agent (which could be similar to Amazon's AVS), is what we will be wearing in our ears? Or maybe none of these big tech firms will win the battle? Maybe it will be one of us, or one of our kids, who comes up with the virtual assistant that will rule the roost? I'm incredibly inspired after watching this video where a 7 year old girl and her father built their own Amazon Echo using a Raspberry Pi. This line in the video's description stood out to me: "She did all the programming following the instructions on the Amazon Github repository." Next time there is a health hackathon, do we simply invite a bunch of 7 year old kids and give them the space to dream up new solutions to problems that we as adults have created? Or maybe it should be a hackathon that invites 7 year olds with their grandparents? Or maybe we have a hackathon where older adults are invited to co-design Alexa skills with younger people for the Echo? We don't just have financial deficits in health & social care, we have a deficit of imagination. Amazon have a programming tutorial where you can build a trivia skill for Alexa in under an hour. When it comes to our health, do we wait for providers to develop new Alexa skills, or will consumers start to come together and build Alexa skills that their community would benefit from, even if that community happens to be a community of people scattered around the world, who are all living with the same rare disease?
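For anyone curious what the plumbing behind such a skill looks like, here is a minimal sketch of a custom-skill backend written as a raw AWS Lambda handler in Python, in the spirit of that trivia tutorial. The intent name is invented for illustration; a real skill would define its intents and sample utterances in the Alexa developer console:

```python
# Minimal sketch of an Alexa custom-skill backend (raw AWS Lambda handler).
# "GetMedicationTimeIntent" is a hypothetical intent used only to illustrate
# the request/response shape of the Alexa Skills Kit.

def build_response(speech_text, end_session=True):
    """Wrap plain text in the Alexa Skills Kit response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }


def lambda_handler(event, context):
    request = event["request"]
    if request["type"] == "LaunchRequest":
        # Invoked when someone opens the skill without asking anything specific.
        return build_response("Welcome. Ask me when your next dose is due.", False)
    if request["type"] == "IntentRequest":
        if request["intent"]["name"] == "GetMedicationTimeIntent":
            # In a real skill this would come from a user profile or database.
            return build_response("Your next dose is at six p.m.")
    return build_response("Sorry, I didn't catch that.")
```

The interesting part is how little code sits between an idea and something a community could actually talk to; the harder work is designing the conversation and looking after the data behind it.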

You'll have noticed that in this post, I haven't delved into the convergence of technologies that have enabled something like the Echo to work so well. This was deliberate on this occasion. At present, I'm really interested in how virtual assistants like the Echo make you feel, rather than the technical details of the algorithm being used to recognise my voice. For someone living far away from their aging parents/grandparents, does the Echo make you feel reassured? For someone living alone and feeling socially isolated, does the Echo make you feel less alone? For a young child, does it make you feel like you can do magic, controlling other devices just with your voice? For someone considering moving out of their own home into an institution, does the Echo make you feel independent again? If more and more services are becoming digital by default, how many of these services will be available just by having a conversation? I am using my phone & laptop less since I've had my Echo, but I'm not yet convinced that virtual assistants will one day eliminate the need for a smartphone. Some people, however, are convinced: 50% of urban smartphone owners around the world believe that smartphones will no longer be needed in 5 years' time. That's one of the findings from when Ericsson Consumer Lab quizzed smartphone users in 13 cities around the globe last year. The survey is supposed to represent the views of 68 million urban citizens. They also found, "Furthermore, a third would even rather trust the fidelity of an AI interface than a human for sensitive matters. 29 percent agree they would feel more comfortable discussing their medical condition with an AI system." I personally think the consumer trends identified have deep implications for the nature of our interactions with respect to our health. Far too many organisations are clinging on to the view that the only (and best) way we interact with health services is face to face, in a healthcare facility, with a human being. Although these virtual assistants at home don't need a smartphone with a data plan, they do need fixed broadband. However, looking at OECD data from December 2015, fixed broadband penetration is rather low. The UK is not even at 40%, so products such as the Echo may not be accessible for many across the nation who might find them beneficial with regard to their health. This is an immense challenge, and one that will need joined up thinking, as we need everyone included in this digital revolution.

You might be thinking right now that building a virtual assistant is your next startup idea: it's going to be how you make an impact on society, it's how you can change healthcare. Alas, it's not as easy as we first thought. Cast your mind back to 2014, the same year that the Echo first became available. I was one of the early adopters who pledged $499 for the world's first social robot, Jibo [consider it a cuter or creepier version of the Echo with a few extra features]. They raised almost $4 million from people like me, curious to explore this new era. Like the Echo, you are meant to be able to talk to Jibo from anywhere in the room, and it will act upon your command. The release got delayed and delayed, and then recently I got an email informing me that the folks behind Jibo have decided that they won't be shipping Jibo to backers outside of the USA, and I was offered a full refund.

One of the reasons they cited was, "we learned operating servers from the US creates performance latency issues; from a voice-recognition perspective, those servers in the US will create more issues with Jibo’s ability to understand accented English than we view as acceptable." How bizarre: my US-spec Echo understands my London accent, and even my fake ones! It took the makers of Jibo 2 years to figure this out, and that from people at the prestigious MIT Media Lab. So just how much effort does it take to make something like the Echo? A rather large amount, it seems. According to Jeff Bezos, the CEO of Amazon, they have over 1,000 people working on this new ecosystem. A very useful read is the real story behind the Echo, explaining in detail how it was invented. Apparently, the reason why the Echo was not launched outside of America until now was so it could handle all the different accents. So, if you really want to do a hardware startup, then one opportunity is to work on improving the digital microphones found not just in the Echo, but in our smartphones too. Alternatively, Amazon even have an Alexa Fund, with $100m in funding for companies looking to "fuel voice technology innovation." Amazon must really believe that this is the computing platform of the future. 

Moving on to this week's news, the UK Echo will have UK partners such as the Guardian, Telegraph & National Rail. I use the train frequently from my home into central London, and the station is a 15 minute walk from my house, so that's one of the UK specific skills I'm most likely to use, to check if the train is delayed or cancelled before I head out of the front door. Far easier and quicker than pulling out my phone and opening an app. The UK version will also have a British accent. If you have more than one Echo device at home and speak a command, chances are that two or more of your devices will hear you and respond accordingly, which is not good, especially if you're placing an order with Amazon. So Amazon have now updated the software with ESP (Echo Spatial Perception), so that when you talk to your Echo devices, only the closest one to you will respond. It's being rolled out to those who have existing Echo devices, so there's no need to upgrade. You might want to though, as there is a new version of the Echo Dot (US, UK & Germany), which is cheaper, thinner, lighter and promises better voice recognition than the original model. For those who want an Echo in every room, you can now buy Dots in 6 or 12 packs! In the UK, given that the Echo Dot is just £49.99, I expect this Christmas many people will be receiving them as presents. 

Amazon's Alexa Voice Service is one example of a conversational user interface; at times it's like magic, and at other times it's infuriatingly clumsy. I'm mindful that my conversations with my Echo are nowhere near as sophisticated as conversations I have with humans. For example, if I say "Alexa, set a reminder to take my medication at 6pm" and it does that, and then I immediately say "Alexa, set a reminder to take my medication at 6.05pm", and so forth, it currently won't say, "Are you sure? You just set a medication reminder close to that time already." Some parents are concerned that the use of an Echo by their kids is training them to be rude, because they can throw requests at Alexa, even in an aggressive tone of voice, with no please and no thank you, and Alexa will always comply. Are these virtual assistants going to become our companions? Busy parents telling their kids to do their homework with Alexa, or lonely elders who find that Alexa becomes their new friend in helping them cope with social isolation? Will we end up with bathroom mirrors we can have conversations with about the state of our skin? Are we ever going to feel comfortable discussing the colour of our urine with the toilet in our bathroom? When you grab your medication box out of the cupboard, do you want to discuss the impact on your mood after a week of taking a new anti-depressant?
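A rough sketch of the kind of sanity check I mean, assuming a skill kept its own list of reminder times (Alexa exposed no such reminder API to developers at the time, so this is purely illustrative):

```python
# Illustrative sketch: before saving a new medication reminder, look for an
# existing reminder within a few minutes and query the user about it.
from datetime import datetime, timedelta


def near_duplicate(new_time, existing_times, window_minutes=15):
    """Return the closest existing reminder within the window, or None."""
    window = timedelta(minutes=window_minutes)
    clashes = [t for t in existing_times if abs(t - new_time) <= window]
    return min(clashes, key=lambda t: abs(t - new_time)) if clashes else None


existing = [datetime(2016, 9, 30, 18, 0)]      # reminder already set for 18:00
new = datetime(2016, 9, 30, 18, 5)             # user now asks for 18:05
clash = near_duplicate(new, existing)
if clash:
    print(f"Are you sure? You already set a medication reminder for {clash:%H:%M}.")
```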

Could having conversations with our homes help us to manage our health? It seems like a concept from a science fiction movie, but to me, the potential is definitely there. The average consumer will have greater opportunities to connect their home to the internet in years to come. Brian Cooley asks in this post whether our home will become the biggest health device of all.

A thought provoking read is a new report by Plextek examining how connected homes could change the medical industry by 2020. I want you to pause for a moment when reading their vision: "The connected home will be a major enabler in helping the NHS to replace certain healthcare services, freeing up beds for just the most serious cases and easing the pressure on GP surgeries and A&E departments. It will empower patients with long-standing health conditions who spend their life in and out of hospitals undertaking tests, monitoring, rehabilitation or therapy, and give them freedom to care for themselves in a safe way."

Personally, I believe the biggest barrier to making this vision a reality is us: people and organisations that don't normally work together will have to collaborate in order to make connected homes seamless, reliable and cost effective. Think of all the people, policies & processes involved in designing, installing, regulating, and maintaining a connected home that will attempt to replace some healthcare services. That's before we even think about who will be picking up the tab for these connected homes.

Do you believe the Echo is a very small step on the path towards replacing healthcare services, one conversation at a time?

[Disclosure: I have no commercial ties with the individuals or organisations mentioned above]


Robots as companions: Are we ready?

Nao

Some people on Earth seem to think so. In fact, they believe in the concept so much that they are actually building the world's first personal robot that can read & respond to human emotions: a collaboration between French robotics firm Aldebaran and SoftBank Mobile from Japan. You may already know one of Aldebaran's existing robots, Nao. The new robot is called Pepper, is due to launch in Japan in February 2015, and is priced at 198,000 Yen. Using today's exchange rates, that's approximately $1,681 or £1,074, although only the Japanese launch has been confirmed for now. Pepper may be sold in the USA through Sprint stores at some point. The notion of a robot in your home that can interact with you, and even tell you a joke if you're feeling sad, attracted my curiosity. So much so that in September 2014, I hopped on a train from London to Paris. 

Me & Pepper in Paris

Why Paris? Well, the world's first home robot store, called Aldebaran Atelier, opened in Paris this summer, and they had Pepper in the store. You can't buy any of the robots in the store just yet; it's more a place to come and learn about these robots. 

So what's Pepper like? You have to bear in mind that the version I interacted with in Paris is not the final version, so the features I saw are not fully developed, especially the aspects of recognising who you are, and getting to know you and your needs. The 3 minute video below shows some of the interaction I had. For now, Pepper understands English, French and Japanese. 

A bit more about how Pepper works. In the final version, Pepper will be able to understand 5 basic emotional expressions of the face: smiling, frowning, surprise, anger & sadness. Pepper will also read the tone of your voice and the verbiage used, as well as non-verbal communication such as tilting your head. So, for example, if you're feeling sad, Pepper may suggest you go out. If you're feeling happy, Pepper may sing a song and do a dance for you (more on that later). According to a Mashable article, "Pepper has an 'emotional engine' and cloud based artificial intelligence". The article also states, "The cloud AI will allow Pepper to share learnings with cloud-based algorithms and pull down additional learning, so that its emotional intuition and response can continually improve. It's either a technological breakthrough or the most terrifying robot advancement I've ever heard of."
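Just to make the idea tangible, here is a toy Python sketch of that kind of emotion-to-response mapping. It is not Aldebaran's actual engine; the emotion labels come from the five expressions listed above, and the responses are invented:

```python
# Toy sketch only: map a detected facial expression to a canned response,
# in the spirit of "sad -> suggest going out, happy -> offer a song".

RESPONSES = {
    "sadness": "You seem a little down. How about going out for some fresh air?",
    "smiling": "You look happy! Shall I sing you a song and do a little dance?",
    "surprise": "Oh! Did something unexpected happen?",
    "anger": "You seem upset. Would you like to talk about it?",
    "frowning": "Is something bothering you?",
}


def respond_to(detected_emotion):
    """Pick a response for the detected expression, with a safe fallback."""
    return RESPONSES.get(detected_emotion, "I'm not sure how you're feeling right now.")


print(respond_to("sadness"))
```

The real engineering challenge, of course, is the detection itself and how gracefully the robot behaves when it gets the emotion wrong.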

Some facts and figures for you: 

  • 4 feet tall, and weighs 61 lbs/28kg

  • 12 hour battery life - and automatically returns to charging station when battery is low

  • 3D camera which senses humans and their movements up to 3 metres away

In the press kit I was given at the store, it's stated that "Pepper's number one intention is about being kind and friendly. He has been engineered to meet not functional but emotional needs." 

It's not just speech and movement that Pepper responds to, it's also touch. There are sensors on the upper part of his head, upper part of his hands and on the tablet attached to his chest. Pepper may be talking to you, and if you place your hand on his head, the way that you would with a child, Pepper will go quiet. Although, when I tried it, Pepper responded by saying something about sensing someone was scratching his head! 

The creators anticipate Pepper being used to care for the elderly and for babysitting. What are your thoughts? Do YOU envisage leaving your elderly parent or young child with Pepper for company whilst you do some chores or dash to the supermarket? I told Shirley Ayres, Co-Founder of the Connected Care Network, about Pepper. Her response was: "I'd prefer a robot companion to 15 minutes of care by a worker on minimum wage struggling to provide quality care on a zero hour contract."

Given aging populations, and the desire of many to grow old in their own home rather than an institution, are household companion robots the answer to this challenge? As technology such as Pepper evolves, will a robot at home be the solution to increasingly lonely societies? Will we really prefer the company of a household robot versus another human being? Will we end up treating the purchase of Pepper the same way we treat the purchase of an iPad? Will your children buy you a Pepper so they don't have to visit you as often as you'd like? The CEO of Aldebaran, Bruno Maisonnier, believes they will sell millions of these robots. Apparently, they'll be able to make a profit from the sales of robot-related software and content. Apps for robots?

Pepper does have all sorts of sensors so it can understand humans as well as the environment it's operating within. I understand it will collect data, but it's not clear to me, at this stage, exactly what would be collected or shared. Just because Pepper seems kind and friendly doesn't mean we shouldn't consider the risks and benefits associated with any data it collects on us, our behaviours and intentions. There could be immense benefits from a robot that can remind an older person, 24 hours a day, when to take their medications, and potentially collect data on when doses are being skipped and why.
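As a thought experiment, the medication scenario might look something like the sketch below. It's purely hypothetical; Pepper doesn't do this today, and every name and field here is invented for illustration. It does, though, make the data question above very tangible: who should see this log?

```python
# Hypothetical sketch of a medication schedule plus an adherence log,
# to illustrate the kind of data a companion robot *could* collect.
# Not Pepper's actual behaviour; all names and fields are invented.
from datetime import datetime, time

SCHEDULE = [
    {"medication": "blood pressure tablet", "due": time(8, 0)},
    {"medication": "blood pressure tablet", "due": time(20, 0)},
]

adherence_log = []  # exactly the data whose ownership and sharing we should question

def record_dose(medication: str, taken: bool, reason_if_skipped: str = "") -> None:
    """Log whether a scheduled dose was taken, and why it was skipped if not."""
    adherence_log.append({
        "medication": medication,
        "timestamp": datetime.now().isoformat(timespec="minutes"),
        "taken": taken,
        "reason_if_skipped": reason_if_skipped,
    })

record_dose("blood pressure tablet", taken=False, reason_if_skipped="felt dizzy")
print(adherence_log)
```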

An Institute of Medicine panel has just recommended that "Physicians should collect more information about patients' behaviour and social environment in their electronic health records." Some of the information the panel recommends collecting includes "whether they are experiencing depression; their social connections and sense of social isolation." Is technology such as Pepper the most effective route to collecting that data? Do we want a world where our household robot sends data to our doctor on how often we feel sad and lonely? Perhaps for those of us too afraid to reach out for help and support, that's a good thing?

My brief interaction with Pepper in Paris was fun and enjoyable, a glimpse into a possible future. With its childlike gestures and ability to monitor and respond to our emotions, could we as humans one day form emotional attachments to household robots? Here is the video of Pepper wanting to play some music for me in the Paris store. 

One does wonder how the introduction of these new robots might impact jobs. What does technology such as Pepper mean for human carers? A recent report from Deloitte forecasts that 30% of jobs in London are at high risk from automation over the next 20 years. It's low-paid, low-skill jobs that are most at risk. Microsoft is trying out a different robot, K5 from Knightscope, as security guards on its Silicon Valley campus. In Japan, Pepper has been used by SoftBank to conduct market research with customers in a Tokyo store. Nestle is planning to use Pepper to sell coffee machines in 1,000 of its stores across Japan by the end of 2015. Here is the video showing how Pepper might work in selling to consumers in Nestle's stores. 

Now, some of us may dismiss this robot technology as crude and clumsy, with little or no potential to make a significant impact. I personally think it's amazing that we've reached this point, and like any technology, it won't stand still. Over time, it will improve and become cheaper. We are at a turning point, whether we like it or not. Does Pepper signify the dawn of a new industry, or will these household robots be rejected by consumers? How are household robots treated by the law? Do we need to examine how our societies function rather than build technology such as Pepper? Are we becoming so disconnected from ourselves that we need Pepper in our homes to connect with ourselves as humans? Does the prospect of having a robot like Pepper in your own home with your family excite you or frighten you?

Given the intense pressure to reduce costs in health & social care, it would be foolish to dismiss Pepper completely. So in the future, will we also see companion robots like Pepper stationed in hospitals and doctor's offices too? Can personal robots that connect with our emotions change how we 'deliver' and 'receive' care?

[Disclosure: I have no commercial ties to any of the individuals or organisations mentioned in the post]


Will the home of the future improve our health?

According to BK Yoon, that would be one of the benefits of Samsung's vision of the 'home of the future'. Who is BK Yoon? He's President and CEO of Samsung Electronics. Last Friday, I listened as he delivered the opening keynote of IFA 2014, the largest consumer electronics and home appliance show in Europe.

Whilst many talk of bringing healthcare out of the hospital/doctor's office into the home, Samsung, in theory, have the resources and vision to make this a reality at a global level. Samsung Group, of which Samsung Electronics is the biggest subsidiary, have just invested $2 billion in setting up a biopharmaceuticals unit. Christopher Hansung Ko, CEO of the Samsung Bioepis unit, said in an earlier interview, "We are a Samsung company. Our mandate is to become No. 1 in everything we enter into, so our long-term goal is to become a leading pharmaceutical company in the world." 

Is South Korea innovative?

My views on Samsung (and South Korea overall) changed when I visited Seoul, the capital of South Korea, during my round-the-world trip in 2010. I had only scheduled a 3-day stopover, but ended up staying over 3 weeks. I was impressed by their ambitions, their attitude towards new technology and their commitment to education. Their journey over the last half century is truly amazing. "Fifty years ago, the country was poorer than Bolivia and Mozambique; today, it is richer than New Zealand and Spain." Bloomberg released their Global Innovation Index at the start of this year. Guess which country was top of the list? Yup, South Korea. The UK was ranked a lowly 16th.

After my visit, I left South Korea with a different perspective, and have been paying close attention to developments there ever since. I believe many people in the West underestimate the long term ambitions of a company like Samsung. Those wishing to understand the future, would be wise to monitor not just what's happening in Silicon Valley, but also in South Korea. 

Aging populations worry policy makers in many advanced economies. Interestingly, recent data shows that South Korea's population is aging the fastest out of all OECD countries. Maybe that's one of the drivers behind Samsung's strategy of developing technology that could one day help older people live independently. 

We've been hearing about smart and connected homes for many years, and one wonders how this technology would integrate with our lives. How easy would it be to set up and use? How reliable would it be? Would I have to figure out what to do with all these data streams from the different devices? I used to believe that Apple was unique in really understanding the needs and wants of the consumer, but it seems Samsung have been taking notes. They have conducted lifestyle research with 30,000 people around the world, and the results of that research were shared after the keynote. Whilst reading the reports, one feels like Samsung Electronics is now trying to position itself as a lifestyle company, not a technology company. One of the opening statements captures it: "The home of the future is about more than technology and gadgets. It's about people." It's about responsive homes that adapt to our needs, homes that protect us, homes that are empathetic. How much is all of this going to cost us, right? Is it only going to be for the rich? Is it only going to work with new homes, or can we retrofit the technology?

Data-driven homes - is that what we truly want?

Their research also says, "Technology will promote eating and living right. Devices around the home will inspire us to make decisions that are right for our bodies, taking an active role in helping us achieve better health by turning goals into habits." So in Samsung's vision, the fridge of the future will inform you that some of the food inside has expired and needs to be thrown away. Where do we draw the line? What if you come home from work hoping to grab a beer from the fridge, but the fridge door is locked, because the home of the future knows you've already consumed more than your daily allowance of calories?
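Just to make the 'expired food' part of that vision concrete, here's a trivial sketch; the items and dates are made up, and a real connected fridge would obviously involve far more than this.

```python
# Toy sketch of a fridge flagging expired items, to make the vision concrete.
# The items and dates are invented; a real connected fridge would be far more involved.
from datetime import date

fridge_contents = [
    {"item": "milk",    "expires": date(2014, 9, 12)},
    {"item": "yoghurt", "expires": date(2014, 9, 2)},
]

def expired_items(contents, today=None):
    """Return the names of items whose expiry date has passed."""
    today = today or date.today()
    return [entry["item"] for entry in contents if entry["expires"] < today]

print(expired_items(fridge_contents, today=date(2014, 9, 8)))  # ['yoghurt']
```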

Do we actually want to live in data-driven homes? Our homes are often an analogue refuge in an increasingly digital & connected world. We have the choice to disconnect and switch off our devices and just 'be'. What if part of our contract with our energy provider or home insurer is having smart home technology installed?

In a recent survey of Americans aged 18+ for Lowe's, 70% of smartphone/tablet owners want to control something inside their home from their bed via their mobile. Obesity is already a public health issue not just in the USA, but in many nations. Will being able to switch on the coffee pot, adjust the thermostat and switch on the lights by speaking into your smartphone whilst lying in bed lead to humans leading even more sedentary lives in the future? 
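To picture what 'control something inside their home from their bed' might mean in practice, here's a minimal sketch; the device names and commands are hypothetical, not any vendor's actual API.

```python
# Minimal sketch of the "control your home from bed" idea.
# The device registry and commands are hypothetical, purely for illustration.
devices = {"bedroom_lights": "off", "coffee_pot": "off", "thermostat_celsius": 18}

def handle_command(command: str) -> dict:
    """Apply a very small set of spoken or typed commands to the device registry."""
    if command == "lights on":
        devices["bedroom_lights"] = "on"
    elif command == "start coffee":
        devices["coffee_pot"] = "on"
    elif command.startswith("set heating to "):
        devices["thermostat_celsius"] = int(command.rsplit(" ", 1)[-1])
    return devices

print(handle_command("start coffee"))  # coffee pot switched on without leaving bed
```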

However, there is a flip side. Rather than just dismiss this emerging technology as silly or promoting inactivity, these advancements may be of immense benefit to certain groups of people. For example, would the home of the future give someone who is blind, disabled or has learning difficulties a greater chance to live independently? Would the technology be useful when you're discharged from hospital after surgery?

Is it just about the data?

One of the byproducts of these new technologies for our homes is data. Sensors and devices which track everything we do in the home are going to be collecting and processing data about us, our behaviour and our habits. Samsung even mention in their research, "Our home will develop digital boundaries that keep networks separate & secure, protecting us from data breaches, and ensuring that family members cannot intrude on each other's data." It all sounds great in a grand vision, but turning that into reality is much harder than it appears.
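What 'family members cannot intrude on each other's data' might mean in practice is sketched below; the access model is my own assumption for illustration, not Samsung's design.

```python
# Illustrative sketch of per-member data boundaries in a connected home.
# The access model is an assumption for illustration, not Samsung's design.
data_streams = {
    "alice": {"sleep": [], "location": []},
    "bob":   {"sleep": [], "location": []},
}

# A member can only read another member's stream if it has been explicitly shared.
shared_with = {("alice", "location"): {"bob"}}  # Alice shares her location with Bob

def can_read(requester: str, owner: str, stream: str) -> bool:
    """Return True if the requester may read the owner's data stream."""
    if requester == owner:
        return True
    return requester in shared_with.get((owner, stream), set())

print(can_read("bob", "alice", "sleep"))     # False
print(can_read("bob", "alice", "location"))  # True
```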

We expect to hear about Apple's HealthKit later today. Who will you trust with the data from your home in the future? Apple, an American corporation or Samsung, a South Korean corporation? Or neither of them? Is the strategy of connecting our homes to the internet simply a ploy to grab even more data about us? Who will own and control the data from our homes? Where will it be stored? 

Despite the optimism and hope in BK Yoon's keynote, I can't help wondering about the risks of your home's data feed getting hacked, and what that means for criminals such as burglars, or even terrorists. Do we want machines storing data on the movements of each family member within our own home? Or will tracking of family members who are very young or very old, in and around the home, give families peace of mind and reassurance?

Ultimately, who is going to pay for all of this innovation? Even today, when I talk to ordinary hard-working families about new technologies, such as sleep tracking, I'm conscious it's out of the reach of many. For example, the recently launched Withings Aura system allows you to track and improve your sleep. It's priced at $299.95/£249.95. If given the choice, how many ordinary families would invest in the Withings Aura to improve their sleep vs buying a new bed from Ikea for the same price?

The video below was played during the keynote, and gave me a glimpse into Samsung's vision of the home of the future. BK Yoon claimed that the future is coming faster than we think. He seemed pretty confident. Only time will tell. 

How does this video make you feel? Does Samsung's vision make you excited and hopeful, or does it frighten you? Do you look forward to a home that aims to protect you and care for your family? How comfortable do you feel with your home potentially knowing more about your health than you or your doctor? How will the data about our health collected by our home integrate with the health & social care system? Will the company that is most successful in smart homes be the one that consumers trust the most?

[Disclosure: I have no commercial ties with the individuals and organisations mentioned in this post]


How do we make Aging as sexy as Global Warming?

The title of this blog post is one of the questions that I posed to the panel & the audience at the pre-conference workshop I ran for Health 2.0 in London on Sunday 17th November. [Note: credit to Victor Wang for coming up with the quote on Saturday]

Yesterday, Health 2.0, the participants of this workshop and I created history.

A bold claim, but look at the variety of Health IT, mHealth & Digital Health conferences around the world. How many run workshops which discuss old age and dying? Kudos to Matthew Holt and Pascal Lardier for being pioneers in making this happen. What I love about being involved with the Health 2.0 conferences is the genuine desire of the team to challenge the status quo. When the idea for this workshop was first discussed, the immediate response from Health 2.0 was not the all too common, "No, it's going to be too difficult", but "Yes, this is risky and uncharted territory, but let's give it a go". 

Where is everyone else?

The workshop was not perfect. We had no patients in the room, apart from Sarah Reed who delivered wonderful insights on behalf of older patients. We didn't have policy makers, the NHS, investors, the third sector, or designers in the room. We did use various channels to publicise the workshop, and I reached out to a number of entities that would have benefited from attending, but I either got no response or they were simply too busy. Perhaps they felt talking about old age & dying and how technology can help was a bit too radical for a Sunday afternoon? 

I started the workshop by setting the scene in terms of aging populations and the challenges & opportunities. You can see the slides below. The trend of aging populations is a gradual one, and perhaps that's why it's often not the number 1 priority on the minds of decision makers & politicians, who are often faced with short-term pressures. Furthermore, many of us use words such as 'burden', 'cost' and 'problem' when describing citizens aged 65 years and up.

I say, older people must be embraced by society. Let's celebrate their existence. They offer life experience, wisdom & talents that could help so many of us, especially young people. In modern times, many of us, not just older people, live lonely and isolated lives. A recent survey of GPs in the UK found that 1 in every 10 patients they see each day comes to see the doctor because they are lonely. Naturally, that places pressure on already stretched services, but more importantly, what does that say about our society? 

Whilst technology can't necessarily reinvent our social fabric, some of the innovations shown at the workshop could be employed to allow older people to stay connected with others in society. 

It was a very intense and well-received 3.5 hours, with patient insights, 13 demos of innovative technologies and a 5-member panel discussion featuring Tobias Stone, Janet Jadavji, Clive Bowman, Bart Collet and Brenda Reginatto.

The demos were split into 3 sections:

Disease prevention & Disease Management

uMotif, Memory Lane Games, Advanced Balance Systems, SmartCitizen

Aging in Place

Fresh Idea Factory, Vivago, Intelesant, Zilta

Tools for Caregivers & Families

GeriJoy, Yecco, SpeakSet, Mindings, Breezie

Sarah Reed sharing insights from a patient's perspective

Key learnings

  1. You can be successful at a global level without being based in Silicon Valley

  2. The importance of 'science driven' health startups

  3. Many startups in this arena are founded by people who have cared for an elderly relative

  4. We can't just treat older people as one big cohort and assume they all have the same needs

  5. Underlying technology doesn't need to be complex to be effective

  6. How can innovators & investors make money after developing these technologies?

  7. Who is actually going to pay for the innovation?

One or more of the startups who demonstrated their products at the workshop may well be a global name in future years. I left the workshop feeling hopeful, inspired and positive. I acknowledge that data & technology alone won't solve every issue faced by older people, health & social care professionals and their families & caregivers but it definitely has a role to play.

What next?

Even if you weren't at the workshop, and won't be attending Health 2.0 Europe, I encourage you, wherever in the world you are based, to discuss and debate, old age and dying. These conversations will be uncomfortable and frightening, and at times rather unpleasant, but we won't be able to move forward unless we act with courage. Each of you can (and hopefully will) play a part in making Aging as sexy as Global Warming.

[Disclosure: I have no commercial ties to any of the companies mentioned above, apart from Health 2.0, which I provide consulting services to from time to time]