Healthy mobility

Mobility is an interesting term. Here in the UK, I've grown up seeing mobility as something to do with getting old and grey, when you need mobility aids around the home, or even a mobility scooter. Which is why I was curious about Audi (who make cars) hosting an innovation summit at their global headquarters in Germany to explore the Mobility Quotient, a term I'd never even heard before. The fact that the opening keynote was set to be given by Audi's CEO, Rupert Stadler, and Steve Wozniak (who co-founded Apple Computer) made me think that this would be an unusual event. I applied for a ticket, got accepted, and what follows are my thoughts on the event, which took place a few weeks ago. In this post, I will be looking at it through the lens of what it might mean for our health. 

[Disclosure: I have no commercial ties with the individuals or organisations mentioned in this post]

It turns out that 400 people attended, from 15 countries. This was the first time that Audi had hosted this type of event, and I didn't know what to expect from it; neither did any of the attendees I talked to on the shuttle bus from the airport. I think that's fun, because everyone I met during the two days seemed to be there purely out of curiosity. If you want another perspective on the entire two days, I recommend Yannick Willemin's post. A fellow attendee, he was one of the first people I met at the event. There is one small thing that spoiled the event for me: the 15-minute breaks between sessions were too short. I appreciate that every conference organiser wants to squeeze in lots of content, but the magic at these events happens in between the sessions, when your mind has been stimulated by a speaker and you have conversations that open new doors in your life. It's a problem that afflicts virtually every conference I attend. I wish they would have less content and longer breaks. 

On Day 1, there were external speakers from around the world, getting us to think about social, spatial, temporal and sustainable mobility. Rupert Stadler made a big impression on me with his vision of the future, citing technologies such as Artificial Intelligence (AI) and the Internet of Things (IoT) and how they might enable this very different future. He also mentioned how he believes the car of the future will change its role in our lives, maybe becoming a secretary, a butler, a courier, or even an empathic companion in our day. Throughout, we were asked to think deeply about how mobility could be measured, and what we will do with the 25th hour, the extra time gained because machines will eventually turn the drivers of today into the passengers of tomorrow. He spoke of a future where cars will be online, connected to each other too, sharing data to reduce traffic jams and more. He urged us to never stop questioning. Steve Wozniak described the mobility quotient as "a level of freedom, you can be anywhere, anytime, but it also means freedom, like not having cords." 

We heard about Hyperloop transportation technologies cutting down travel time between places, and then about the different things we might do in an autonomous vehicle, which briefly cited 'healthcare' as one option. Sacha Vrazic, who spoke about his work on self-driving cars, gave a great, hype-free talk and highlighted just how far away we are from the utopia of cars that drive themselves. We heard about technology, happiness and temporal mobility. It was such a diverse mix of topics. For example, we heard from Anna Nixon, who, at just 17 years old, is already making a name for herself in robotics, and who inspired us to think differently. 

What's weird, but in a good way, is that Audi, a car firm, was hosting a conversation about access to education and improving social mobility. I found it wonderful to see Fatima Bhutto, a journalist from Pakistan, give one of the closing keynotes on Day 2, where she reminded us of the challenges with respect to human rights and access to sanitation for many living in poorer countries, and how advances in mobility might address these challenges. It was surprising because Audi sells premium vehicles, and it made me think that mobility isn't just about selling more premium vehicles. What's clear is that Audi (like many large organisations) is trying to figure out how to stay relevant in our lives during this century. Instead of selling more cars in the future, maybe they will be selling us mobility solutions and services which may not even always involve a car. Perhaps they will end up becoming a software company that licenses the algorithms used by autonomous vehicles in decades to come? It reminds me of the pharmaceutical industry wanting to move to a world 'beyond the pill' by adapting its strategy to offer new products and services, enabled by new technologies. When you're faced with having to rip up the business model that allowed your organisation to survive the 20th century, and develop one that will maximise your chances of longevity in the 21st, it's a scary but also exciting place to be. 

On Day 2 attendees were able to choose 3 out of 12 workspaces where we could discuss how to make an impact on each of the 4 types of mobility. I chose these 3 workspaces.

  • Spatial mobility - which obstacles are still in the way of autonomous driving?
  • Social mobility - what makes me trust my digital assistant?
  • Sustainable mobility - what will future mobility ecosystems look like? 

The first workspace made me realise the range of challenges facing autonomous cars: legal, technical, cultural and infrastructural. We had to discuss and think about topics that I rarely consider when just reading news articles on autonomous cars. The fact that attendees were from a range of backgrounds made the conversations really stimulating. None of that 'groupthink' that I encounter at so many 'innovation' events these days, which was so refreshing. Incidentally, Audi's new A8 is the first production vehicle with Level 3 automation, a feature called Traffic Jam Pilot. Subject to legal regulations, on selected roads, the driver would be able to take their hands off the wheel and do something else, like watch a video; the car would drive itself. However, the driver would have to be ready to take back control at any time, should conditions change. I found two very interesting real world tests of the technology here and here. Also, isn't it fascinating that a survey found only 26% of Germans would want to ride in autonomous cars? What about a self-driving wheelchair in a hospital or an airport? Sounds like science fiction, but they are being tested in Singapore and Japan. Today, few of us can access these technologies because they are only available to those with very deep pockets. However, this will change. Just look at airbags, introduced as an option by Mercedes-Benz on their flagship S-Class in 1981. Now, 36 years later, even the smallest of cars often comes fitted with multiple airbags. 

In the second workspace, I formed a team with other attendees, and our challenge was to discuss transparency in the collection and use of personal data by a digital assistant in the car of the future, almost like a virtual co-driver. Our team had a Google Home device to get us thinking about the personal data that Google collects, and at the end of the workspace we had to pitch our ideas on how we envisaged getting drivers and passengers to trust these digital assistants in the car. How could Audi make it possible for consumers to control how their personal data is used? It's encouraging to see a large corporate like Audi thinking this way. Furthermore, given that these digital assistants may one day be able to recognise our emotional state and respond accordingly, how would you feel if the assistant in your car noticed you were feeling angry and, instead of letting you start the engine, asked if you wanted to have a quick psychotherapy session with a chatbot to help you deal with the anger? Earlier this year, I tested Alexa vs Siri in my car, with mixed results. You can see my 360 video below. 

In the third workspace, on sustainable mobility, we had to choose one of three cities (Beijing, Mumbai and San Francisco) and come up with new ideas to address challenges in sustainable mobility given each city's unique traits. This session was truly mind expanding, as I joked about the increasing levels of congestion in Mumbai, and how maybe they need flying cars. It turned out that one of the attendees sitting next to me was working on urban vehicles that can fly! None of the discussions and pitches in the workspaces were full of easy answers, but what they did remind me of was the power of bringing together people who don't normally work together to come up with fresh ideas for very complex challenges. Furthermore, the new solutions we generate can't just be for the privileged few; we have to think globally from the beginning. It's our shared responsibility to find a way of including everyone on this new journey. Maybe instead of owning, leasing or even renting a car the traditional way, we'd like to be able to rent a car by the hour using an app on our phones? In fact, Audi has trialled on-demand car hire in San Francisco, has just launched it in China, and plans to launch in other countries too, perhaps even providing you with a chauffeur. Only time will tell if they succeed, as others have already tried and not been that successful. 

Taking part in this summit was very useful for me; I left feeling challenged, inspired and motivated. There was an energy during the event that I rarely see in Europe, a feeling I only tend to get when I'm out in California, where people attending events are so open to new ideas and fresh thinking that you walk away feeling you truly can build a better tomorrow. My brain has been buzzing with new ideas since then. 

For example, whether we believe that consumers will have access to autonomous vehicles in 5 years or 50 years, we can see more funds being invested in this area. I watched a documentary in which Sebastian Thrun, who lost his best friend in a car accident at the age of 18 and helped build Google's driverless car, said he believes a world with driverless vehicles will save the lives of the 1 million people who currently die on the roads every year around the globe. Think about that for a moment. If that vision is realised this century, even partially, what does that mean for the resources in healthcare that are currently spent on dealing with road traffic accidents? He has now turned his attention to flying cars.

Thinking about chronic disease for a second, you'd probably laugh at the thought of a car that could monitor your health during your commute to the office.

Audi outlined a concept called Audi Fit Driver in 2016: "The Audi Fit Driver project focuses on the well-being and health of the driver. A wearable (fitness wristband or smartwatch) monitors important vital parameters such as heart rate and skin temperature. Vehicle sensors supplement this data with information on driving style, breathing rate and relevant environmental data such as weather or traffic conditions. The current state of the driver, such as elevated stress or fatigue, is deduced from the collected data. As a result, various vehicle systems act to relax, vitalize, or even protect the driver."

Another car manufacturer, Toyota, has filed a patent suggesting a future where the car would know your health and fitness goals and offer suggestions to help you meet them, such as parking further away from your planned destination so you can get in some more steps towards your daily goal. My friend Bart Collet has penned his thoughts about "healthcartech", which makes for a useful read. One year ago, I also made a 360 video with Dr Keith Grimes discussing whether cars in the future will track our health. 

Consider how employers may be interested in tracking the health of employees who drive as part of their job. However, it's not plain sailing. A European Union advisory panel recently said that "Employers should be banned from issuing workers with wearable fitness monitors, such as Fitbit, or other health tracking devices, even with the employees’ permission." So at least in Europe, who knows if we'll ever be allowed to have cars that can monitor our health? On top of that, in this bold new era, in order for these new connected services to really provide value, all these different organisations collecting data will have to find a way to share data. Does blockchain technology have a role to play in mobility? I recently came across Dovu which talks about the world's first mobility cryptocurrency, "Imagine seamless payment across mobility services: one secure global token for riding a bus or train, renting a bike or car or even enabling you to share your own vehicle or vehicle data." Sounds like an interesting idea. 

Thinking about some of the driver assist technologies available today, what do they mean for mobility? Could they help older people remain at the helm of a car even if their reflexes have slowed down? In Japan, the National Police Agency "calls on the government to create a new driver’s license that limits seniors to vehicles with advanced safety systems that can automatically brake or mitigate unintended accelerations." Apparently, one of the most common accidents in Japan occurs when drivers mistake the accelerator for the brake pedal. Today, some new cars come with Autonomous Emergency Braking (AEB), where the car's sensors detect if you are about to hit another vehicle or a pedestrian and perform an emergency stop if the driver is not braking quickly enough. So by relinquishing more control to the car, we can have safer roads. My own car has AEB, and on one occasion when I faced multiple hazards on the road ahead, it actually took over the braking, as the sensors thought I wasn't going to stop in time. It was a very strange feeling. Many seem to react with extreme fear when hearing about these new driver assist technologies, yet if you currently drive a car with an automatic transmission or airbags, you are perfectly happy to let the car decide when to change gears or when to inflate the airbag. So on the spectrum of control, we already let our cars make decisions for us. As they get smarter, they will make more and more decisions for us. If someone over 65 doesn't feel like driving, even if the car can step in, then maybe autonomous shuttles like the ones being tested in rural areas of Japan are one solution to increasing the mobility of an ageing community.

When we pause to think of how big a problem isolation and loneliness are in our communities, could these new products and services go beyond being simply a mobility solution and actually reduce loneliness? That could have far reaching implications for our health. What if new technology could help those with limited mobility cross the road safely at traffic lights? It's fascinating to read the latest guidance consultation from the UK's National Institute for Health and Care Excellence on the topic of Physical Activity and the Environment. Amongst many items, it suggests modifying traffic lights so those with limited mobility can cross the road safely. Simply extending the default red-light duration by a few extra seconds to make this possible might just cause more traffic jams. So in a more connected future, imagine traffic lights with AI that can detect who is waiting to cross the road, whether they will need an extended crossing time, and adjust the duration of the red light for vehicles accordingly. This was one of the ideas I brought up at the conference during the autonomous vehicle workspace.
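To make the idea a little more concrete, here is a minimal sketch of the logic such a traffic light might use. Everything here is hypothetical (the category names, the timings, the function itself); a real deployment would rely on sensor fusion and certified safety systems, not a simple lookup table.

```python
# A toy model of an adaptive pedestrian crossing: the light extends
# the pedestrian phase only when someone detected at the kerb needs
# extra time, instead of lengthening every cycle by default.

BASE_CROSSING_SECONDS = 8      # default pedestrian phase (hypothetical)
EXTENSION_SECONDS = {          # extra seconds by detected need (hypothetical)
    "none": 0,
    "wheelchair": 5,
    "limited_mobility": 7,
}

def crossing_time(waiting_pedestrians):
    """Return the pedestrian-phase duration in seconds for this cycle.

    `waiting_pedestrians` is a list of detected needs, e.g.
    ["none", "limited_mobility"]. Only the largest single extension
    is applied, so a busy crossing never stacks extensions, and
    able-bodied crossers never shorten the phase for others.
    """
    if not waiting_pedestrians:
        return 0  # nobody waiting: skip the pedestrian phase entirely
    extra = max(EXTENSION_SECONDS.get(p, 0) for p in waiting_pedestrians)
    return BASE_CROSSING_SECONDS + extra

print(crossing_time(["none", "limited_mobility"]))  # 15
print(crossing_time([]))                            # 0
```

The point of the sketch is the design choice: traffic only waits longer when someone actually needs the time, which is what distinguishes this from simply extending every red light.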

If more people in cities use ride hailing services like Uber and fewer people own a car, does this mean our streets will have fewer parked cars, allowing residents to reclaim the streets for themselves? If this shift continues, in the long term it might lead to city dwellers of all ages becoming more physically active. This could be good news for improving our health and reducing demand on healthcare systems. One thing is clear to me: these new mobility solutions will require many different groups across society to collaborate. It can't just be a few car manufacturers rolling out technology without involving other stakeholders, if these solutions are to be available to all and work in an integrated manner. The consumer will be king though, according to views aired at New Mobility World in Germany this week. "With his smartphone, he can pick the optimal way to get from A to B," said Randolph Wörl from moovel. "Does optimal mean the shortest way, the cheapest way or the most comfortable way? It's the user's choice." It's early days, but we already have a part of the NHS in the UK looking to use Uber to transfer patients to and from hospital. 

Urban mobility isn't just about cars, it's also about bicycles. I use the Santander bike sharing scheme in London on a daily basis, which I find to be an extremely valuable service. I don't want to own a bicycle since in my small home, I don't really have room to store it. Additionally, I don't want the hassle of maintaining a bike. Using this bike sharing scheme has helped me to lose 15kg this summer, which I feel has improved my own health and wellbeing. If we really want to think about health, rather than just about healthcare, it's critical we think beyond those traditional institutions that we associate with health, and include others. Incidentally, Chinese bike sharing firms are now entering the London market.

In the UK, some have called for cycling to be 'prescribed' to the population, helping people to stay healthier and, again, to reduce demand on the healthcare system. Which is why I find the news that Ford of Germany is getting involved with a new bike sharing scheme so encouraging. Through the app, people will be able to use Ford's car sharing and bike sharing schemes: an example of Mobility as a Service, and of another car manufacturer seeking a path to staying relevant during this century. Nissan of Japan are excitedly talking about Intelligent Mobility for their new Nissan Leaf, with Intelligent Driving where "Soon, you can have a car that takes the stress out of driving and leaves only the joy. It can pick you up, navigate heavy traffic, and find parking all on its own." A Chinese electric car startup, Future Mobility Corp, which has launched its Byton brand, has said its "models are a combination of three things: a smart internet communicator, a spacious luxury living room and a fully electric car." Interestingly, they also want to "turn driving into living." I wonder if in 10-15 years' time we'll spend more time in cars because the experience will be a more connected one? Where will meetings take place in future? Ever used Skype for Business from work or home to join an online meeting? BMW and Microsoft are working to bring that capability to some of BMW's vehicles. Samsung have announced they are setting up a £300m investment fund focusing on connected technologies for cars. It appears that considerable sums of money are being invested in this new arena of connected cars that fit into our digital lifestyles. Are the right people spending the right money on the right things? 

I feel that those developing products which involve AI are often so wrapped up in their vision that it comes across as if they don't care what the social impact of their ideas will be. In an article about Vivek Wadhwa's book, The Driver in the Driverless Car, the journalist points out that the book discusses the possibility of up to 5m American jobs in trucking, delivery driving, taxis and related activities being lost, but offers no suggestions for handling the social implications of this shift. Toby Walsh, a professor of AI, believes that Elon Musk of Tesla is scaremongering when tweeting about AI starting World War 3. He says, "So, Elon, stop worrying about World War III and start worrying about what Tesla’s autonomous cars will do to the livelihood of taxi drivers." Personally, I think we need more balance and perspective in this conversation. The last thing we need is a widening of social inequalities. How fascinating to read that India is considering banning self-driving cars in order to protect jobs. 

This summit has really made me think hard about mobility and health. Perhaps car manufacturers will end up being part of solutions that bring significant improvements to our health in years to come? We have to keep an open mind about what might be possible. Maybe it's because I'm fit and reasonably healthy, live in a well connected city like London and can afford a car of my own, that I never really thought about the impact of impaired mobility on our health. In the Transport Research Laboratory's latest Quarterly Research Review, I noticed a focus on mental health and ageing drivers, and it's clear they want transport planners to make health and wellbeing a higher priority, with the statement, "With transport evolving, it’s vital that we don’t lose sight of the implications it can have on the health of the population, and strive to create a network that encourages healthy mobility.” At minimum, mobility might just mean being able to walk somewhere in your locality, but what if you don't feel safe walking in your neighbourhood due to high rates of crime? Or what if you can't walk because there is literally nowhere to walk? I remember visiting Atlanta in the USA several years ago and taking a walk from a friend's house in the suburbs. A few minutes into my walk, the sidewalk just finished, with no warning. The only way I could have walked further would have been to walk inside a car dealership. Ironic. The push towards electrification of vehicles is interesting to witness, with Scotland wanting to phase out sales of new petrol and diesel cars by 2032. India is even more ambitious, hoping to move to electric vehicles by 2030. The pollution in London is so high that I avoid walking down certain roads because I don't want to breathe in the fumes. So a future with zero emission electric cars gives me hope. 

It's obvious that we can't just think about health as building bigger hospitals and hiring more doctors. If we really want societies where we can prevent more people from living with chronic diseases like heart disease and diabetes, we have to design with health in mind from the beginning. There is an experiment in the UK looking to build 10 Healthy New Towns. Something to keep an eye on.


The technology that will underpin this new era of connectivity seems to be the easy part. The hard part is getting people, policy and process to connect and move at the same pace as the technology, or at least not lag too far behind. During one of my recent sunrise bike rides in London, I came across a phone box. I remember using them as a teenager, before the introduction of mobile phones. At the time, I never imagined a future where we didn't have to locate a box on the street, walk inside, insert coins and press buttons in order to make a call whilst 'mobile', and yet in such a short space of time everything has changed in terms of how we communicate and connect. These phone boxes scattered around London remind me that change is constant, and that even though many of us struggle to imagine a future radically different from today, there is every chance that healthy mobility in 20 years' time will look very different from today.

Who should be driving our quest for healthy mobility? Do we rest our hopes on car manufacturers collaborating with technology companies? As cities grow, how do we want our cities to be shaped?

What's your definition of The Mobility Quotient?


Being Human

This is the most difficult blog post I’ve ever had to write. Almost 3 months ago, my sister passed away unexpectedly. It’s too painful to talk about the details. We were extremely close and because of that the loss is even harder to cope with. 

The story I want to tell you today is about what’s happened since that day and the impact it’s had on how I view the world. In my work, I spend considerable amounts of time with all sorts of technology, trying to understand what all these advances mean for our health. Looking back, from the start of this year, I’d been feeling increasingly concerned by the growing chorus of voices telling us that technology is the answer for every problem, when it comes to our health. Many of us have been conditioned to believe them. The narrative has been so intoxicating for some.

Ever since this tragedy, it’s not an app, or a sensor, or data that I have turned to. I have been craving authentic human connections. As I have tried to make sense of life and death, I have wanted to relate to family and friends by making eye contact, giving and receiving hugs, and simply being present in the same room as them. The ‘care robot’ that arrived from China this year as part of my research into whether robots can keep us company remains switched off in its box. Amazon’s Echo, the smart assistant with a voice interface that I’d been testing a lot, also sits unused in my home. I used it most frequently to turn the lights on and off, but now I prefer walking over to the light switch and the tactile sensation of pressing it with my finger. One day last week, I was feeling sad and didn’t feel like leaving the house, so I decided to try putting on my Virtual Reality (VR) headset to join a virtual social space. I joined a computer-generated room, a sunny back yard where a BBQ was under way; I could see the other guests’ avatars, and I chatted to them for about 15 minutes. After I took off the headset, I felt worse.

There have also been times I have craved solitude, and walking in the park at sunrise on a daily basis has been very therapeutic. 

Increasingly, some want machines to become human, and humans to become machines. My loss has caused me to question these viewpoints, in particular the bizarre notion that we are simply hardware and software that can be reconfigured to cure death. Recently, I heard one entrepreneur claim that with digital technology we’ll be able to get rid of mental illness in a few years. Others I’ve met believe we are holding back the march of progress by wanting to retain the human touch in healthcare. Humans in healthcare are an expensive resource, make mistakes and resist change. So, is the answer just to bypass them? Have we truly taken the time to connect with them and understand their hopes and dreams? The stories, promises and visions being shared in Digital Health are often just fantasy, with some storytellers (also known as rock stars) heavily influenced by Silicon Valley’s view of the future. We have all been influenced on some level. Hope is useful, hype is not. 

We are conditioned to hero worship entrepreneurs and to believe that the future the technology titans are creating, is the best possible future for all of us. Grand challenges and moonshots compete for our attention and yet far too often we ignore the ordinary, mundane and boring challenges right here in front of us. 

I’ve witnessed the discomfort many have had when offering me their condolences. I had no idea so many of us have grown up trained not to talk about death and healthy ways of coping with grief. When it comes to Digital Health, I’ve only ever come across one conference where death and other seldom discussed topics were on the agenda, Health 2.0 with their “unmentionables” panel. I’ve never really reflected upon that until now.

Some of us turn to the healthcare system when we are bereaved; I chose not to. Health isn’t something that can only be improved within the four walls of a hospital. I don’t see bereavement as a medical problem. I’m not sure what a medical doctor can do in a 10 minute consultation, nor have I paid much attention to the pathways and processes that scientists ascribe to the journey of grief. I simply do my best to respond to the need in front of me and to honour my feelings, no matter how painful those feelings are. I know I don’t want to end up like Prince Harry, who recently admitted he had bottled up his grief for 20 years after the death of his mother, Princess Diana, and that suppressing the grief took him to the point of a breakdown. The sheer maelstrom of emotions I’ve experienced these last few months makes me wonder even more: why does society view mental health as a lower priority than physical health? As I’ve been grieving, there have been moments when I felt lonely. I heard about an organisation that wants to reframe loneliness as a medical condition. Is this the pinnacle of human progress, that we need medical doctors (who are an expensive resource) to treat loneliness? What does it say about our ability to show compassion for each other in our daily lives?

Being vulnerable, especially in front of others, is wrongly associated with weakness. Many organisations still struggle to foster a culture where people can truly speak from the heart with courage. That makes me sad, especially at this point. Life is so short yet we are frequently afraid to have candid conversations, not just with others but with ourselves. We don’t need to live our lives paralysed by fear. What changes would we see in the health of our nation if we dared to have authentic conversations? Are we equipped to ask the right questions? 

As I transition back to the world of work, I’m very much reminded of what’s important and who is important. The fragility of life is unnerving. I’m so conscious of my own mortality, and so petrified of death, it’s prompted me to make choices about how I live, work and play. One of the most supportive things someone has said to me after my loss was “Be kind to yourself.” Compassion for one’s self is hard. Given that technology is inevitably going to play a larger role in our health, how do we have more compassionate care? I’m horrified when doctors & nurses tell me their medical training took all the compassion out of them or when young doctors tell me how they are bullied by more senior doctors. Is this really the best we can do? 

I haven’t looked at the news for a few months, and immersing myself in Digital Health news again makes me pause. The chatter about Artificial Intelligence (AI) sits at either end of the spectrum, almost entirely dystopian or almost entirely utopian, with few offering balanced perspectives. These machines will either end up putting us out of work and ruling our lives, or they will be our faithful servants, eliminating every problem and leading us to perfect healthcare. For example, I have a new toothbrush that says it uses AI, and it’s now telling me to go to bed earlier because it noticed I brush my teeth late at night. My car, a Toyota Prius, which is primarily designed for fuel efficiency, constantly scores my acceleration, braking and cruising as I’m driving. Where should my attention rest as I drive: on the road ahead, or on the dashboard, anxious to achieve the highest score possible? Is this where our destiny lies? Is it wise to blindly embark upon a quest for optimum health powered by sensors, data and algorithms nudging us all day and all night until we achieve and maintain the perfect health score? 

As more of healthcare moves online, reducing costs and improving efficiency, who wins and who loses? Recently, my father (who is in his 80s) called the council as he needed to pay a bill. Previously, he was able to pay with his debit card over the phone. Now they told him it’s all changed, and he has to do it online. When he asked what happens if someone isn’t online, he was told to visit the library, where someone can do it online with you. He was rather angry at this change. I can now see his perspective, and why this has made him angry; I suspect he’s not the only one. He is online, but there are moments when he wants to interact with human beings, not machines. In stores, I always used to use the self service checkouts when paying for my goods, because it was faster. Ever since my loss, I’ve chosen to use the checkouts with human operators, even if it is slower. Earlier this year, my mother (in her 70s) got a form to apply for online access to her medical records. She still hasn’t filled it in; she personally doesn’t see the point. In Digital Health conversations, statements are sometimes made that are deemed to be universal truths: that every patient wants access to their records, or that every patient wants to analyse their own health data. I believe it’s excellent that patients have the chance of access, but let’s not assume they all want it. 

Diversity & Inclusion is still little more than a buzzword for many organisations. When it comes to patients and their advocates, we still have work to do. I admire the amazing work that patients have done to get us this far, but when I go to conferences in Europe and North America, the patients on stage are often drawn from a narrow section of society. That’s assuming the organisers actually invited patients to speak on stage, as most still curate agendas which put the interests of sponsors and partners above the interests of patients and their families. We’re not going to do the right thing if we only listen to the loudest voices. How do we create the space needed so that even the quietest voices can be heard? We probably don’t even remember what those voices sound like, as we’ve been too busy listening to the sound of our own voice, or the voices of those that constantly agree with us. 

When it comes to the future, I still believe emerging technologies have a vital role to play in our health, but we have to be mindful in how we design, build and deploy these tools. It’s critical we think for ourselves, to remember what and who are important to us. I remember that when eating meals with my sister, I’d pick up my phone after each new notification of a retweet or a new email. I can’t get those moments back now, but I aim to be present when having conversations with people now, to maintain eye contact and to truly listen, not just with my ears, and my mind, but also with my heart. If life is simply a series of moments, let’s make each moment matter. We jump at the chance of changing the world, but it takes far more courage to change ourselves. The power of human connection, compassion and conversation to help me heal during my grief has been a wake up call for me. Together, let’s do our best to preserve, cherish and honour the unique abilities that we as humans bring to humanity.

Thank You for listening to my story.

Developing a wearable biosensor: A doctor's story

For this post, I caught up with Dr Brennan Spiegel, to hear in more detail about his journey to get a wearable biosensor from concept to clinic. In the interview, we discuss how an idea for a sensor was born out of an unmet clinical need, how the sensor was prototyped, tested, and subjected to clinical research, and how it was finally FDA approved in December of 2015. Throughout, we learn about the challenges of developing a wearable biosensor, the importance of working with patients, doctors, and nurses to get it right, and how to conduct rigorous research to justify regulatory approval of a device. The interview ends with seven suggestions from Dr. Spiegel for other inventors seeking to develop wearable biosensors.

1. What is AbStats?
AbStats is a wearable sensor that non-invasively measures your intestinal activity – it's like a gut speedometer. The sensor is disposable, about the size of a large coin, sticks on the external abdominal wall, and has a small microphone inside that dutifully listens to your bowel churn away as it digests food. A specialized computer analyzes the results and presents a value we call the "intestinal rate," which is like a new vital sign for the gut.  We've all heard of the heart rate or respiratory rate; AbStats measures the intestinal rate.  The sensor tells the patient and doctor how much the intestines are moving, measured in "events per minute."  If the intestinal rate is very high, like 30 or 40 events per minute, then it means the gut is revved up and active.  If it's very low, like 1 or 2 per minute, then it means the gut is asleep or, possibly, even dysfunctional depending on the clinical situation.

2. What existing problem(s) does it solve?
AbStats was specifically designed, from the start, to solve a real problem we face in the clinical trenches.  

We focused first on patients undergoing surgery.  Almost everyone has at least temporary bowel paralysis after an operation.  When your body undergoes an operation, whether on your intestines or on your toe (or anywhere in-between), it's under a great deal of stress and tends to shut down non-vital systems.  The gastrointestinal (GI) tract is one of those systems – it can take a hit and shut down for a while.  Normally, the GI system wakes up quickly.  But in some cases the GI tract is slow to come back online.  This is a condition we call postoperative ileus, or POI, which occurs in up to 25% of patients undergoing abdominal surgeries.  

The issue is that it's hard to know when to confidently feed patients after surgery.  Surgeons are under great pressure by administrators to feed their patients quickly and discharge them as soon as possible. But feeding too soon can cause serious problems, from nausea and vomiting, to aspiration, pneumonia, or even death.  On the other hand, feeding too late can lead to infections, prolong length of stay, and cost money.  As a whole, POI costs the US healthcare system around $1.5 billion because of uncertainties about whether and when to feed patients.  It's a very practical and unglamorous problem – exactly the type of issue doctors, nurses, and patients care about. 

Now, you might ask how we currently decide when to feed patients.  Here's the state of the art: we ask patients if they've farted or not. We literally ask them, practically all day long, "have you passed gas yet?"  No joke.  Or, we'll look at their belly and determine if it looks overly distended.  We might use our stethoscope to listen to the bowels for 15 seconds at a time, and then make a call about whether to feed.  It's nonsense.  Data reveals that we do a bad job of determining whether someone is fit to eat.  We blow it in both directions – sometimes we overcall, and sometimes we undercall.  We figured, in this fantastical age of digital health, there had to be a better way than asking people about their flatus!  So we invented AbStats. 

3. What prompted you to embark upon this journey?
One day, about 4 years ago, I was watching Eric Topol give a TED talk about wearable biosensors and the "future of medicine." As I watched the video, I noticed that virtually every part of the human body had a corresponding wearable, from the heart, to the lungs, to the brain, and so forth.  But, sitting there in the middle was this entire body cavity – the abdominal cavity – that had absolutely zero sensor solutions.  As a gastroenterologist, I thought this must be an oversight.  We have all manner of medieval devices to get inside the GI system, and I'm skilled at inserting those things to investigate GI problems.  But typical procedures like colonoscopies, enteroscopies, capsule endoscopies, and motility catheters are all invasive, expensive, and carry risks.  There had to be a way to non-invasively monitor the digestive engine.  So, I thought, what do we have available to us as doctors?  That's easy: bowel sounds.  We listen to bowel sounds all the time with a stethoscope, but it's highly inefficient and inaccurate.  It makes no sense to sit there with a stethoscope for 20 minutes at a time, much less even 1 whole minute.  But the GI system is not like the heart, where we can make accurate diagnoses in short order, over seconds of listening.  The GI system is slow, plodding, and somewhat erratic.  We needed something that can stand guard, vigilantly, and literally detect signal in the noise.  That's when AbStats was born.  It was an idea in my head, and then, about 4 years later, became an FDA-approved device.  

4. What was the journey like from initial idea to FDA approval? 
When I first invented AbStats, I wasn't thinking about FDA approval.  I knew virtually nothing about FDA approval of biomedical devices.  I just wanted the thing built, as fast as possible, and rigorously tested in patients.  As a research scientist and professor of medicine and public health, this is all I know.  I need to see proof – evidence – that something works.  AbStats would be no different. 

I was on staff at UCLA Medical Center when I first invented the idea for AbStats. I told our office of intellectual property about the idea, and they suggested I speak with Professor William Kaiser at the UCLA Wireless Health Institute.  So, I gave him a call.  

Dr. Kaiser got his start working for General Motors, where he contributed to inventing the automotive cruise control system.  Later, he went to work for the Jet Propulsion Laboratory, where he worked on the Mars Rover project.  Then, he came to UCLA and founded the Wireless Health Institute.  He is fond of saying that of all the things he's done in his career, from automotive research to spaceships, he believes the largest impact on humanity he's had is in the realm of digital health.  He is a real optimist.  

So, when I told Professor Kaiser about my idea for AbStats, he immediately got it.  He got to work on building the sensor and developed important innovations to enhance the system.  For example, he developed a clever way to ensure the device is attached to the body and not pulled off.  This is really important, because if AbStats reports that a patient's intestinal rate is zero, then it might mean severe POI, or it might mean the device fell off.  AbStats can tell the difference thanks to Professor Kaiser's engineering ingenuity.  

Once we developed a minimal viable product, we worked like crazy to test it in the clinics, write papers, and publish our work.  At the same time, UCLA licensed the IP to a startup company, called GI Logic, that worked with our teams to submit the FDA documentation.  Professor Kaiser's team did the heavy lifting on the engineering and safety side, and we focused on the clinical side.  It was a great example of stem-to-stern teamwork, ranging from in-house engineering expertise, to clinical expertise, to regulatory expertise.  It all came together very fast.  

Importantly, it was my sister who came up with the name "AbStats."  I always remember to credit her with that part of the journey!

5. What role did patients play in the design of AbStats? 
Patients were critical to our design process.  We went through a series of form factors before settling on the current version of AbStats.  At first, the system resembled a belt with embedded sensors. Patients told us they hated the belt.  They explained that, after undergoing an abdominal surgery, the last thing they wanted was a belt on their abdomen.  We tweaked and tweaked, and eventually developed two small sensors that adhere to the abdomen with Tegaderm.  Even those are not perfect – it hurts to pull Tegaderm off of skin, for example.  And the sensors are high profile, so they are not entirely unobtrusive.  We're working on that, too.  But patient feedback was key and remains vital to our current and future success with AbStats.  

6. How did patients & physicians respond to AbStats during research & development?
It was gratifying that virtually every surgeon, nurse, and patient we spoke with about AbStats immediately "got it."  This is not a hard concept to sell.  Your bowels make sound.  The sound matters. And AbStats can listen to those sounds, make sense of them, and provide feedback to doctors and nurses to drive decisions.  The "so what" question was answered.  If your belly isn't moving, then we shouldn't feed you.  If it's moving a little, we should feed a little.  And if it's moving a lot, then we should feed a lot.  The surgeons called this the AbStats "stoplight", as in "red light," "yellow light," and "green light."  Each is mapped to a very specific action plan.  It's not complicated.  
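The "stoplight" idea maps the intestinal rate onto a simple three-way feeding decision. As a minimal sketch only: the function below uses placeholder thresholds loosely inferred from the rates quoted earlier in the interview (1–2 events/minute as a quiet gut, 30–40 as a very active one); the actual validated cut-offs used by AbStats are not stated in this post.

```python
def feeding_guidance(intestinal_rate):
    """Map an intestinal rate (events per minute) to the 'stoplight'
    feeding guidance described in the interview.

    NOTE: the thresholds below are illustrative placeholders, not the
    clinically validated cut-offs of the real AbStats system.
    """
    if intestinal_rate < 2:
        return "red"      # gut asleep or possibly dysfunctional: hold feeding
    elif intestinal_rate < 10:
        return "yellow"   # gut moving a little: feed a little
    else:
        return "green"    # gut revved up and active: feed normally
```

Each colour is then mapped to a specific action plan by the surgical team, which is the point of the design: the sensor's output is actionable, not just a number.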

We were especially surprised by the engagement of nurses in this process.  Nurses are the heart and soul of patient care, especially in surgery.  Our nursing colleagues told us that feeding decisions come up in nearly every discussion with post-operative patients.  They said they have virtually no objective parameter to follow, and saw AbStats as a way to engage patients in ways they previously could not. This was surprising.  For example, the nurses pointed out that many patients are on narcotics for pain control, and that can slow their bowels even further. By having an objective parameter, the nurses can now use AbStats to make conversations more objective and actionable.  For example, they can show that every time a patient uses a dose of narcotics, it paralyzes the bowels further.  Knowing that, some patients might be willing to reduce their medications, if only by a little, to help expedite feeding decisions.  AbStats enables that conversation.  It's really gratifying to see how a device can alter the very process of care, to the point of impacting the nature of conversations between patients and their providers.  Almost uniformly, the patients in our trials felt the sensors provided value, and so did their nurses. 

7. Would you approach the problem differently if you had to do this again?
Not really.  Considering that in 4 years we invented a sensor, iteratively improved its form factor, conducted and published two peer-reviewed clinical trials, submitted an FDA application, and received clearance for the device, it's hard to second guess the approach.

8. What other problems would you like to solve with the use of wearable technology in the future?
AbStats has many other applications beyond POI.  We are currently studying its use in an expanding array of applications, including acute pancreatitis, bowel obstructions, irritable bowel syndrome, inflammatory bowel disease, obesity management, and so on.  There are more opportunities than there are hours in the day, so we're trying to remain strategic about how best to proceed.  Thankfully, we are well aligned with the startup, GI Logic, to move things forward.  I am also fortunate to be at Cedars-Sinai Medical Center, my home institution since moving from UCLA, where most of the clinical research on AbStats was conducted.  Cedars-Sinai has been extremely supportive of AbStats and our work in digital health.  We couldn't do our research without our medical center, patients, administrative support, and technology transfer office. I am immensely grateful to Cedars-Sinai.  

More generally, wearable technology and digital health still have a long way to go, in my opinion.  I've written about that before, here. AbStats is an example of a now FDA-approved sensor supported by peer-reviewed research.  I'd like to see a similar focus on other wearables.  There are good examples, like AliveCor for heart arrhythmias, and now Proteus, which is an "ingestible."  But, for many applications in healthcare, there is still too little data about how to use wearables.  

I believe that digital health, in general, is more of a social and behavioral science than a computer or engineering science.  Truth be told, most of the sensors are now trivial.  Our sensor is a small microphone in a plastic cap.  The real "secret sauce" is in the software, how the results are generated and visualized, how they are formed into predictive algorithms, and, most importantly, how those algorithms change behavior and decision making.  Finally, there is the issue of cost and value of care. There are so many hurdles to cross, one wonders whether many sensors will run the gauntlet. AbStats, for example, may be FDA approved, but that doesn't mean we're ready to save money using the device.  We need to prove that.  We need data.  FDA approval is a regulatory hurdle, but it doesn't guarantee a device will save lives, reduce costs, reduce disability, or anything close to it.  That only comes from hard-fought science.  

9. Are clinically proven medical applications of wearable technology likely to grow in years to come?
Almost certainly, although my caveats, above, indicate this may be slower and more deliberate than some are suggesting in the digital health echo chambers.

10. For those wishing to follow in your footsteps, what would your words of wisdom be?
First, start by addressing an unmet need. Clinical need should drive technology development, not the other way around.  

Second, if you're working on patient-facing devices, then I believe you should really have first hand experience with literally putting those devices on patients.  If you're not a healthcare provider, then you should at least visit the clinical trenches and watch what happens when sensors go on patients. What happens next can be unexpected and undermine your presuppositions, as I've written about here and here.  I do not believe one can truly be a wearable expert without having literally worked with wearables.  That's like a pharmacist who has never filled a prescription, or, a cartographer who has never drawn a map.  Digital health is, by definition, about healthcare. It's about patients, about their illness and disease, and about figuring out how to insert technology into a complex workflow.  The clinical trenches are messy, gray, indistinct, dynamic, and emotional — injecting technology into that environment is exceptionally difficult and requires first-hand experience.  Digital health is a hands-on science, so look to the clinical trenches to find the unmet needs, and start working on it, step-by-step, in direct partnership with patients and their providers.

Third, make sure your device provides actionable data.  Data should guide specific clinical decisions based on valid and reliable sensor indicators.  We're trying to do that with AbStats. 

Fourth, make sure your device provides timely data. Data should be delivered at the right time, right place, and with the right visualizations.  We spent days just trying to figure out how best to visualize the data from AbStats.  And I'm still not sure we've got it right.  This stuff takes so much work. 

Fifth, if you're making a device, make sure it's easy to use and has a favorable form factor.  It should be simple to hook up the device, it should be unobtrusive, non-invasive, with zero infection risk, comfortable, safe, and preferably disposable.  We believe that AbStats meets those standards, although there is always more work to be done.

Sixth, the wearable must be evidence-based.  A valuable sensor should be able to replace or supplement gold standard metrics, when relevant, and be supported by well designed, properly powered clinical trials.  

Finally, and most importantly, the sensor should provide health economic value to health systems.  It should be cost-effective compared to usual care.  That is the tallest yet most important hurdle to cross.  We're working on that now with AbStats.  We think it can save money by shaving time off the hospital stay and reducing readmissions.  But we need to prove it.  

[Disclosure: I have no commercial ties to any of the individuals or organizations mentioned in this post]


Data or it didn't happen

Today, there is incredible excitement, enthusiasm and euphoria about technology trends such as Wearables, Big Data and the Internet of Things. Listening to some speakers at conferences, it often sounds like the convergence of these technologies promises to solve every problem that humanity faces. Seemingly, all we need to do is let these new ideas, products and services emerge into society, and it will be happy ever after. Just like those fairy tales we read to our children. Except, life isn't a fairy tale, neither is it always fair and equal. In this post, I examine how these technologies are increasingly of interest to employers and insurers when it comes to determining risk, and how this may impact our future. 

Let's take the job interview. There may be some tests the candidate undertakes, but a large part of the interview is the human interaction, and what the interviewer(s) and interviewee think of each other. Someone may perform well during the interview, but turn out to underperform when doing the actual job. Naturally, that's a risk that every employer wishes to minimise. What if you could minimise risk with wearables during the recruitment process? That's the message of a recent post on a UK recruitment website: "Recruiters can provide candidates with wearable devices and undertake mock interviews or competency tests. The data from the device can then be analysed to reveal how the candidate copes under pressure." I imagine there would be legal issues if an employer terminated the recruitment process simply on the basis of data collected from a wearable device, but it may augment the existing testing that takes place. Imagine the job is a management role requiring frequent resolution of conflicts, and your verbal answers convince the interviewer you'd cope with that level of stress. What if the biometric data captured from the wearable sensor during your interview showed that you wouldn't be able to cope with that level of stress? We might immediately think of this as intrusive and discriminatory, but would this insight actually be a good thing for both parties? I expect all of us at one point have worked alongside colleagues who couldn't handle pressure, and their reactions caused significant disruption in the workplace. Could this use of data from wearables and other sensors lead to healthier and happier workplaces? 

Could those recruiting for a job start even earlier? What if the job involved a large amount of walking, and there was a way to get access to the last 6 months of activity data from the activity tracker you've been wearing on your wrist every day? Is sharing your health & fitness data with your potential employer the way that some candidates will get an edge over other candidates that haven't collected that data? That assumes that you have a choice in whether you share or don't share, but what if every job application required that data by default? How would that make you feel? 

What if it's your first job in life, and your employer wants access to data about your performance during your many years of education? Education technology used at school which aims to help students may collect data that could tag you for life as giving up easily when faced with difficult tasks. The world isn't as equal as we'd like it to be, and left unchecked, these new technologies may worsen inequalities, as Cathy O’Neil highlights in a thought provoking post on student privacy, “The belief that data can solve problems that are our deepest problems, like inequality and access, is wrong. Whose kids have been exposed by their data is absolutely a question of class.”

There is increasing interest in developing wearables and other devices for babies, tracking aspects of a baby, mainly to provide additional reassurance to the parents. In theory, maybe it's a brilliant idea, with no apparent downsides? Laura June doesn't think so. She states, "The merger of the Internet of Things with baby gear — or the Internet of Babies — is not a positive development." Her argument against putting sensors into baby gear is that it would increase anxiety levels in parents, not reduce them. I'm already thinking about the data gathered from the moment the baby is born. Who would own and control it? The baby, the baby's parents, the government or the corporation that had made the software & hardware used to collect the data? Furthermore, what if the data from the baby could impact not just access to health insurance, but the pricing of the premium paid by the parents to cover the baby in their policy? Do you decide you don't want to buy these devices to monitor the health of your newborn baby in case one day that data might be used against your child when they are grown up? 

When we take out health and life insurance, we fill in a bunch of forms, supply the information needed for the insurer to determine risk, and then calculate a premium. Rick Huckstep points out, "The insurer is not able to reassess the changing risk profile over the term of the policy." So, you might be active, healthy and fit when you take out the policy, but what if your behaviour changes and your risk profile changes during the term of the policy? This is the opportunity that some are seeing for insurers to use data from wearables to determine how your risk profile changes during the term of the policy. Instead of a static premium at the outset, we have a world with dynamic and personalised premiums. Huckstep also writes, "Where premiums will adjust over the term of the policy to reflect a policyholder’s efforts to reduce the risk of ill-health or a chronic illness on an on-going basis. To do that requires a seismic shift in the approach to underwriting risk and represents one of the biggest areas for disruption in the insurance industry."

Already today, you can link your phone or wearable to Vitality UK health insurance, and accumulate points based upon your activity (e.g. 10 points if you walk 12,500+ steps in a day). Get enough points and exchange them for rewards such as a cinema ticket. A similar scheme has also launched in the USA with John Hancock for life insurance. 
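Schemes like this boil down to a simple mapping from daily activity data to points. As a rough sketch: only the 12,500+ steps = 10 points tier comes from the post above; the lower tiers here are invented purely for illustration and do not describe Vitality's actual tariff.

```python
def daily_activity_points(steps):
    """Award daily points from a step count, loosely modelled on the
    kind of scheme described above.

    NOTE: only the top tier (12,500+ steps -> 10 points) is taken from
    the post; the lower tiers are hypothetical, for illustration only.
    """
    if steps >= 12500:
        return 10
    elif steps >= 10000:   # hypothetical lower tier
        return 7
    elif steps >= 7000:    # hypothetical lower tier
        return 5
    return 0
```

The interesting policy questions in the rest of this post start exactly here: once a function like this exists, the insurer holds a daily behavioural score for every policyholder.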

Is Huckstep the only one thinking about a radically different future? Not at all. Neil Sprackling, Managing Director of Swiss Re (a reinsurer), has said, “This has the potential to be a mini revolution when it comes to the way we underwrite for life insurance risk." In fact, his colleague, Oliver Werneyer, has an even bolder vision with a post entitled, "No wearable device = no life insurance," in which he believes that in 5 to 10 years' time, you might not be able to buy life insurance if you don't have a wearable device collecting data about you and your behaviour. Direct Line, a UK insurer, believes that technology is going to transform insurance. Their Group Marketing Director, Mark Evans, has recently talked about technology allowing them to understand a customer's "inherent risk." Could we be penalised for deviating away from our normal healthy lifestyle because of life's unexpected demands? In this new world, if you were under chronic stress because you suddenly had to take time off work to look after a grandparent that was really sick, would less sleep and less exercise result in a higher premium next month on your health insurance? I'm not sure how these new business models would work in practice. 

When it comes to risk being calculated more accurately based upon this stream of data from your wearables, surely it's a win-win for everyone involved? The insurers can calculate risk more accurately, and you can benefit from a lower premium if you take steps to lower your risk. Then there are opportunities for entrepreneurs to create software & hardware that serves these capabilities. Would the traditional financial capitals such as London and New York be the centre of these innovations? 

One of the big challenges to overcome, above and beyond established data privacy concerns, is data accuracy. In my opinion, these consumer devices that measure your sleep & steps are not yet accurate and reliable enough to be used as a basis for determining your risk, and your insurance premium. Sensor technology will evolve, so maybe one day, there will be 'insurance grade' wearables that your insurer will be able to offer you. These would be certified to be accurate, reliable and secure enough to be used in the context of being linked to your insurance policy. In this potential future, another issue is whether people will choose to not take insurance because they don't want to wear a wearable, or they simply don't like the idea of their behaviour being tracked 24/7. Does that create a whole new class of uninsured people in society? Or would there be so much of a backlash from consumers (or even policy makers) to this idea of insurers accessing this 24/7 stream of data about your health, that this new business model never becomes a reality? If it did become a reality, would consumers switch to those insurers that could handle the data from their wearables? 

Interestingly, who would be an insurer of the future? Will it be the incumbents, or will it be hardware startups that build insurance businesses around connected devices? That's the plan of Beam Technologies, who developed a connected toothbrush (yes, it connects via Bluetooth with your smartphone and the app collects data about your brushing habits). Their dental insurance plan is rolling out in the USA shortly. Beam are considering adding incentives, such as rewards for brushing twice a day. Another experiment is NEST partnering with American Family Insurance. They supply you a 'smart' smoke detector for your home, which "shares data about whether the smoke detectors are on, working and if the home’s Wi-Fi is on." In exchange, you get 5% discount off your home insurance. 

Switching back to work, employers are increasingly interested in the data from employees' wearables. Why? Again, it's about a more accurate risk profile when it comes to the health & safety of employees. Take the tragic crash of the Germanwings flight this year, where it emerged that the pilot deliberately crashed the plane, killing 150 passengers. At a recent event in Australia, it was suggested this accident might have been avoided if the airline were able to monitor stress in the pilot using data from a wearable device.

What other accidents in the workplace might be avoided if employers could monitor the health, fitness & wellbeing of employees 24 hours a day? In the future, would a hospital send a surgeon home because the data from the surgeon's wearable showed they had not slept enough in the last 5 days? What about bus, taxi or truck drivers that could be monitored remotely for drowsiness by using wearables? Those are some of the use cases that Fujitsu are exploring in Japan with their research. Conversely, what if you had been put forward for promotion to a management role, and a year's worth of data from your wearable worn during work showed your employer that you got severely stressed in meetings where you had to manage conflict? Would your employer be justified in not promoting you, citing the data that suggested promoting you would increase your risk of a heart attack? Bosses may be interested in accessing the data from your wearables just to verify what you are telling them. Some employees phone in pretending to be sick, to get an extra day off. In the future, that may not be possible if your boss can check the data from your wearable to verify that you haven't taken many steps as you're stuck in bed at home. If you can't trust your employees to tell the truth, do you just modify the corporate wellness scheme with mandatory monitoring using wearable technology?

If it's possible for employers to understand the risk profile for each employee, would those under pressure to increase profits ever use the data from wearables to understand which employees are going to be 'expensive', and find a way to get them out of the company? That puts a whole new spin on 'People Analytics' and 'Optimising the workforce'. In a compelling post, Sarah O'Connor shares her experiment where she put on some wearables and shared the data with her boss. She was asked how it felt to share the data with her boss: "It felt very weird, and actually, I really didn't like the feeling at all. It just felt as if my job was suddenly leaking into every area of my life. Like on the Thursday night, a good friend and colleague had a 30th birthday party, and I went along. And it got to sort of 1 o'clock, and I realized I was panicking about my sleep monitor and what it was going to look like the next day." We already complain about checking work emails at home, and the boundaries between work and home blurring. Do you really want to be thinking about how skipping your regular session at the gym on a Monday night would look to your boss? Yet devices that betray us can actually be a good thing for society. Take the recent case of a woman in the USA who reported being sexually assaulted whilst she was asleep in her own home at night. The police used the data from the activity tracker she wore on her wrist to prove that at the time of the alleged attack, she was not asleep but awake and walking. On the other hand, one might also consider that those with malicious intent could hack into these devices and falsify the data to frame you for a crime you didn't commit. 

If these trends continue to converge, I see enterprising criminals rubbing their hands with glee: a whole new economy dedicated to falsifying the stream of data from your wearable/IoT device to your school, doctor, insurer or employer, or whoever is going to be making decisions based upon that stream of data. Imagine it's the year 2020: you are out partying every night, and you pay a hacker to make it appear that you slept 8 hours a night. So many organisations are blindly jumping into data driven systems with the mindset of 'In data, we trust' that few bother to think hard enough about the harsh realities of real world data. Another aspect is bias in the algorithms using this data about us. Hans de Zwart has written an illuminating post, "Demystifying the algorithm: Who designs our life?" De Zwart shows us the sheer amount of human effort behind Google Maps and the routes it generates for us: "The incredible amount of human effort that has gone into Google Maps, every design decision, is completely mystified by a sleek and clean interface that we assume to be neutral. When these internet services don’t deliver what we want from them, we usually blame ourselves or “the computer”. Very rarely do we blame the people who made the software." With all these potential new algorithms classifying our risk profile based upon data we generate 24/7, I wonder how much transparency, governance and accountability there will be.

There is much to think about and consider, and one of the key points is the critical need for consumers to be rights aware. An inspiring example of this is Nicole Wong, the former US Deputy CTO, who wrote a post explaining why she makes her kids read privacy policies. One sentence in particular stood out to me: "When I ask my kids about what data is collected and who can access it, I am asking them to think about what is valuable and what they are prepared to share or lose." Understanding the value exchange that takes place when you share your data with a provider is a critical step towards being able to make informed choices. That's assuming all of us have a choice in the sharing of our data. In the future, when we teach our children how to read and write English, should they be learning 'A' is for algorithm, rather than 'A' is for apple? I gave a talk in London recently on the future of wearables, and I included a slide on when wearables will take off (slide 21 below). I believe they will take off when we have to wear them, or when we can't access services without them. Surgeons and pilots are just two of the professions which may have to get used to being tracked 24/7.

Will the mantra of employers and insurers in the 21st century be, "Data or it didn't happen?"

If Big Data is set to become one of the greatest sources of power in the 21st century, that power needs a system of checks and balances. Just how much data are we prepared to give up in exchange for a job? Will insurance really be disrupted or will data privacy regulations prevent that from happening? Do we really want sensors on us, in our cars, our homes & our workplaces monitoring everything we do or don't do? Having data from cradle to grave on each of us is what medical researchers dream of, and may lead to giant leaps in medicine and global health. UNICEF's Wearables for Good challenge could solve everyday problems for those living in resource poor environments. Now, just because we might have the technology to classify risk on a real time basis, do we need to do that for everyone, all the time? Or should policy makers just ban this methodology before anyone can implement it? Is there a middle path? "Let's add in ethics to technology" argues Jennifer Barr, one of my friends who lives and works in Silicon Valley. Instead of just teaching our children to code, let's teach them how to code with ethics. 

There are so many questions, and still too few places where we can debate these questions. That needs to change. I am speaking at two events in London this week where these questions are being debated, the Critical Wearables Research Lab and Camp Alphaville. I look forward to continuing the conversation with you in person if you're at either of these events. 

[Disclosure: I have no commercial ties to any of the individuals or organisations mentioned in the post]


The Apple watch is dead. Long live the Apple watch.

I've had the Apple watch for just over a week now, and in this post, I'd like to share my experience and my thoughts about the future. I've examined many aspects of the functionality of the device, but also its potential for playing a role in health. It appears to be a device that polarises opinions, before it has even hit the market. I've met people who ordered one not because they like it, or because they want some kind of 'smartwatch', but simply because it is a new product from Apple. Others have told me they would never purchase such a watch, because of the cost, and also because they don't see a use for it given they already have an iPhone.

There have been multiple attempts at 'smartwatches' to win over consumers. I use the term 'smartwatch' very loosely, simply to group these wearables together; I'll be sharing more in this post about why these watches are still not particularly smart. Last summer, Android Wear launched, and I wrote about my initial experience and thoughts on health & social care. Android Wear hasn't been as successful as Google had hoped. I've actually been using a number of 'smartwatches', and for me, the closest existing rival to the Apple watch is the Samsung Gear S, which was released in late 2014 and didn't sell very well (I only know one person on Earth who has also purchased one). It overlaps in functionality with the Apple watch, with two big differences: it has its own SIM card inside the watch, with its own phone number, and it only works with an Android phone. I've been using the Gear S since November 2014, and the user experience is very different. Whilst Samsung seem to have simply tried to miniaturise a computer/phone into a watch, it is clear to me that Apple have put considerably more thought into the design of theirs. A clear example of this difference in design thinking is that the Gear S offers both a QWERTY keyboard on the watch (for writing text messages, for example) and a web browser. Just because it is technically possible to do something on a device as small as a watch doesn't mean it should be included as a feature. Thankfully, Apple have not added those two features.

Some people say to me, if the Apple watch is not replacing the iPhone, then what's the point? Why use an app on a tiny screen on your wrist when you could just use the same app on your iPhone? A perfectly sensible question to ask. To answer, I'll give you a real life example of why having the Apple watch made me feel safer as I navigated the streets of a foreign city at night. I flew from London to Milan on Tuesday evening, and after dinner in the city, I wanted to walk back to my hotel. I didn't know the route, so I used my Apple watch. I opened the Maps app, dictated the name of the hotel (which the watch recognised despite me being on a busy street), and chose the walking (vs driving) option for navigation. Why did the Apple watch make me feel safer walking back to my hotel at night? Well, you can keep your iPhone in your pocket, and you don't even need to glance at your Apple watch for instructions on when to turn left or right. You just walk normally, except that when you do have to turn left or right, the watch 'taps' your wrist in different ways. Anyone observing you wouldn't know you had an expensive phone and watch. Bear in mind that GPS is not always accurate, especially in cities with tall buildings. On one walk in London, the watch tapped to indicate that I should turn right, into a clothing store; the street I actually had to turn into was 50 yards up ahead. However, I'm not sure if the driving mode on the watch would be safe. Would you really want to navigate using your watch whilst you drive?

One of the standard notifications is to alert you once an hour to stand up and move. Sounds like a useful concept given how many of us work in jobs that keep us sitting in a chair all day long. These notifications are simple, not smart, as they appear at the strangest of times. You'd think the notifications could have made use of data from sensors in your watch to be more relevant and timely.

Using the Gear S has changed how I use my Android phone. I typically keep my phone on silent, and use the Gear S to notify me of emails, calls etc. I find it particularly useful if I'm charging my phone at home or in the office, and I want to wander away from the phone without missing any notifications. All these devices need to be paired with your phone via Bluetooth in order to work. Since the Gear S has its own SIM card, as soon as it loses the Bluetooth connection with my phone, it forwards calls from the phone to the Gear S. So, if I left the phone at home to visit the gym, and someone rang my phone's number, the call would be forwarded to the Gear S. Since the Gear S has a speaker, you can answer the call (or alternatively, you can connect the Gear S with Bluetooth earphones, which is a lot better). Incidentally, I found the speaker on the Apple watch competent, but not as loud as the speaker on the Gear S.

When the Apple watch loses the Bluetooth connection with the iPhone because you've walked out of the house, the watch isn't completely useless. You can use it to track your workouts and it will continue to monitor your activity (move, stand & exercise). You can listen to music that's stored on the watch, and use Apple Pay to buy stuff (Apple Pay is only available in the USA right now). Oh, and you can still use it as a watch to tell the time, set alarms and use the stopwatch feature! And if you're at home or in the office and you wander around so that the Bluetooth connection between your iPhone and the Apple watch is lost, you can still use Siri on the watch and send and receive iMessages, provided your iPhone is connected to a wifi network.

Which menu of apps do you prefer? Gear S (left) or Apple Watch (right)?

When it comes to learning to use the Apple watch, you'd expect it to be intuitive, given Apple's previous products. Tell me something: if the Apple watch were intuitive, why would the user guide be nearly 100 pages long? (For comparison, the manual for the Gear S is of a similar length!)

For example, the Apple watch features something called 'Force Touch', which can distinguish between you tapping the screen and pressing the screen. Pressing the screen brings up new menus or options within apps. When you open the Maps app, for instance, in order to search for a destination you have to press firmly on the screen, and two options then appear: "Search" & "Contacts." If you were unaware of 'Force Touch' or had not read the User Guide, you might be bamboozled. When the Apple watch has a bunch of notifications you wish to clear, you have to press firmly on the screen for a "Clear All" option to appear. On the Gear S, when browsing the notifications, you simply swipe up to see the "Clear All" option. It seems the user interface on the Apple watch leaves many users confused, leading to 9to5mac creating a quick start user guide. Whilst browsing and choosing the apps on the Apple watch, I sometimes find myself starting the wrong app, because the screen and the icons are so small. In that respect, I do prefer the larger screen and traditional menu of the Gear S. For clarity, I purchased the larger of the two Apple watches, the 42mm rather than the 38mm. I do wonder how difficult or easy it would be for someone with arthritis to use the Apple watch (or any wearable device with a touch screen)?

When it comes to health, one of the first apps I tried was one called Sickweather, which uses crowdsourced data for forecasting and mapping sickness. It shows the same notification that would appear on the iPhone, but if you have the watch, it will appear there instead. Now it might seem of limited or no value to many, but for some people, it is useful. After I put out the tweet showing how the cough alert looked, it led to an interaction on Twitter with a guy called Jarrod, who has Cystic Fibrosis and said the app would be useful for him. Sickweather also has a Sickweather score that is only available on the Apple watch.

I also tried an app called DocNow that provides instant access to doctors 24/7 from the Apple watch. A tap on the watch will initiate a HD video call with a doctor via the iPhone. Unfortunately, being in England, it didn't work for me when I tried it. That's being resolved I believe. 

There are also a number of apps on the watch for medication reminders. Medication reminders on a watch are not new; I tested the MediSafe version for Android Wear last year. For the Apple watch, I tested an app from WebMD, and one good thing I noticed was that it even includes a picture of the medication you are supposed to take. In the WebMD app on the iPhone, you can even use your own picture, if your pills look different from the stock image. It all sounds great, doesn't it? However, once I shared this via Twitter, I got valuable feedback: is the screen size too small for older people and/or people with poor eyesight? So, rather than on a watch, perhaps medication reminders for older people taking multiple medications are better delivered via a personal companion robot? (More on that in a future post, as I have some updates in that arena.)

The Deadline app that shows my predicted life expectancy

Another interesting app I tested was Deadline. This app asks you questions about your lifestyle and family history, as well as reading some of your health data from the iPhone, to then determine your life expectancy. It displays it on the watch, along with a tip on how to improve your life expectancy. The science behind this app is probably unvalidated, but as a concept, it does make me wonder. In the future, if the science was accurate, and the app was validated, how comfortable would you feel with tailored health advice via your watch that was based upon the state of your health there and then? Would it be too intrusive if your watch nudged you to eat a salad instead of a burger?

The Apple watch searches for Bluetooth devices

Within the Bluetooth menu on the watch, I found that it shows two types of devices it can connect to: devices & health devices. I understand that it is possible to pair the watch to an external heart rate monitor, if you wanted to use that to monitor your heart rate rather than the sensor within the watch itself (I plan to test this connectivity). It is not clear what other health devices you could connect to the watch, but it's a feature worth keeping track of.

The watch comes with a sensor that will normally record your heart rate every 10 minutes, and store that data in the Health app on the iPhone. That sensor could also act as a pulse oximeter, allowing measurement of the oxygen content of your blood. However, this feature has not been activated yet.

Now if you choose the Workout app, and select one of the workouts (such as Outdoor Walk or Indoor Cycle), it will track your heart rate continuously. I did try that out with an outdoor walk, and compared the heart rate measured by the Gear S with that measured by the Apple watch. Bear in mind that the positioning of both devices may have affected the results, and I'll have to repeat the test with the devices in different positions, on different arms.

HR on Gear S almost double that of Apple watch (I was sitting on a bench as I rested during my walk)

How do steps/distance walked compare against other devices? Well, this picture illustrates the challenge with these consumer devices. For the picture, the Apple watch & Gear S were worn on my left hand, and the Microsoft Band was worn on my right hand. Same walk, different devices, different results. Note that my age, gender, height and weight were entered exactly the same in the app for each device. Why does the Apple watch show more steps walked than the Microsoft Band, but a longer distance? Why does the Gear S show more steps & more distance but fewer calories than the Apple watch? BTW, since the Apple Watch doesn't track sleep, I'm using the Microsoft Band to track my sleep. Will we ever have one device that can serve every purpose, or will we continue to wear multiple wearables?
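To put that disagreement between trackers in numbers, here is a minimal sketch (with illustrative figures, not the actual readings from my walk) of one way to quantify how far each device strays from the group average:

```python
from statistics import mean

# Hypothetical readings from the same walk, one entry per device.
# The figures below are illustrative, not the values from the photo.
walks = {
    "Apple Watch":    {"steps": 11500, "km": 9.1, "kcal": 520},
    "Microsoft Band": {"steps": 11900, "km": 8.8, "kcal": 560},
    "Gear S":         {"steps": 12400, "km": 9.4, "kcal": 480},
}

# Express each device's step count as a percentage deviation from the
# mean of all three - a simple measure of how much the trackers disagree.
mean_steps = mean(w["steps"] for w in walks.values())
deviation = {
    name: round(100 * (w["steps"] - mean_steps) / mean_steps, 1)
    for name, w in walks.items()
}
print(deviation)  # e.g. {'Apple Watch': -3.6, 'Microsoft Band': -0.3, 'Gear S': 3.9}
```

Even with made-up numbers, a spread of several percent between devices on the same walk shows why none of these step counts can be taken as ground truth.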

Apple Watch (left), Microsoft Band (top right), Gear S (bottom right)

Health app on my iPhone

I was curious about the data from my watch being recorded in the Health app on my iPhone, and I found something quite puzzling. The Outdoor walk I had selected on the watch had captured my heart rate continuously, but something didn't make sense.

The app shows 6 entries for 8.21am: two of them at 128bpm, two more at 127bpm, one at 78bpm, and one at 69bpm. The timestamp only shows the hour and minute, not the second. How will it be possible to make sense of this data in any analysis if I have 6 different heart rate readings at 8.21am? (Update: 18th May - I got a response from Apple about this issue. They told me the watch will measure HR multiple times in a minute, but that the data in the Health app is only shown in hours and minutes.)
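As a thought experiment on how an analyst might cope with this, here is a minimal sketch (using made-up readings mirroring the six entries above) that collapses each minute's samples into summary statistics, rather than pretending they form an ordered series:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical export from the Health app: the timestamp only resolves
# to the minute, so six samples all share "08:21".
readings = [
    ("08:21", 128), ("08:21", 128), ("08:21", 127),
    ("08:21", 127), ("08:21", 78), ("08:21", 69),
]

# Group the samples by their minute-resolution timestamp.
per_minute = defaultdict(list)
for minute, bpm in readings:
    per_minute[minute].append(bpm)

# Summarise each minute with mean/min/max instead of six ambiguous rows.
summary = {
    minute: {"mean": round(mean(bpms), 1), "min": min(bpms), "max": max(bpms)}
    for minute, bpms in per_minute.items()
}
print(summary)  # {'08:21': {'mean': 109.5, 'min': 69, 'max': 128}}
```

Of course, summarising throws away the ordering; a mean of 109.5bpm hides whether my heart rate was falling as I rested or spiking, which is exactly why second-level timestamps matter.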

Now that my heart rate is being captured with the watch, could that data ever be used with other personal data to tailor advertising messages to me? I'm outside Starbucks, having not slept well, woken up late, missed my usual bus to work, and voila, my watch gets a coupon offering me a discount off coffee within the next 10 minutes at THAT particular Starbucks. Would that be creepy or cool? I envisioned this scenario after reading a brilliant post by Aaron Friedman on the future of search engines, which he says is all about context. Delivering information to your watch at the right place and the right time was the plan behind Google's Android Wear. A great idea, but their implementation last year was not optimal, and many of the first Android Wear watches didn't look very fashionable either. Their new strategy for Android Wear in response to the Apple watch may win them more consumers, but I'm not convinced.

I have been examining the role of the Apple watch in health primarily from a consumer perspective. What about people working in healthcare? Is the watch helpful for them? Well, Doximity, a professional network for physicians in the USA, thinks so. An article about their app for the watch highlights, "They think the Apple Watch can enable medical professionals to share information easily, securely, and quickly — and perhaps most importantly, hands-free."

There is a hospital in the USA, Ochsner Health System, that is trialling the use of the Apple watch with patients with high blood pressure. And Cedars-Sinai, one of the biggest hospitals in Los Angeles, has now added support for Apple's HealthKit, allowing data from a patient's phone to be added to their medical record. That's where I see the biggest advantage of the Apple watch over any other smartwatch maker.

  • Interface - Whilst not perfect, and probably too complex, once you get the hang of it, the Apple watch is a more polished user experience than its current rivals 
  • Integration - Whilst I capture health information with the Gear S, it doesn't really go anywhere from the Samsung S-health app. This is where Apple really shines. 
  • Ecosystem - With around 3,500 apps already available for the Apple watch (including many popular iPhone apps), and around 1,000 apps for the Samsung Gear watches, once again, Apple are ahead. I downloaded very few apps for the Gear S, as I didn't find many good ones.
The Bump is an app for pregnant women - this is the screen you see for several seconds as the app loads

Since I've mentioned apps, thanks to Tyler Martin for reminding me to mention some of the issues I faced with installing & using apps on the Apple watch. Maybe it is because the ecosystem is so new, but the apps can be buggy. You expect a tiny device like a watch to respond swiftly; it is not like a computer with a hard drive. Yet there are times when the watch takes a relatively long time to install or open apps, or the app crashes whilst you're using it. Those are the moments when you feel like you've purchased a product that is still a work in progress. I would hope these bugs get ironed out as more people start using these apps and report issues to the developers. The source of these problems may be that developers had to create apps for the watch without actually having access to the watch prior to launch. Maybe those consumers waiting for Apple Watch 2.0 or 3.0 are the sensible ones?

Dr Eric Topol highlights in a tweet how the Apple watch may be of benefit to diabetics wishing to monitor their blood glucose levels when using a Dexcom CGM.

One thing I was reminded of this week was that we might have the latest technology, such as an Apple watch, but the infrastructure around us was designed for a different era. For example, I travelled from London to Milan and Paris this week with British Airways, and was able to use their mobile boarding pass on my Apple watch. I checked in online using the BA app on my iPhone, and then retrieved my boarding pass, which I added to Passbook. The passes in Passbook on your iPhone get transferred to Passbook on your Apple watch, so you could even board a plane using your Apple watch alone, if your phone was off or left at home. There are two parts to the boarding pass on the watch: one is the text information about your flight, and the other is the QR code which airport machines will scan. On the iPhone, you'd see both parts at once; on the watch, due to the small screen, you have to swipe up to see the QR code.

Instead of waiting by the screens in departures at Heathrow airport, I wandered around the airport at my leisure, and got a notification on my watch when the gate for my flight was announced. However, when I was at the gate and was asked for my boarding pass, I had to take the watch off my wrist so the boarding pass could be scanned. The machine which scans boarding passes had been designed for paper boarding passes, and so didn't have a gap large enough to accommodate someone's arm wearing a watch. It was at Milan airport that I really wished I had a paper boarding pass: on departure, passport control wanted to see it. The officer was in a kiosk fronted by a glass screen, and I had to take off my watch and slide it across the counter. When I did that, the screen of the watch went off, and as I leaned over the counter to tap the screen for the boarding pass to reappear, a bunch of notifications pinged to the watch, which then confused the officer in the kiosk. I had to clear all the notifications from the watch, open Passbook on the watch, and bring up the boarding pass again.

When your Apple watch notifies you, it uses the new 'Taptic engine' to tap your wrist rather than the traditional vibration I get on devices such as the Gear S and Microsoft Band. I found these taps to be too weak. After reading the User Guide, I found within the watch a menu that offered 'Prominent Haptic', which I switched on. It is better than before, but I still prefer the more noticeable vibration from the Gear S and Microsoft Band.

There are some features of the Apple watch which seem rather frivolous. One of them is that you can press both buttons on the side of the watch, and a screenshot of the watch's display is then added to your iPhone's photo library.

You're probably wondering about battery life. Well, Apple claim 18 hours, and I did get close to that on the second day: after 16 hours, the battery was down to 14%. Another day, after 12 hours it was down to 12%. Charging the Apple watch requires a magnetic dock with a 2 metre long USB cable; you can't use your iPhone charging cable to charge the watch. I understand there's an engineering challenge that means currently every wearable has its own charging connector or charging dock, but it's annoying: another cable to carry around. Don't lose it, as a replacement isn't cheap at £29.
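For what it's worth, a simple linear extrapolation of those two observations shows how variable real-world battery life can be (this is back-of-the-envelope arithmetic, not Apple's methodology):

```python
def estimated_life_hours(hours_worn: float, percent_remaining: float) -> float:
    """Extrapolate total battery life assuming a roughly linear drain rate."""
    percent_used = 100 - percent_remaining
    drain_per_hour = percent_used / hours_worn
    return round(100 / drain_per_hour, 1)

# The two observations from my week: 14% left after 16 hours,
# and 12% left after only 12 hours on another day.
print(estimated_life_hours(16, 14))  # ~18.6 hours, close to Apple's claim
print(estimated_life_hours(12, 12))  # ~13.6 hours, a noticeably heavier day
```

Real drain is anything but linear (a workout with continuous heart rate tracking eats the battery far faster than idle time), but even this crude estimate shows a five-hour swing between a light day and a heavy one.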

There are countless other reviews based upon a week's usage of the Apple watch, but one week's usage won't always reveal the flaws, especially design defects. I'll give you a very real example. My Gear S has a charging dock that clips onto the watch, and you plug the micro USB charging cable into the charging dock. After 4 months of usage, the charging dock no longer clips onto the watch, meaning I can't charge it (unless I keep the dock in place with an elastic band). The exact same thing happened after a few months with my Samsung Gear Fit. I went to the Samsung store in London yesterday, where I was told that my warranty wouldn't cover this problem, as it was a cosmetic fault. I would have to purchase a new charging dock, which they didn't have in stock, as they don't sell many Gear S devices. I'm not the only one, as Gear S owners in the USA have the same problem, and received a similar response from Samsung USA. Knowing how the Apple watch performs over a longer period of time is critical, as well as observing how Apple will respond to problems as they occur.

Charging docks, special adaptors, and unique cables all make living with wearable technology more challenging than it needs to be. Just yesterday I came across a company called Humavox in Israel working on wireless charging that would include wearables. I really hope they succeed in making wearables easier to live with. In the meantime, one advantage of the Gear S is that the charging dock also doubles as a supplementary battery, so if you are away from home and low on battery, you can just clip on the charging dock. There is nothing like that for the Apple watch, apart from an aftermarket 'Reserve Strap' coming out later this year. It promises to charge your watch as you wear it, but costs $249. An expensive fix for an already expensive watch.

The future

The Apple watch is a good first attempt, and if Apple invest in refining the product, it may become a successful product line for them in the long term. Like many of its rivals, it is primarily an extension of the smartphone, another screen, on our wrist. Look around you, and most people aren't wearing a 'smartwatch.' Samsung has launched so many models, yet none of them have really gone mainstream. Apple may not succeed immediately with this first version of their watch, but simply because they are Apple, they may shift the culture and make consumers more interested in purchasing (and regularly using) a 'smartwatch' of some kind, even if it's not made by Apple. 

How much value will the Apple watch add to our daily lives? Will it make a difference only to the young, or will it benefit the old too? Apart from making life more convenient, will it actually play a role in improving our health, or saving us money? It is too early to answer those questions as it has only just hit the market, but those are the key questions to answer.

I'd personally want to get my questions about the accuracy of my heart rate data answered, especially if data from my watch could one day be added to my medical records. Even the differences in steps/distance walked/calories burned between the Apple, Samsung & Microsoft devices make me think twice about unvalidated data ending up in the system of my doctor or insurer. 

Genuine advances are needed in battery life. How much information is the Apple watch unable to capture about my health because it has to be charged whilst I sleep? If I'm travelling, I don't want to interrupt my routine to find somewhere to charge my watch.

Today, based upon my experience so far, I believe the Apple watch is the best 'smartwatch' available. It has fewer flaws than other devices, such as the Gear S, but it still has flaws that Apple needs to deal with. I have to admit I didn't really like it at first, but as I learnt how to use the features, it grew on me. Tomorrow, the best could come from someone else, or be a new form of technology, not even necessarily in the form of a watch. Some people tell me they view the Apple watch as technology that is already redundant: good for loyal Apple customers, but not a genuine innovation in this arena.

You've got the Pebble Time watch with the concept of 'smart straps', which could allow new possibilities. Then there is the Bluetooth 4.2 specification which just got finalised in Dec 2014, featuring low power IP connectivity. What would that mean for future devices? Bluetooth smart sensors that could connect directly to the internet, without having to be paired to a smartphone or tablet. 

Yi Tuntian, a former Microsoft official in China claims that wearables will replace mobile phones soon. I find that claim hard to believe. 

How about the new chip developed in Taiwan, which integrates sensors for tracking health with data transmission and processing? This chip is so small that "people could wake up in the morning to the voice of a microcomputer in a headset informing them of the state of their health and things to look out for in their lifestyle."

It may be the case that the Apple watch ends up being of significant value for particular applications in healthcare & clinical trials for those who can afford it, but does not have long term success as a general smartwatch with the average consumer. Here in the UK, with the NHS hunting down the back of the sofa looking for extra pennies, experimenting with the Apple watch may be a pipe dream at best. 

I did find a wonderful story of how the Apple watch has changed someone's life in just 5 days: Molly Watt, a 20 year old woman in England who has Usher Syndrome Type 2a. She was born deaf and has only a very small tunnel of vision in her right eye. In her blog, she describes how the different taps for turning left and right, which helped me feel safer walking in Milan at night, allowed her to feel more confident when walking down the street, without relying upon hearing or sight. We might not see much benefit from Apple Pay on the watch for mobile payments, but could this feature be tremendously useful to someone with learning difficulties?

Without giving it a try and without generating evidence, it would be premature to dismiss the Apple watch completely. As these consumer technologies evolve at an increasing rate, what actually is evidence, and how do we collect it?

You may see wearables as just a fad, a passing phase, and think you'd never wear any of these devices. Well, what if you had to wear a device on your wrist just to get insured? Nope, it is not science fiction; it is the view of Swiss Re, a reinsurance giant whose executives believe it will be impossible to get life insurance in 5 to 10 years without a wearable device.

A really fascinating article from Taiwan discusses the profitability of smartwatches in healthcare, and mentions, "Service platforms that integrate medical care organizations with insurance companies will produce the greatest value." 

Fitness is touted as one of the immediate applications for the Apple watch, yet Gregory Ferenstein's review suggests you won't gain much from the Apple watch in the fitness arena over and above simply using an iPhone. 

Or maybe we are misguided in pursuing the idea of 'smartwatches' entirely? Below is a great talk given by Gadi Amit recently at WIRED Health talking about the concept of wearable tech under our skin, and he states, "The biggest issue that I see… is the idea that if we load more and more functionality on our wrists, things will get better. In many cases, it does not."

I'm not surprised that Apple have sold so many watches, as they have a well oiled marketing machine. How many of today's purchasers will still be using the watch in 12 months time, or will it go the way of Google Glass? And how many people would be willing to upgrade to a newer version of the watch in 12-18 months? These are the hard metrics that we need to pay attention to, once the initial enthusiasm dissipates.

In my opinion, the single biggest improvement Apple could make is finding a way to extend the battery life, and also offer wireless charging. I'd happily have fewer features in exchange for not having to charge it every day. Come to think of it, current engineering limitations on battery life impact the use of many portable devices, whether it is wearable tech, phones, tablets or laptops. We need a breakthrough in battery technology.

So, will people wearing an Apple watch be treated with the same disdain as those who wore Google Glass? Users of Google Glass got branded "Glassholes"; will users of the Apple watch get branded "Glanceholes"?

The Apple watch is dead. Long live the Apple watch.

[Disclosure: I have no commercial ties to any of the individuals or organisations mentioned in the post]

