About 500 million people use health and wellness apps on smartphones - clearly, we want to trust such tools. But should we? Dr Kit Huckvale of Imperial College London found "unaddressed privacy risks."
DW: Given the number of people using health apps on smartphones, watches and other devices, your study seems to have revealed a massive breach of data and privacy.
Dr Kit Huckvale: Well, we see it as a big risk, for sure. What we found was that a large number of the apps we looked at had the potential to place users' data in jeopardy. But the study wasn't designed to look at whether that had actually occurred in practice. So we haven't shown that someone has gone in and stolen users' data. We found that a lot of these apps don't take appropriate precautions to protect against that kind of thing happening.
It certainly is a concern. As you say, it's been known for a while that apps generally pose a risk to privacy, and that there are health apps out there in the wider world that may use poor security and privacy practices. But we were interested in this very specific set of apps - accredited apps - and I guess the purpose of accreditation - partly - was to address these privacy concerns, so a user could say, "It comes from an accredited source, I don't need to worry as much about whether there are privacy issues, because the accreditation program will have taken care of that for me."
A range of apps
So people have been putting their trust in the NHS Health Apps Library, which accredits apps… Give us an idea of what kind of health apps we're talking about and whether they're apps for personal or professional use.
The apps we looked at were exclusively those intended for patient use, and they cover a range of uses: apps for people with a long-term condition, both as a source of information on how to manage that condition, and apps that allow you to record diary data - for example, a person with diabetes could log their insulin and blood glucose over time, perhaps with the intention of sharing that with their health care professional.
And then a second key class of apps we looked at were those for people who are maybe otherwise well but are looking to make some kind of change in their health or lifestyle - apps helping people lose weight, so they're recording weight and food intake, or those interested in stopping smoking.
For the actual study, I understand you performed a hack - a "man-in-the-middle" hack. What is that and how does it work?
Well, in fact, we did two things. We pretended to be users - we set ourselves up on these apps and over a period of time tried to use the apps as a [real] user might. And then we set ourselves up in a hacker role, so we could look into the devices and see what was being written onto the storage on the smartphones and tablets we were testing on, and then we also set ourselves up on the network so we could look at all the traffic that was coming off the device and see whether any of that related to the apps we were testing, and if so, what data were being sent and to where.
And you're right - that put us a little in the position of a hacker, because in some cases we were able to see data that were being sent unencrypted, and so we could read in plain text the information we had entered into the apps. And that is the kind of information a general user might enter into the apps that could be accessible to a third party.
So you intercepted encrypted data - but can encrypted data be cracked and read?
The purpose of doing that was to understand what data were being sent. In general, the level of encryption being used was industry standard, so I think we can assume that where encryption was used, it provided an appropriate level of protection.
The idea behind the NHS Health Apps Library is, in a sense, to reassure people that these apps meet UK data protection standards. Does this mean then that the NHS is not fulfilling its duty, or that the Data Protection Act in the UK - and similar acts elsewhere - are insufficient?
I'm not an expert in law, but my understanding is that data protection principles are relatively well established and understood, and I think ultimate responsibility lies with the developers of these apps; they are ultimately responsible for ensuring they comply with the law.
Do we place too much trust in our own ability to track our health - and in the apps we use to do it?
The NHS's accreditation process asks developers to assert that they have taken the steps necessary to comply with the law - that's the process we understood was in place when we were looking at these apps [August 2013 to January 2014]. And I guess our data suggest that relying on that self-declaration may not be enough, and some more interventionist approach may be necessary.
Dr Kit (Christopher) Huckvale is a qualified doctor, researcher at Imperial College London, and lead author of the study.