The emerging field of digital phenotyping aims to analyse data from people's use of smartphones and other digital devices to predict and monitor their mental health. But it throws up a range of issues.
An app called TapCounter records your interactions with your phone screen: how many times you swipe and tap, text and call, writes Zoe Corbyn in The Guardian this week. The aim? To detect important indicators of your mental and neurological health by capturing and analysing that data.
The company behind this app, QuantActions, is one of a growing number of technology companies interested in digital phenotyping – using artificial intelligence to analyse data from digital devices to infer behaviours related to health and disease.
The idea is that digital signals, known as digital biomarkers, can be identified in data such as keystroke patterns, geolocation traces or smartphone activity logs, and used as a new way of diagnosing or monitoring a range of medical conditions, particularly those relating to mental and brain health.
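To make the idea a little more concrete, here is a minimal sketch in Python of how simple features might be derived from a timestamped tap log. The log, the feature names and the night-time cut-off are invented for illustration only; this is not how QuantActions or any other company actually computes digital biomarkers.

```python
from datetime import datetime
from statistics import mean, pstdev

# Hypothetical tap log: timestamps of screen touches over a day or so.
# Real digital-phenotyping apps capture far richer data; this is purely illustrative.
tap_log = [
    "2021-11-15 08:02:11", "2021-11-15 08:02:14", "2021-11-15 08:02:20",
    "2021-11-15 13:45:02", "2021-11-15 13:45:09",
    "2021-11-16 02:10:33", "2021-11-16 02:11:05",
]

taps = [datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t in tap_log]

# Inter-tap intervals in seconds: one candidate "digital biomarker".
intervals = [(b - a).total_seconds() for a, b in zip(taps, taps[1:])]

features = {
    "tap_count": len(taps),
    "mean_interval_s": round(mean(intervals), 1),
    "interval_spread_s": round(pstdev(intervals), 1),
    # Late-night phone use is sometimes studied as a proxy for disturbed sleep.
    "night_taps": sum(1 for t in taps if t.hour < 6),
}
print(features)
```

The point of the sketch is only that raw interaction data can be reduced to summary features; whether any such feature genuinely tracks mental health is exactly the open question researchers quoted below are debating.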
It's a technology attracting interest from the likes of Apple and Google. In September, for example, the Wall Street Journal reported that Apple is working on iPhone features to help diagnose depression and cognitive decline.
Of course, the diagnosis of mental illness has traditionally relied on patients self-reporting their experiences and feelings, and on medical assessment. Proponents of digital phenotyping say the technology could help people manage their own mental health better, while also making diagnosis quicker, cheaper and more efficient at a time of increasing demand on mental health services.
"It is worth exploring," Helen Christensen, a professor of mental health at the University of New South Wales, told The Guardian. "If we can find that this data is relevant, it would be a big breakthrough."
Others point to substantial issues with the technology. There are data privacy concerns about private companies using predictive inferences made through digital phenotyping in ways people may not be aware of (think social media algorithms). And there is concern about the lack of context for this kind of "diagnosis". Lisa Cosgrove, a professor of counselling psychology at the University of Massachusetts in Boston, told The Guardian that digital phenotyping focuses too much on the individual in isolation, deflecting attention from other possible social causes of mental health issues, such as discrimination, job loss or poor housing. "Digital phenotyping misses the context in which people experience emotional distress," she says.
Others ask whether there is any real replacement for human contact. And what happens if a tool's recommendations differ from those of a doctor or psychiatrist? "[While using software to help spot signs of mental health problems is interesting], human interaction and professional clinical judgement aren't replaceable and should remain an essential component of a patient's experience of diagnosis and access to treatment and support," says Rosie Weatherley, information content manager at Mind.
If you are seeking support with your mental health, you can find local sources of help on this website.