The Association for Information Science & Technology (ASIS&T) Special Interest Group (SIG) Information Needs, Seeking, & Use (USE) awards committee has named Assistant Professor Kaitlin Costello and Ph.D. student Diana Floegel recipients of the SIG USE Early Career Paper Award for their paper “‘Predictive ads are not doctors’: Mental health tracking and technology companies.” Costello and Floegel received the award at the SIG USE Symposium, which was held virtually on October 24.
According to the ASIS&T website, “The purpose of the award is to recognize the best paper presented at the Annual ASIS&T Conference that falls within the scope of information behavior written by an early career scholar. Information behavior is broadly defined to include how people construct, need, seek, manage, give, and use information in different contexts.”
The objective of the research Costello and Floegel conducted for the paper was to understand how people who have been diagnosed with mental health conditions feel about mood tracking apps and digital phenotyping. Costello and Floegel said these mood tracking apps are owned and operated by technology companies and are marketed to help people track their activities and moods, and “sell the hope or promise of mental wellness.”
To their knowledge, they said, the effectiveness of these commercially available apps is not supported by empirical evidence. Their research is among the first to explore the use of mental health apps outside of a clinical setting.
Digital phenotyping, which they describe as “automated mental health assessments that infer one’s mood based on the trace data they generate when interacting with mobile phones or other Internet-enabled devices,” is also a focus of the paper, as are the many ethical issues surrounding the collection and use of this type of data by technology companies, law enforcement agencies, and other commercial, health, and governmental entities.
Explaining the reasons it is so important to understand users’ feelings and opinions about mental health apps, Costello said, “It is critical to center the perspectives of people who have lived experiences with mental health diagnoses in order to fully understand the potential impact digital phenotyping will have on them and their lives. They are the people who will be most impacted by this technology if and as it advances, so they should absolutely have a say in whether and how it is developed.”
Their paper also highlights the most critical privacy, ethics, and justice issues surrounding the use of mental health apps. One of the most pressing involves maintaining the privacy of users, because the technology companies that own these apps are able, by default, to collect and potentially share extremely sensitive data about people’s lives and health.
“The harms of surveillance fall more heavily on people who are already oppressed in society: Black people, people of color, women, people who have been diagnosed with mental illness(es), queer people, trans people… they are more subject to the harms that result from surveillance,” Costello said. “People with untreated mental illnesses are sixteen times more likely to be killed in a police encounter in the United States, and Black people are killed at more than twice the rate of white people by police in this country. Yet digital phenotyping, which is a surveillance technology, is being discussed by many – from technology companies, to medical professionals, to the American government – as a viable technique for initiating wellness checks. This is cause for alarm.”
Asked if it might be possible to protect users through new laws and other types of regulation, Costello said, “We do believe that digital phenotyping for mental health should be banned, which is a form of regulation. We believe that this practice is directly in opposition to the Hippocratic Oath to ‘do no harm.’ Currently, laws that regulate data privacy often require companies that violate them to pay fines as penance. Large technology companies can consider these fines as simply a cost of doing business. Facebook has paid fines amounting to less than a day’s profits when they violated GDPR. We also think that mood tracking apps and digital phenotyping should be regulated, so that resultant data cannot be used to make predictive inferences about people and their health status.”
Despite the inherent privacy risks involved in using apps to track mental health, Costello and Floegel expect that people will continue to use them because the apps do provide benefits. Costello said, “People will definitely continue to use mood tracking apps, because being able to monitor your mood and feelings is a very useful thing to do! However, we note that most of our participants felt like they were satisficing when they used existing mood tracking apps. Satisficing is a term that means ‘good enough.’ So participants don’t like the mood trackers that are available—in large part because they worry about surveillance—but they use them because they’re the best option.”
In terms of the next steps they will take to advance their research on mental health apps and digital phenotyping, Costello said, “This was a preliminary study with preliminary findings. We note that most of our participants shared similar feelings about surveillance and privacy, so our next step is to interview people who are more excited about digital phenotyping. This will allow us to better triangulate our findings and to conduct a more robust analysis.”
Asked what the most important takeaway from the study is for anyone who is considering using an app to monitor their mental health or boost their mood, Costello said, “It’s always important to think critically about any app you install on your phone. What are you using it for? Who else will have your data? What purpose does your data serve? What implications does that have, not just for you, but for society?”