Update: https://www.crisistextline.org/blog/2022/01/31/an-update-on-data-privacy-our-community-and-our-service/
We understand that you don’t want Crisis Text Line to share any data with Loris, even though the data is handled securely, anonymized and scrubbed of personally identifiable information. As a result, we have ended our data-sharing relationship with Loris. This change includes our request that Loris delete the data it has received from Crisis Text Line. We have updated our Terms of Service & Privacy Policy accordingly to reflect this change. (Note: While a data sharing relationship has existed, Loris has not accessed any data since the beginning of 2020.)
Loris won't delete models they've already built based on past data, but this is still better than nothing and a pretty fast response.
In doing this, Crisis Text Line has not only damaged its own credibility, but the credibility of every suicide/crisis service out there. Their actions will have a real, tangible death toll.
The primary reason people use suicide hotlines rather than reaching out to local or family relationships is anonymity. I know people who have been afraid to reach out to any given hotline for fear that:
1. they will have the police called on them,
2. someone they know will find out, or
3. their conversations/worries/struggles/the simple fact they called will go on their permanent medical record.
I myself have been afraid in the past for reasons 1 and 3.
Privacy is of even greater importance for 'sensitive' medical services, whether that be birth control, STD prevention/screening, or mental health. Breaches of trust like this make people even less willing to reach out for help when they need it. Mental health services have been working for years on their credibility as a safe, private space to talk about issues, and one greedy corp has set them ALL back years. This is NOT OK.
danah boyd, a founding board member of Crisis Text Line, made an (extremely long) post responding to this a few days ago: Crisis Text Line, from my perspective
What really stings about that blog post, IMO, is how the first part was all apology, saying how after the story broke, they 'concluded that we were wrong to share texter data', and how they were going to do better in the future. But she then goes on to say, in great detail, the exact justification she uses. It's nothing but another corporate apology. They aren't sorry they did it. They are sorry they got caught. (A conclusion further supported by their dismissal of the volunteer who first raised these concerns.)
The founder of Crisis Text Line saw an opportunity and came to the board. We did not have the resources to simply train anyone who was interested. But HR teams at companies had both the need for, and the resources for, larger training systems. The founder proposed building a service that could provide us with a needed revenue stream. I don’t remember every one of the options we discussed, but I do know that we talked about building a separate unit in the organization to conduct training for a fee. This raised the worry that this would be a distraction to our core focus. We did all see training as mission-aligned, but we needed to focus on the core service CTL was providing.
This is where she completely and utterly lost me. Using data for revenue generation, especially such personal data, cannot be ethical, no matter the amount of anonymization you attempt to layer on.
And I couldn't help but notice this choice tidbit:
But I have also learned from our clinical team about the limits of consent and when consent undermines ethical action. I have also come to believe that there are times when other ethical values must be prioritized against an ideal of consent.
ACTING WITHOUT CONSENT IS NOT ETHICAL. Not in mental health, not in physical health! Consent, especially informed consent, is the cornerstone of medical ethics, second only to "do no harm". To even suggest otherwise is completely and utterly corrupt. It would be like saying that sometimes sex without consent is OK.
Finally, as I said in my other comment, CTL has not only damaged their own credibility, but the credibility of crisis mental health providers as a whole. This will lead to people hesitating or refusing to call and get help. CTL's actions will have a death toll.
That's a fantastic response. I don't know that I would have made the same decisions, but she lays out a reasonable narrative for why the decisions were made the way they were.
There are two sides to this in my mind:
1. This is a way for those suicide hotlines and mental health resources to stay afloat and continue doing insane amounts of good.
2. This is a despicable act. The people in charge should care for their patients (it's what they are!) far more, not give away sensitive (even "anonymized") data to for-profit corps, and find other ways to keep their project alive. There are always alternatives that would be better than just doing the lazy work of selling off personal data on mentally vulnerable patients, of all people.
I'm leaning towards (2) here. This is data generated from personal crises; it shouldn't even be retained, much less sold. Mental health services only do well when people are comfortable using them, and people knowing that their conversations could leave the organization could be the difference between somebody calling and not calling in a crisis.
We have to keep in mind as well that in this type of mental health organization, people not being willing to call can and will cause deaths.
Yes. This kind of data is just way too sensitive, likely as sensitive as these things can get. It shouldn't even be recorded in the first place.
To your first point, Crisis Text Line raised $23.8 million with no strings attached in 2016: https://omidyar.com/news/crisis-text-line-raises-23-8-million/, and has ongoing corporate sponsors. They are not under pressure to scramble for revenue sources; this was a free choice.