
The real harm of text-streaming data sharing during a crisis


Another week, another privacy horror show: Crisis Text Line, a nonprofit text messaging service for people going through a severe mental health crisis, used "anonymized" conversation data to power a for-profit machine learning tool for customer support teams. (Following backlash, CTL announced it will stop.) Crisis Text Line's response to the backlash focused on the data itself and whether it included personally identifiable information. But that focus on the data is a distraction. Imagine you messaged Crisis Text Line and got back a reply that said, "Hey, just so you know, we're going to use this conversation to help our for-profit subsidiary build a machine learning tool for customer support companies." Would you keep texting?

That's the real betrayal: when the price of getting mental health help during a crisis is becoming raw material for a machine learning product. And the cost isn't paid only by CTL's users; it's paid by everyone who looks for help when they need it most.

Americans need help and can't get it. The enormous unmet need for critical help and advice has given rise to a new class of organizations and software tools that exist in a regulatory gray area. They help people facing bankruptcy or deportation, but they are not lawyers; they help people through mental health crises, but they are not care providers. They invite ordinary people to rely on them, and they often give real help. But these services can also avoid taking responsibility for their advice, or even abuse the trust people have placed in them. They can make mistakes, push predatory ads and misinformation, or outright sell data. And the consumer protections that would normally shield people from the carelessness or mistakes of a lawyer or doctor haven't caught up.

This regulatory gray area can also prevent organizations from offering new kinds of solutions. Consider Upsolve, a nonprofit that develops software to guide people through bankruptcy. (The organization takes great pains to insist that it does not offer legal advice.) Upsolve wants to train community leaders in New York to help others navigate the city's notorious debt courts. One problem: these trainees are not attorneys, so under New York law (and that of nearly every other state), Upsolve's initiative would be illegal. Upsolve is now suing to carve out an exception for itself. The organization argues, quite rightly, that the lack of legal aid means people effectively lack rights under the law.

The failure of the legal profession to give Americans access to assistance is well documented. But Upsolve's lawsuit also raises new, important questions. Who is ultimately responsible for advice given in a program like this, and who is liable for a mistake: the practitioner, the coach, or both? How do we educate people about their rights as customers of such a service, and about how to seek redress? These are answerable questions. There are plenty of policy tools for creating relationships with heightened responsibilities: we could grant those who give advice a special legal status, establish a duty of loyalty for organizations that handle sensitive data, or create policy sandboxes to test and learn from new models of providing advice.

But instead of using these tools, most regulators seem content to bury their heads in the sand. Officially, you cannot give legal or health advice without a professional license. Informally, such advice is available from any number of fringe tools and organizations, so long as it isn't called by those names. And while credentials matter, regulators are not grappling with how software has fundamentally changed the way we give advice and care for one another, and what that means for the responsibilities of those who give it.

And we need that engagement more than ever. People who seek help from professionals or caregivers are vulnerable. They may not be able to distinguish a good service from a poor one. They don't have time to parse terms of service dense with jargon, warnings, and disclaimers. And they have little or no bargaining power to negotiate better terms, especially when they are in the midst of a crisis. That is why the fiduciary duties of lawyers and doctors exist in the first place: not only to protect any one person seeking help, but to give everyone confidence that they can turn to experts for the most important and sensitive issues they face. In other words, an attorney's obligation to their client is not just to protect that client from that particular attorney; it is to protect society's trust in lawyers.

And that's the real harm: when people won't contact a suicide hotline because they don't believe the hotline is acting solely in their interest. That distrust can be contagious. Crisis Text Line's actions may not only stop people from using Crisis Text Line; they may stop people from using any similar service. What's worse than not being able to get help? Not being able to trust it.


