On the Samaritans Radar

My perception of the Samaritans is that it is a service which the person needing help approaches themselves, which suggests this is how a Samaritans app should work too, rather than alerting amateurs, or worse, to someone else's distress. That most mental health sufferers are vulnerable is axiomatic, but Radar could be particularly dangerous for women and children.

Radar is problematic not because it invades privacy or because it might be illegal, but because it assumes benign intent on the part of the user and creates vulnerability for the tweeter. Paedophiles, groomers and sex offenders are known to have targeted vulnerable young women in places like Rotherham. The Samaritans Radar app could very easily serve as a stalking tool to facilitate that kind of activity.

Campaigners, with their own clearly felt paternalism and an urge to prove they better understand how to protect people who make public cries for help, have caused the suspension of the app. While I don't feel quite as strongly as they do, I think many users, and especially young people and children, don't appreciate that anything public on the internet is treated as up for grabs. That being so, I don't think it should be exploited, even transparently and by well-intentioned people, to let followers download what is effectively a monitoring tool with no opt-out and no anonymity or blocking provisions for the tweeter.

By way of a more personal disclosure, I've never been diagnosed with a mental health condition, although I once taught people who had. I tend to use Twitter for news and campaigning and don't put personal information of an emotional nature on it. Nonetheless, I'd like a choice about whether, despite my profile being 'public', an algorithm curates my tweets for purposes other than those I intend.

I doubt that anyone could seriously accuse the Samaritans of developing Radar with anything other than good intent, but it's also clear that the developers didn't consider carefully enough how much it exposes already vulnerable people to unknown risks.