Should the Human Condition Go High Tech?

Disclaimer: I am not a mental health professional.  This post shares observations which I have made solely on a personal level.

Everything is automated now.  Often, we don’t even touch the sink faucet in a public restroom before washing our hands.

Sensors on water faucets can help us avoid picking up additional microbes when we want our hands clean.  However, if you’ve ever soaped up your hands and then found it impossible to rinse them because the sensor on the tap has fallen asleep, you know modern technology isn’t always the answer.

At the end of this post, please click the link to a Reuters article shared on MSN’s site.  It describes a system Facebook will soon begin using to identify users who may be suicidal.  The software has already been tested.

When I read this article, I was reminded of something unfortunate which happened in 2010, after a telephone crisis switchboard began using Caller I.D. and keeping records associated with callers’ phone numbers.  The incident wasn’t disastrous, and it wasn’t made public.  If the circumstances had been a little bit different, though, there could have been a tragedy.

Someone who was experiencing multiple crises called the same switchboard twice within a few weeks, asking for emotional support with different problems.  She was overwhelmed, and waiting for the next bunch of proverbial shit to hit the fan (which it did).

The switchboard volunteer was unqualified, and he focused on the electronic records he could pull up on a computer screen.  When he saw that she had called recently, he confronted her with the earlier records and suggested that no one has that much horror going on at once.  In effect, he was accusing her of lying.

Fortunately, the caller didn’t harm herself after that creepy, Big Brother-like shock.  It didn’t occur to her to demand to speak to his supervisor, so she simply resolved never to call any crisis line again.  One source of support was taken away from her.

External stress, depression and other things can push us over the edge.  The people who are suffering deserve respect, and we must never make quick assumptions.  Maybe someday the crisis line volunteer will remember treating callers that way, and cringe.  That doesn’t change the fact that he may have put some people in harm’s way when he arrogantly believed he held all the answers in a handy database.

Computerized records make human contact seem more concrete than it really is, and the people who rely on electronic information need judgment that goes beyond what appears on a screen.  The airhead with the crisis switchboard was proud of the shoddy detective resource (a programmed system) that let him confront a caller.  He was also wrong, and he could have contributed to a caller’s self-injury.

Be aware that the incident with that one crisis line volunteer was never made public.  We don’t know how often ignorance, narcissism and plain overconfidence have caused jaw-dropping reactions in people who rely on telephone support, but modern technology has given unfit crisis counselors a new toy.

Too many people will assume Facebook is doing a great thing by using software to detect suicidal posts.  True, it might save lives.  However, some social media users will have a sense of violation when they realize their cries for help — if they are indeed cries for help — have triggered an alarm at a tech company.  Let’s hope not too many of those people are paranoid already.

https://www.msn.com/en-us/news/technology/facebook-to-expand-artificial-intelligence-to-help-prevent-suicide/ar-BBFORN0?li=BBmkt5R&ocid=spartandhp
