Do online service providers like Google and Facebook have a
social responsibility for how people use their sites? Are they
morally obliged to amend results for things like suicide searches?
And if they do, will that help keep people alive?
Google's latest move is to intervene in suicide searches, in
partnership with a U.S. suicide prevention helpline. It's a big
shift. Until now, Google has been a strong advocate of giving you
what you ask for when you search - be it good, bad or ugly. The new
move has re-opened a giant can of worms over the moral obligations
of web giants like Facebook and Google. Can changing search results
be good corporate social responsibility, or has Google gone too far?
Rather than just delivering the search results you ask for,
Google has recently adopted a 'we know best' approach, starting in
the United States. From now on, certain searches deemed suicidal
will no longer just return the usual algorithm-generated results.
Instead, searches like 'I want to die' or 'ways to
commit suicide' will automatically generate a red telephone icon
and the number of a suicide helpline at the top of search results.
The same approach was adopted for poison control not long ago,
after a mother found it difficult to locate a phone number for poison control.
Does it work?
So far, the Google move has got a big pat on the back for
increasing traffic to the helpline by 10%. Though there's no way to
measure how many suicides this new measure has actually prevented,
it's hard to argue that, in terms of suicide prevention, the new
search results could be doing much harm.
On the other hand, it could definitely be doing more good. There
is a lot of potential for Google to introduce the measure in other
countries (like Australia!) and expand the search criteria as well.
At the moment, the helpline icon only turns up in U.S. search
results. Even then, the search terms have to be an exact match for
those earmarked by Google. Anything that falls outside Google's
very short list will turn up regular search results,
including pro-suicide websites. The icon also only shows up on the
first page of search results.
What's all the fuss about?
It's hard to say that providing or emphasising a helpline
contact for those at risk of suicide could be a bad thing. For some
though, Google's interference with search results sets a dangerous precedent.
There is potential for Google to start 'prioritising' results on
searches around other social concerns. Issues like murder, child
abuse and obesity are some of the suggestions being put forward.
Who knows? It may be a great thing, it may be a disaster. That's the
problem. Deciding which issues it's ok to intervene in and which it
isn't is a very slippery slope. There's also some debate as to
which organisations should get Google's priority placement. Why this
helpline and not another organisation doing similar work? The
questions we face here aren't that different from those raised by
Google donating ad or 'sponsored link' space to charities for
similar searches here in Australia.
The saving grace of the Google move is that the rest of your
search results aren't altered. There's an extra result added in and
emphasised at the top of your page, but the rest of your search
will show up as normal. Google isn't seeking to remove from view
anything that isn't in line with its philosophy, but to prioritise
something that was a little hard to find before.
At the end of the day, the discussion really boils down to the
role of service providers like Google in social issues like
suicide. Some of us may think Google (and others) have some moral
responsibility for how people use their service. In this vein, the
Google move isn't that different from Facebook intervening in cases
of cyber bullying, or YouTube taking measures to keep children safe
from online predators. In a way, this kind of behaviour becomes an
online form of corporate social responsibility.
Taken from 3things newsletter produced by Actnow.com.au