Facebook called to do more to prevent suicides signaled by users

By Byron Acohido, USA TODAY

An Internet safety expert is calling for Facebook to take more direct steps to intervene in potential suicide cases signaled by users of the popular social network.

That suggestion from Hemu Nigam, founder of Internet safety consultancy SSP Blue, follows the Christmas Day suicide of Simone Back, a 42-year-old British woman who posted a final message to her 1,048 Facebook friends: "I took all my pills be dead soon bye bye everyone."

Only a few friends responded to Back -- with skepticism and mockery, according to The Telegraph of London.

Nigam, former chief security officer at MySpace, says Facebook should provide a simpler way for its members to report potential suicides. He also contends Facebook needs to be more proactive about notifying law enforcement and suicide counselors when developing suicide cases come to light on its pages.

"When you create a technology that's widely used in society, and someone uses that technology to reach out for help, the company should step forward to help if it can," says Nigam. "That should be a core corporate responsibility."

Facebook spokesperson Marian Heath says the social network takes safety very seriously and does plenty to prevent suicides. Each Facebook wall posting carries a small blue x in its upper right corner for reporting problem postings. Clicking the x opens a window with a list of options; selecting the one labeled "self harm" alerts Facebook to the problem posting.

"We try to make it easy and intuitive for someone to quickly report something they find that's disturbing," says Heath. "We have people on staff 24 hours a day to review these self-harm reports and respond to them."

Heath declined to say how many self-harm reports Facebook receives or how many staffers are assigned to review them. She says those staffers have, in the past, contacted law enforcement and professional suicide counselors to intervene, but she declined to characterize how often those contacts have actually happened.

Another way a Facebook user can alert the company is through its "help center," which leads to an extensive "report suicidal content" form that can be filled out and submitted. The form goes to the same staffers assigned to monitor self-harm reports.

"We provide timely follow ups," said Heath. "But at the end of the day, if you're concerned about someone, you should call 911. There's only so far you can go by bringing a third party into the picture. We can't possibly respond as quickly as 911."

Nigam disagrees. He points out that MySpace uses a reporting system that's more transparent and accessible than Facebook's. He says the MySpace system is tuned to quickly and directly contact local law enforcement and professional suicide counselors about worrisome suicide postings, and that it has saved nearly 100 people from killing themselves.

Last September, Rutgers freshman Tyler Clementi messaged his Facebook friends that he was heading out to jump off the George Washington Bridge, which he did eight minutes later. Some of his friends tried to persuade him in Facebook postings not to do it.

Nigam contends that Clementi's friends should have been able to quickly alert Facebook. He says Facebook should have had a reporting system in place configured to quickly alert local police. Nigam maintains that had such a system been in place, officers could have intercepted Clementi at the bridge and saved his life.

"I'm extremely disappointed that Facebook is shifting the burden for handling such critical information to the users," says Nigam. "They should have learned their lesson after the George Washington Bridge incident, and now this has happened twice."

Nigam contends that no onerous expense or liability issues stop Facebook from improving its system, and he says Twitter should do the same. "It's just a matter of making a corporate decision about safety," he says. "Pushing the onus for safety onto users is not the right thing to do."
