It seems poor Amanda, even in death, can't escape online bullies.
Two memorial pages called RIP Amanda Todd each had more than 4,000 "likes" as of Friday morning.
"She is a beautiful young lady. I am in tears over this," wrote Jennifer Fincher, while many other Facebook users said they hoped she was with angels in heaven.
But not all the chatter was positive. Mike Mace, whose own Facebook page says he is a member of the Canadian military, faced a flurry of shaming comments after he posted a comment mocking her death.
In the comment, he suggested that it was not the bully's fault that she showed her breasts and gave out her private information on the Internet.
"You should be ashamed of yourself," wrote Amber Garofano in response to Mace's comments. Another woman, Ashley Soucy, posted, "have a heart."
Offensive photos posted on the memorial pages, including one of a person holding a gun to their head, also stirred up controversy. One Facebook user called Joseph Lopez posted a picture of Clorox bleach with a caption that read "it's to die for," and in response to the outrage wrote that he did it because he thought it was funny.
In September, Amanda posted a video to YouTube entitled "My Story: Struggling, bullying, suicide and self harm."
From Facebook's terms and conditions, under Safety:
You will not bully, intimidate, or harass any user.
You will not post content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence.
You will not use Facebook to do anything unlawful, misleading, malicious, or discriminatory.
As for reviewing content, all they need is the processing power, which they have.
The terms and conditions may allow for it, but if they actually did inspect everything posted, the site would be dead in a matter of weeks, especially if they were reporting offences to the authorities, as most people wouldn't use it out of fear.
So, as I posted before, you can argue all you like that they should implement measures that would most likely bankrupt them, but you shouldn't be surprised when they don't listen to you.
As I also said earlier in this thread, there are a million ways to solve the problem; requiring Facebook to inspect everything posted isn't the easiest, cheapest, or most reasonable of them. Are you advocating the same for all websites that allow user comments, or just Facebook?
What about politics.ie? Do you think it should be made to screen all posts? How long do you think it would be able to keep going if that were the case?
It is simply not a viable option for any company to screen all content. The people who post on the website are responsible for what they post.
I am very sorry that this poor girl suffered so horribly.
However I do not see this as a failure on the part of Facebook (though they should be more responsive to 'take-down' requests).
The failure here is in parenting. Children need to be supervised when online; they are not mature enough to cope with the false intimacy of internet relationships.
By the looks of things, you think T&Cs mean things they don't, and that companies keep surplus processing capacity to inspect all content on their site and pre-screen it against their T&Cs, or something crazy like that.
I suppose they already have the software written to do it too, and just haven't bothered to turn it on because they like throwing money away.
Facebook, like any other corporate entity, has to have a corporate social responsibility ethic. You can't just put a product on the market, step back, and deny all accountability for the role that product plays in events like this particular tragic story. They need to engage on this and do something about it.