One of the most chilling aspects of Elliot Rodger's killing spree in Santa Barbara, Calif., on Friday was that he foretold his actions in a YouTube video posted earlier that day.
Despite the ease with which people can post and view online video, cyber-law experts say it is effectively impossible to monitor a site such as YouTube closely enough to prevent similar crimes.
"It would take thousands of people working 24-7 to even dip into that … torrent of stuff," says David Fraser, an internet and privacy lawyer with the Halifax firm McInnis Cooper.
And it is not just a matter of volume. "It becomes a very difficult exercise in trying to figure out what is real and what is hyperbole, what is an extreme political position and what is hate speech," he adds.
- California shootings: Parents heartbreakingly raced to stop killings
- California shootings: What we know about the victims
- #YesAllWomen: Twitter responds to alleged California shooter's YouTube video
On Friday night, Rodger, a 22-year-old living in southern California, killed six people in the Isla Vista community of Santa Barbara before killing himself. Thirteen others were injured during the rampage.
In a video posted to YouTube earlier on Friday entitled "Elliot Rodger’s Retribution," he said he had been "forced to endure an existence of loneliness, rejection and unfulfilled desires."
Speaking into the camera, he said, "You girls have never been attracted to me. I don't know why you have not been attracted to me, but I will punish you for it.
"I'll take great pleasure in slaughtering all of you," he said.
Viewers police site
YouTube removed the video on Saturday. Since the incident, authorities have determined that Rodger had posted at least 22 YouTube videos in the last year.
One of the recurring questions since Friday has been whether the tragedy could have been averted, given that Rodger had posted a number of threatening videos on YouTube.
Fraser says catching such videos ahead of time is nearly impossible, given the sheer volume of content posted to the site. According to a YouTube spokesperson, 100 hours of video are uploaded to the site every minute.
The site's guidelines prohibit videos containing graphic violence or sexual activity, hate speech and copyright infringement.
YouTube has, in fact, developed an automated system that identifies videos infringing on copyrighted material. Called Content ID, it allows copyright owners such as film studios and record companies to identify their own material on YouTube.
According to a tutorial posted on Google, which owns YouTube, videos "uploaded to YouTube are scanned against a database of files that have been submitted to us by content owners."
When a newly uploaded video matches material submitted by, say, a film studio, the uploader receives a notice saying that their clip contains copyrighted material. The copyright owner can then decide whether or not to ask YouTube to remove the video.
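To make the mechanism concrete, here is a minimal sketch in Python of that kind of reference matching. The database, function names and use of a plain file hash are assumptions for illustration only; YouTube's actual system relies on proprietary audio and video fingerprinting that survives re-encoding and editing, which a simple byte hash does not.

```python
# A minimal sketch, in the spirit of the Content ID workflow described above:
# uploads are checked against a database of reference files submitted by
# content owners. The plain SHA-256 hash is an illustrative assumption.
import hashlib
from typing import Optional

reference_db = {}  # hypothetical store: fingerprint -> content owner


def register_reference(owner: str, media_bytes: bytes) -> None:
    """A content owner submits a reference file to be matched against."""
    reference_db[hashlib.sha256(media_bytes).hexdigest()] = owner


def check_upload(media_bytes: bytes) -> Optional[str]:
    """Scan a new upload against the reference database.

    Returns the owner of the matched material, or None. In the workflow
    the article describes, a match triggers a notice to the uploader, and
    the copyright owner decides whether to request removal.
    """
    return reference_db.get(hashlib.sha256(media_bytes).hexdigest())


# Example: a studio registers a clip, then an identical re-upload matches.
register_reference("Example Studio", b"...trailer bytes...")
if check_upload(b"...trailer bytes..."):
    print("Upload matches reference material; notify uploader and owner.")
```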
But there does not appear to be a similar automated system for offensive content; to monitor that, YouTube relies on its users.
Review team
"Videos threatening violence are against YouTube’s guidelines and we remove them when they are flagged," a YouTube spokesperson said in an email statement.
"We encourage anyone who sees material that they think crosses the line to flag it for us."
Each YouTube video contains a "Report" button that allows users to send a complaint to the site if they see a video they deem offensive.
Google did not elaborate on how it arbitrates YouTube complaints, but in a New York Times piece published on the weekend, a Google representative said that after a video is flagged, it is scrutinized by special review teams that determine whether it will ultimately be taken down.
Reporting an offensive video is no guarantee that it will be removed, says Toronto internet lawyer Gil Zvulony, who has dealt with a number of clients who have complained about libellous material on YouTube.
"I've had clients come to me and they've reported stuff [posted online] that was really bad and nasty about them, and YouTube's done nothing about it," says Zvulony.
Freedom of expression
Google told the New York Times that Elliot Rodger's video had been taken down because it violated the service’s guidelines against threats and intimidating behaviour.
Fraser says that part of the problem is that freedom of expression laws in Canada and the U.S. protect "everything that is speech, short of a death threat."
Freedom of speech is "your right to say things free from government interference," says Fraser. But a platform such as Facebook or an internet service provider can nonetheless impose restrictions on graphic material and threatening posts in their terms of use.
Facebook uses a combination of tools to weed out troublesome content, including automated keyword searches and an "active reporting team devoted to filtering through requests" from Facebook users, according to a Facebook spokesperson.
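As a rough illustration of what such automated screening might look like, here is a minimal sketch in Python of a keyword filter that routes matching posts to a human review queue. The keyword list, names and matching rule are assumptions made for illustration, not Facebook's actual tools or criteria, which are not public.

```python
# A minimal sketch of automated keyword screening feeding a human review
# queue. The watch list and substring matching are illustrative assumptions.
from collections import deque

FLAG_KEYWORDS = ("kill", "slaughter", "shoot")  # hypothetical watch list
review_queue = deque()  # posts awaiting a human reviewer's judgment


def screen_post(post_id: str, text: str) -> bool:
    """Queue a post for human review if it contains a watched keyword.

    Returns True if the post was queued. Crude substring matching also
    flags harmless uses ("a killer workout"), which is why a reviewer,
    not the filter, makes the final call on intent.
    """
    lowered = text.lower()
    if any(keyword in lowered for keyword in FLAG_KEYWORDS):
        review_queue.append((post_id, text))
        return True
    return False


screen_post("post-1", "I'll take great pleasure in slaughtering all of you.")
print(len(review_queue))  # 1 -- flagged for review, not automatically removed
```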
Even so, Fraser says it remains difficult to distinguish the intent behind millions of hateful online rants from that of a video such as Rodger's, which foreshadowed a massacre.
To illustrate the difficulty of discerning a person's true intent, Fraser cites a recent example from Nova Scotia, where an individual announced on Twitter that they were going to commit suicide.
The social media community spotted the tweet and mobilized to alert police, only to discover the crisis was a hoax.
"There is a lot of stuff out there that on the one hand might appear to be a literal threat, but is really not," Fraser says.
"I would not want to be in the position of having to make those judgment calls on a daily basis."
Zvulony acknowledges that Rodger's case has again shone a spotlight on the role of social media in crime prevention. But he cautions against trying to change cyber law based on a "freak occurrence" such as this, where an individual broadcasts his intent to kill right before doing so.
"I don't know if we should be making policies based on these freak occurrences," Zvulony says.
"This is a difficult case, and it was very public, but I don't think we should be looking at YouTube as the culprit here."