Mark Zuckerberg seemed to be in a jovial mood as he opened the F8 Developers Conference, joking that F8 2017 was happening around the same time as the release of Fast and Furious 8. He followed this up by holding up a tome, a playful nod to the nearly 6,000-word manifesto he had written earlier about building global communities.
A couple of awkward jokes later, there was a poignant moment during the keynote. Zuckerberg said, “We have a full roadmap of products to help build groups and communities, help build a more informed society, help keep our communities safe. And we have a lot more to do here. We were reminded of it this week by the tragedy at Cleveland.”
On Sunday, 37-year-old Steve Stephens murdered an elderly man in Cleveland, Ohio, and then posted a video of the crime on Facebook. The video stayed up for more than two hours before it was taken down, which is a long time for footage of a violent crime to remain online. Naturally, Facebook came under heavy criticism for the delay, especially since Stephens boasted that he had already killed 12 people and would continue to kill.
Facebook Live: Boon or Bane?
According to NBC News, Stephens had posted two additional live videos as well — the first when he announced his intent to kill and the second after the murder, where he confessed to it.
Stephens was found dead after a police pursuit about an hour before the F8 keynote began.
“Our hearts go out to the family and friends of Robert Godwin Sr., and we have a lot of work — and we will keep doing all we can to prevent tragedies like this from happening,” said Zuckerberg.
Here is a timeline of the events on Sunday, according to Facebook:
11:09 AM PT: First video, of intent to murder, uploaded. Not reported to Facebook.
11:11 AM PT: Second video, of shooting, uploaded.
11:22 AM PT: Suspect confesses to murder while using Live, is live for 5 minutes.
11:27 AM PT: Live ends and Live video is first reported shortly after.
12:59 PM PT: Video of shooting is first reported.
1:22 PM PT: Suspect’s account disabled; all videos no longer visible to public.
According to sociologist Shiv Viswanathan, such acts are an attempt by the perpetrators to make a spectacle of themselves, turning their own body, or their victim’s, into an object of the gaze. “You objectify it. You basically frame it in a way that other people can consume it. Violence itself is a form of consumption which is promoted, because so far no one thought that people would consume violence on Facebook,” says Viswanathan.
Facebook needs to take up moral responsibility
Facebook has always insisted that it is a platform, not a publisher, even though it provides all the tools anyone needs to become a publisher. And with more than two billion people using it, that publisher has a ready audience too.
Facebook Live is a great tool if used correctly. It has been effective in shining a light on issues when mainstream media has been stifled. It has been instrumental in bringing to the fore atrocities that people have faced at the hands of those who were supposed to be their protectors. Media houses routinely use Facebook Live to inform the public and answer their questions in real time. The possibilities are immense.
But every coin has two sides. Just as Facebook Live can be used for the good of society, there are those, like Stephens, who will take it to the other extreme. Facebook’s policy on Live videos has specific guidelines: videos are not immediately taken down merely because they show violence. The exception is when a video glorifies violence, as was the case with Stephens.
Under its Community Standards, Facebook has this to say about “Violence and Graphic Content”:
Facebook has long been a place where people share their experiences and raise awareness about important issues. Sometimes, those experiences and issues involve violence and graphic images of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, they are condemning it or raising awareness about it. We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence.
Recently, a young college student from Mumbai used Facebook Live to give tips on committing suicide, after which he jumped to his death from an upper floor of a five-star hotel, all while the Live broadcast was rolling. This was not even the first suicide committed live on Facebook; there have been several such instances, each as sad and shocking as the last.
“Murder or suicide is an intimate act, and these videos go on to show that if these acts are not put up on Facebook, they are in a sense, incomplete. So the production and consumption of violence is merging. All these things are new and they are quite frightening. We haven’t really thought deeply about these things,” says Viswanathan.
So when Zuckerberg says, “we will keep doing all we can to prevent tragedies like this from happening,” after so many tragedies have already taken place on his platform, the words tend to sound a bit hollow.
What Facebook is doing on this front
Clearly, there is no way for Facebook to know beforehand whether a violent act will be committed on Facebook Live. That makes it all the more important to put strong measures in place to deter such incidents. With a user base this large, there is also the possibility of copycat crimes on Facebook Live, just as happens in the real world.
Facebook has announced that it will integrate real-time suicide prevention tools into Facebook Live, and will offer live-chat support from suicide prevention helplines and the Crisis Text Line through Facebook Messenger.
Facebook also said it would add tools to make it easier to report suicide or self-harm, and is planning to employ artificial intelligence to identify warning signs of self-harm and suicide in Facebook posts and comments.
These are good starting points. But Zuckerberg needs to go beyond lip service in such cases. As a platform with some two billion active users, the dynamics for Facebook are completely different from those of other social media sites.
TV broadcasters that go live from locations follow a set of guidelines, and there are consequences for broadcasting illegal activities on TV. Facebook may not be in the same league as a TV station, and as a platform it is not subject to the same stringent checks. That makes self-regulation, especially of illegal activities happening on its platform, something Facebook has to invest real resources in.
How does this affect the society we are living in?
According to sociologist Nandini Sardesai, these incidents show the flip-side of social media and could lead to scary situations if not controlled in time. “Back in the day, technology was used to communicate, to reach out to people. You could never imagine someone picking up a telephone and start off by abusing the person on the other end. Though technology has helped us communicate better, I think a lot of people are just using it to spew venom, vent out their anger and frustration, which is not a good thing,” said Sardesai.
Incidents such as suicides or murders being broadcast live on Facebook simply show the negative aspects of human nature, according to Sardesai. She feels Facebook does have to take responsibility, but adds that social media is not solely to blame.
Social media can be looked at, in a way, as an electronic enemy, because there is a sort of boundlessness to this kind of medium, says Viswanathan.
“Sure, these platforms may be used to showcase the crimes, but the onus also lies with every one of us to call these things out. There has to be a boomerang effect rather than a domino effect. People must be punished for committing crimes on social media. Only then will there be deterrents,” said Sardesai.
According to Viswanathan, even though social media sites call themselves platforms, they cannot be carte-blanche platforms where anything goes. “I think both the platforms and the people consuming this sort of content are at fault. It’s a reciprocal relationship. Once people consume this content, they talk about it and that somehow helps in its propagation. There should be some intervention there,” says Viswanathan.
Then there is the fear of copycat suicides or murders, just as in the real world. “Yes, there will be a few cases of copycat suicides, but let’s just hope it is a fad and it dies off sooner rather than later,” says Viswanathan.
So while Mark Zuckerberg has said that ‘more needs to be done’, it is high time he spelled out how this issue will be resolved. Limiting the Live feature to select users just because some have misused it would be counterproductive. Nor is Facebook likely to do so: its bottom line depends on keeping users engaged, and Facebook Live videos drive a lot of engagement. But all said and done, so long as illegal activities are broadcast on Facebook’s watch, the responsibility will come with it.
Just as Zuckerberg has become very specific about what Facebook intends to do to tackle the menace of fake news, he needs to be similarly proactive about live broadcasts of criminal activity on the platform. The artificial intelligence possibilities showcased at the F8 keynote were impressive; some of that AI goodness has to trickle down to help Facebook deal with these grey areas. Let’s just hope that happens sooner rather than later.