NEW YORK —
Mobile video is changing the way we witness crime, from live footage of a mentally disabled man tortured by four assailants, to a recording that led to the manslaughter conviction of an Israeli soldier, to the body cameras designed to keep police accountable.
We’re all still wrestling with the implications.
In theory, such videos should make it easier to hold criminals, including police officers who violate the law, accountable. In practice, that hasn’t always worked out the way proponents had hoped, although smartphone video played a big role in elevating public awareness of police violence.
And sometimes the presence of a camera might actually encourage criminal activity, or at least deter bystanders from helping victims.
SCENE OF THE CRIME
It’s not clear how any of that might have played out in an incident this week in Chicago, when attackers used Facebook Live to stream the beating and torture of a man with mental health problems. They threatened him with a knife, cut off his clothing and forced him to drink from a toilet.
The assault went on for up to two days, according to police, though it’s unclear how much of it was streamed on Facebook Live. Police have arrested four people in connection with the crime.
Last year, an Ohio woman pleaded not guilty to charges of rape, kidnapping and other crimes for live-streaming the rape of a friend on Periscope, Twitter’s live-streaming app. Prosecutors said Marina Lonina continued to film the assault despite the victim’s cries for help, caught up in the attention the livestream was getting. Lonina’s attorney said she was recording the attack as evidence.
During a Thursday press conference about the Chicago assault — itself live-streamed on Facebook — Chicago Police Cmdr. Kevin Duffin noted, “I can’t understand why anyone puts anything on Facebook.”
Facebook says it does not allow people to “celebrate or glorify crimes” on its site. It has already removed the original video of the Chicago incident for that reason.
But the social network does allow crime video when people share it “to condemn violence or raise awareness about it,” the company said in an emailed statement.
That can lead to tricky assessments of intent. Facebook, for instance, wrote in a blog post that it would allow a violent video posted by someone who used it to help find the shooter, but would remove it when posted by another person who mocked the victim or celebrated violence.
Facebook generally tries to avoid making such judgments, preferring to rely on algorithms that automatically filter out banned content such as pornography. When it makes exceptions, it often wades into difficult territory — such as the time it was forced to restore the Pulitzer Prize-winning “napalm girl” photo after removing it because it featured a naked child.