There are “further questions to be answered” by Silicon Valley’s tech giants following the terror attack in Christchurch, according to New Zealand’s prime minister – but she might find the answers unsatisfactory.
Facebook, YouTube, Twitter and others ‘broadcast’ footage which showed the massacre.
I use the word ‘broadcast’ deliberately – and some of the internet companies don’t like it.
They do not see themselves as ‘broadcasters’ but as ‘platforms’ where people just happen to place their content – more like your kitchen table or your shopping bag than ITV or the BBC.
This definition is at the heart of the issue of who should take responsibility for the live-streaming of Friday’s killings.
It’s why Facebook would rather talk about their efforts to take down offensive video than the fact they helped to broadcast it in the first place.
“We continue to work around the clock to remove violating content using a combination of technology and people,” said Mia Garlick, a spokeswoman for Facebook New Zealand.

“In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload,” she added.
Of course, just one video was one too many.
If Facebook’s answers are familiar, so are Prime Minister Jacinda Ardern’s questions.
Some were asked following a shooting at a video game tournament in Jacksonville, Florida last year, which was broadcast live online via Twitch, a video-streaming site.
Many of the same themes were raised after the killing of Fusilier Lee Rigby in 2013, when Facebook was accused of failing to pass on information that could have prevented the murder.
The website was accused of becoming a “safe haven for terrorists” – an allegation that still stands.
When I visited the giant Facebook campus near San Francisco a year ago, it was difficult not to compare it with the much smaller headquarters of broadcasters in the UK, where producers and lawyers grapple over contentious material and are forced to take responsibility for every frame that goes out.
Once again, social media platforms stand accused of helping to turbo-charge modern terrorism.
They have helped to provide white supremacists with an ideological echo-chamber – I read hateful messages every day.
As for broadcasting the moments of death, they have yet to provide an answer that is likely to satisfy.