Did you Tricolorize your Facebook profile after the Paris atrocities? Huge swathes of people did, believing it was a simple way to show digital solidarity with the French. And the nice people at Facebook made it easy. They even turned on the safety check function that had only ever been used for natural disasters. Which is great – except that it illustrates a huge Western bias at this most global of social networks.
You see, there was never an equivalent option when atrocities happened in Africa or the Middle East.
If Facebook is truly global, why didn’t it offer users the chance to overlay the Kenyan flag in support of the 147 students killed by al-Shabaab in April?
And it’s not just Facebook that is using technology to mask editorial decisions. You know how Google search works, right? All those virtual robots, prodding every visible web page and interrogating the content. Not just the words in an article, but the names of images and the captions that go with them. And so much more. This is basic SEO stuff.
Except that doesn’t mean Google’s search results are unbiased. In fact, Google wields immense power when its human executives make huge editorial decisions. And unlike a newspaper, you don’t realise those decisions are being taken.
In his TED Talk below, Andreas Ekström highlights two similar cases that elicited vastly different responses because the people at Google made a clear, moral decision about right and wrong, about good and bad. You’ll probably agree with the decisions they took… but are you aware that these calls are being made? It’s always worth keeping your eyes open.
P.S. There’s an insightful piece about the ‘war’ between Media and Tech over at Medium that’s well worth a read too. Its core question:
“What will be the compromise at the core of a new new media?”