In a blog post this week, social media giant Facebook announced several changes intended to mitigate the effects of misinformation and hate speech causing real-world violence, including the formation of a Strategic Response team tasked with stopping a repeat of what happened in Myanmar.
Myanmar, where less than one per cent of the population had internet access as recently as 2014, became a Gods Must Be Crazy-like testbed for Facebook. For many in the region, Facebook is the one and only means of getting information online.
Myanmar is also, in Facebook’s own words, “the only country in the world with a significant online presence that hasn’t standardised on Unicode” — meaning many of the artificial intelligence tools meant to weed out harmful content simply didn’t work there.
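To make the encoding problem concrete, here’s a minimal sketch, assuming a crude codepoint heuristic. The function name and hint range below are illustrative assumptions, not Facebook’s actual detector. Most Burmese text online has historically been encoded in Zawgyi, which reuses codepoints from the Myanmar Unicode block for different glyphs, so software expecting standard Unicode Burmese reads something else entirely:

```python
# Toy illustration of the Zawgyi problem; NOT Facebook's detector.
# Zawgyi repurposes codepoints in the Myanmar Unicode block
# (U+1000-U+109F), so identical bytes render as different text
# depending on the installed font, and tooling that assumes standard
# Unicode misreads Zawgyi-encoded posts.

# Rough, assumed hint set: in standard Unicode these codepoints belong
# to extensions for other Myanmar-script languages, but Zawgyi reuses
# many of them for Burmese ligature forms (illustrative, not exhaustive).
ZAWGYI_HINT_CODEPOINTS = frozenset(range(0x1060, 0x1098))

def looks_like_zawgyi(text: str) -> bool:
    """Crude guess: flag text using codepoints Zawgyi repurposes."""
    return any(ord(ch) in ZAWGYI_HINT_CODEPOINTS for ch in text)

if __name__ == "__main__":
    # "Mingala" (as in mingalaba) in standard Unicode Burmese.
    unicode_sample = "\u1019\u1004\u103A\u1039\u1002\u101C\u102C"
    print(looks_like_zawgyi(unicode_sample))  # False: no repurposed codepoints
```

In practice, detection is done statistically rather than with a fixed codepoint list; Google’s open-source myanmar-tools library, for instance, ships a trained Zawgyi detector and a converter to standard Unicode.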
Couple that with a shortage of native-speaking moderators, weak civil liberties and ethnic tensions, and it’s little wonder Facebook played a starring role in stirring up an ongoing genocide there.
Related: “WhatsApp Puts New Limit On Message-Forwarding In Effort To Curb Misinformation” (https://gizmodo.com.au/2019/01/whatsapp-puts-new-limit-on-message-forwarding-in-effort-to-curb-misinformation/)
Similar incidents have cropped up in Sri Lanka, India and elsewhere. The company promised last November, as it always seems to be doing lately, to do better.
The first fruit of those labours is the Strategic Response team — a group of individuals who, according to reporting by NBC, include “former diplomats, human rights researchers, a former military intelligence officer and one person who advised oil giant BP on geopolitical risk”.
Facebook did not share how many people actually make up the team, but a member of the Myanmar Tech Accountability Network who has worked with the team claimed it numbers fewer than 10. For context, Facebook first became aware of its complicity in Myanmar’s Rohingya genocide at least five years ago.
(Speaking to NBC, the head of the Strategic Response team, Rosa Birch, said, “there’s a lot of similarities there between government and military and Facebook,” an alarming statement at a time when Facebook is also working on minting its own money.)
Surely Facebook is doing more than putting together some perfunctory advisory committee a la its ridiculous PR plays at election security “war rooms”, right? Barely.
The company’s top agenda item is that it will be “removing bad actors and bad content”. Some version of that phrase has appeared in dozens of press releases from Facebook, second only to maybe “continuing to improve”, and it doesn’t inspire much confidence given the seriousness of — and I can’t stress this enough — literal genocide.
There’s some cause for optimism, in that Facebook’s (self-reported) stats on catching bad content globally went up by around eight per cent; Facebook is also in the process of staffing up on Burmese speakers and transitioning Myanmar to Unicode.
Hand in glove with actually enforcing its own long-standing policies, the company claims it’s continuing to limit the distribution of content that approaches what’s off limits. The reason: internal research suggested that “borderline” content was being rewarded with increased engagement, incentivising exactly the wrong sort of behaviour.
The only discrete change Facebook deigned to tell users it’s willing to make, though, is “adding friction” to Facebook Messenger by limiting the number of times a message can be forwarded.
Chain messages, including those containing images and memes meant to inspire violence, were at the heart of issues in Myanmar, Sri Lanka and, let’s be honest, probably America too. According to TechCrunch (Facebook didn’t even bother to quantify it in its own post), the limit is five people, the same limit it places on WhatsApp message forwarding in India.
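For the curious, here’s a toy sketch of what that kind of friction amounts to mechanically. Everything in it is assumed for illustration — the function names, the enforcement point, even how the cap is counted — since Facebook hasn’t published how Messenger implements the limit, only the number five:

```python
# Hypothetical sketch of a forwarding cap; NOT Messenger's actual code.
FORWARD_LIMIT = 5  # the cap reported by TechCrunch

def deliver(message: str, chat: str) -> None:
    """Placeholder for the real send pipeline, which would enqueue
    the message for the recipient chat."""
    print(f"sent to {chat}: {message!r}")

def forward_message(message: str, target_chats: list[str]) -> list[str]:
    """Forward `message` to at most FORWARD_LIMIT chats in one action.

    Anything beyond the cap is refused outright, adding friction to
    chain-style sharing; returns the chats the message reached.
    """
    if len(target_chats) > FORWARD_LIMIT:
        raise ValueError(
            f"A message can be forwarded to at most {FORWARD_LIMIT} chats at once."
        )
    for chat in target_chats:
        deliver(message, chat)
    return target_chats
```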
Forgive me if none of these solutions seem up to the task of undoing the damage Facebook’s misinformation machine has already wrought abroad. As an unrelated reminder, here’s the link for deleting your Facebook account, if you still have one and feel so inclined.