‘Bizarre Decisions From Facebook Call Into Question Moderation Systems’ – CounterSpin interview with Jon Lloyd on Facebook disinformation


Janine Jackson interviewed Global Witness’s Jon Lloyd about Facebook disinformation for the August 19, 2022, episode of CounterSpin. This is a lightly edited transcript.

Audio: CounterSpin220819Lloyd.mp3

Janine Jackson: Social media platforms’ role in shaping the sharing and fomenting of ideas that they purport to merely facilitate is a widely diagnosed concern. As with any kind of media criticism, it’s important to look at broad patterns of societal impact and to track and unpack the distortions of these media in real time, as they have important real-time, real-world effects.

Our next guest’s recent work does both, really. Global Witness has been monitoring Facebook‘s failure to check outright disinformation in the run-up to elections in Brazil. Jon Lloyd is senior advisor at Global Witness. He joins us now by phone from London. Welcome to CounterSpin, Jon Lloyd.

Jon Lloyd: Thanks for having me.

JJ: Before we talk about what you found, let me ask you why you chose to conduct the inquiry. What were the questions or concerns that drove your investigation into Facebook‘s role in Brazilian elections?

JL: The reason we chose Brazil is that we've seen the choices of the world's major tech companies have a big impact online, before and after high-stakes elections around the world. And all eyes are on Brazil this year.

Guardian: WhatsApp fake news during Brazil election ‘favoured Bolsonaro’

The London Guardian (10/30/19) reported that during the 2018 Brazilian presidential election, 42% of viral right-wing messages on Facebook-owned WhatsApp contained false information, versus less than 3% of viral left-wing messages.

The reason is that disinformation featured heavily in the 2018 election, and this year’s election has already been marred by reports of widespread disinformation spread from the very top. President Bolsonaro is already seeding doubt about the legitimacy of the election results, and that’s leading to fears in Brazil of a January 6–style, “Stop the Steal” kind of coup attempt.

In addition to that, we’ve also done some research into Facebook‘s ability to detect hate speech in other places it has designated “priority countries”: Myanmar, Ethiopia and Kenya. And what we found in those investigations was that they didn’t detect any of it, really with no explanation.

So we thought Brazil was a good opportunity to see if they’re putting their money where their mouth is, so to speak. They have highlighted Brazil as a priority country when it comes to elections, and really, outside of the US midterms, there is no bigger election this year.

JJ: Well, then, tell us about the investigation itself. What did you do exactly, and what did it tell us?

JL: First, we sourced ten examples of election-related disinformation. Some were real-life examples, and others we pulled from the Brazilian Superior Electoral Court’s Counter Disinformation Program. The Superior Electoral Court has said it’s been working with social media companies to help identify and debunk some common election disinformation.

So we chose examples that largely fell into two categories. The first was outright false election information: [ads] that gave the wrong voting day, or false claims about how to vote—for example, instructions on how to vote by mail, which is banned in Brazil.

And then we had a second category of ads: content aimed at delegitimizing the election result, specifically about Brazil’s voting machines, which the country has used without incident since 1996. So we created those ads, and then we set them up with an account that should have gone through Facebook’s ad authorizations process—that’s where an account posting political, social-issue or election-related content has to be verified.

Global Witness: Facebook fails to tackle election disinformation ads ahead of tense Brazilian election

Global Witness’s investigation (8/15/22) found Facebook consistently approved ads from an unauthorized account with election disinformation, including ones advertising the wrong election day.

Really, we broke all the rules when it came to setting up that account: We set it up outside of Brazil; we used a non-Brazilian payment method; we posted ads while I was in Nairobi, and then back here in London, which is not allowed. And, of course, I’m not Brazilian—you need to be Brazilian and present ID.

So there were lots of opportunities for Meta to detect that this was an inauthentic account. We created that account, and then we submitted our examples of disinformation. And all of them were accepted.

JJ: All of them. All of them, including the ones that said the wrong day on which you should vote.

JL: Yes! And actually, one of the ads we submitted was initially rejected under Facebook‘s policy on ads about social issues, elections or politics, but just six days later, without any intervention from us, the ad was approved, again without any explanation.

So this bizarre sequence of decisions from Facebook seriously calls into question the integrity of its content-moderation systems—especially, I think, because that was another opportunity for some sort of additional review, both of the authenticity of our account—we weren’t supposed to be allowed to post any political content—and of the other ads that we posted. So it was quite confusing, and quite concerning, too.

JJ: Absolutely, and disheartening.

You have stated that you’ve also looked at Myanmar, Ethiopia and Kenya. So this isn’t just out of the blue; this is something that you chose to look at, Brazil, because there have been pre-existing problems and issues with this content-moderation process. So in other words, you would think that Facebook would be being extra-vigilant at this point, having already been called out on this in the past.

Jon Lloyd

Jon Lloyd: “Facebook will tout the ability of its content-moderation systems to pick this stuff up. And we just bypass it so easily.”

JL: Absolutely. And it’s really part of a trend, which is, Facebook will tout the ability of its content-moderation systems to pick this stuff up. And we just bypass it so easily.

And one thing that I’ll just say, that is important to note, is the reason that we choose ads is because we can schedule those ads in the future, and they still go through that same content-moderation process, but nobody ever ends up actually seeing the content. We can see that the ads go through the content moderation process and are approved, but then we take them down before the scheduled launch date of those ads.

But as far as we know, the content-moderation process is exactly the same for ads as for the organic content that people just post on Facebook. And if anything, it sounds like moderation of election-related ads is even stricter.

JJ: I appreciate that clarification.

You have stated that Facebook knows very well that its platform is used to spread election disinformation and undermine democracy around the world.

I’ve not read the very latest “shocked, simply shocked” corporate response, but it doesn’t matter, because we judge them by their actions and not by their press releases. So what are you at Global Witness, and I know others as well, calling for at this point? What needs to change from Facebook, and then maybe in terms of public understanding of or reckoning with Facebook?

JL: Yeah, we’re asking Facebook, really, to take this seriously. It has to consider all of this, putting our safety as a priority, as a cost of doing business. And with the US midterms around the corner, they have to get it right, and right now.

Our recommendations fall into two main categories. One is around resourcing and the other is around transparency. So we want to make sure that they properly resource the content-moderation and the ad-account verification processes, just getting all of that up to scratch.

But then on the transparency side, crucially, we need them to show their work. It’s not enough to dazzle us with statistics that have no frame of reference. We don’t know what the denominator is, so when they say they’ve removed 1,000 accounts or 100,000 accounts, I don’t know if that’s good or bad. Same with the number of posts, because there’s nothing to compare it to.

But the one thing we do know is that all the content we tested from my computer here in London got through. So ultimately, it comes down to resourcing its content-moderation capabilities, and deploying those integrity systems globally as well, not just in the countries it thinks are more important.

And then we want them to publish the risk assessments they do for each country as well. We know they’re likely to have done one for Brazil, and, really, we want to make sure that in languages other than English, and in countries other than the United States, they’re actually doing what they say they’re going to do.

So perhaps that means some verified, independent third-party auditing, so that Meta can be held accountable for what they say they’re doing, and aren’t just left to mark their own homework.

Then when it comes to people like me and you, there’s a real opportunity to be a bit skeptical about what you’re seeing online, even with things like the “paid for” disclaimers—we weren’t required to put one on any of our content, because we bypassed the political-ad authorization process.

So even with things like that, I think it’s worth doing a little bit of additional research if you’re seeing something shocking; it’s probably designed to be a bit shocking. You want to verify it from trusted sources.

JJ: All right, then. Well, thank you very much. We’ve been speaking with Jon Lloyd. He’s senior advisor at Global Witness. You can find their work online at GlobalWitness.org. Thank you so much, Jon Lloyd, for joining us this week on CounterSpin.

JL: Thank you for having me. Cheers.


This content originally appeared on FAIR and was authored by Janine Jackson.

