Should It Be Removed? Facebook Seeks Help to Deal with Brazilian Post Against Lockdown

Published on June 9, 2021


Another incident taking place in Brazil has been referred to Facebook’s Oversight Board, the body responsible for reviewing the company’s decisions on removing or keeping accounts and content on the platform. This time, the case involves the battle against Covid-19 and the various (and more or less scientific) narratives that have flourished on social media.
In March 2021, according to the case description, a state-level medical council in Brazil posted an image on its Facebook page containing text opposing the adoption of lockdown measures and restrictions on the population's movement as ways to reduce the spread of Covid-19.

The text claimed that such measures were not only ineffective but also violated the Federal Constitution and had been condemned by the World Health Organization (WHO). To illustrate the health organization's alleged disapproval of lockdowns, the council's post featured a quote attributed to Dr. David Nabarro, who works with the WHO, stating that "the lockdown does not save lives and makes poor people much poorer."

The fact-checking project Comprova, of which UOL is a member, has already identified other posts that misrepresent the WHO consultant's words. In the same interview, Dr. Nabarro said that lockdowns are justified in times of crisis to reorganize healthcare systems and protect Covid-19 frontline workers.

Written shortly after the crisis caused by the oxygen shortage in hospitals in Manaus, the text posted on Facebook claimed that the state of Amazonas saw an increase in deaths and hospital admissions right after putting a lockdown into effect, which allegedly proved the ineffectiveness of such restrictions.

The text further stated that restrictive measures such as lockdown would lead to an increase in mental disorders, alcohol, and drug abuse, as well as economic damage.

According to the case description, the post was viewed about 32,000 times and shared more than 270 times. No users reported the post on the platform, and Facebook did not moderate the content, opting instead to refer the case to its Oversight Board.

The body reported that, in referring the case, Facebook had initially found that the post did not violate the company's policies. According to the big tech company, content infringing its Misinformation and Harm policy is removed from the platform "when public health authorities conclude that the information is false and likely to contribute to imminent violence or physical harm." In Facebook's assessment, the medical council's post does not fall into this category. Although the WHO and other authorities had advised the company to remove posts opposing health practices used to tackle Covid-19, such as social distancing, the corporation stated that those guidelines did not include removing posts advocating against lockdowns.

Facebook's Oversight Board has opened a public consultation on the case. Interested parties can contribute to the body's decision-making until June 16.

Among the points on which the Board is seeking more information are:
Comments on whether Facebook was right or wrong to leave the content on its platform in light of its policies, values, and guidelines concerning the protection of human rights;

Contributions on whether any form of moderation other than removal would have been more appropriate (such as adding warnings or informative links to the post); and

Information on the context of online posting in Brazil, emphasizing possible contradictions between publications uploaded by state-level medical councils and the recommendations of international health authorities, as well as their impact on the battle against Covid-19 in the country.

If the Board agrees with Facebook that no moderation was required, many are expected to say that the body has sanctioned the ineffectiveness of lockdowns (when, in fact, it would only be evaluating how that given post fits into the company’s rules).
Additionally, the decision not to moderate the content would cast doubt on the company’s efforts to combat misinformation during the pandemic, since the WHO consultant’s speech was taken out of context and used for purposes contrary to those recommended by the organization itself. Similar cases of misinformation in the United States about fighting the pandemic have been targeted for moderation by the platform.

But if the Board rules that Facebook has committed a mistake, removing the post at this point will certainly add fuel to the flames of the discussion on the role of platforms in controlling what is voiced and published on their social media.
Surely some will argue that moderation, however late, is a form of censorship. In this regard, Brazil's federal government has drafted a decree to limit the ability of internet platforms to moderate content and delete their users' accounts. If it took effect, Facebook would be unable to remove, tag, or limit the visibility of posts.

Either way, the Facebook Oversight Board's decision in the case of the post opposing lockdowns will be yet another troublesome chapter in the narrative disputes that dominate the debate over how to tackle the pandemic on social media. To paraphrase Brazilian physician Dr. Luana Araújo in her testimony before a Parliamentary Inquiry Commission, "this decision will inevitably echo at every edge of the flat Earth."
