A group of more than 500 doctors, nurses, and other healthcare professionals is calling on Meta to rein in medical misinformation on Facebook and Instagram. They’re specifically concerned about COVID-19 misinformation and content that negatively affects children’s and teenagers’ mental health.
“In our exam rooms and our hospitals, we’ve seen the detrimental impact that the mis- and disinformation circulated like wildfire across Meta’s platforms has had on individuals and communities alike,” the providers’ letter reads. “Meta has a responsibility to address its role in public safety and hold Mr. Zuckerberg accountable because disinformation and misinformation spread fastest on Facebook, compared with all other social media platforms.”
The message lambasts Meta for enabling the spread of COVID-19 misinformation on its social networks, which signatories say has led to “real-world health consequences,” such as patients’ refusal to wear masks, vaccine rejection, and failure to comply with social distancing and quarantine measures.
“As physicians, nurses and medical professionals, we urge you, the shareholders of Meta, to support governance reforms to combat the rampant misinformation and disinformation enabled by the corporation,” the letter says.
The open letter was organised by the Committee to Protect Health Care and is addressed to Meta shareholders, who are convening for the company’s annual meeting today. The group is asking shareholders to vote in favour of Proposal 14, an item on the meeting’s agenda that, if passed, would mandate an independent assessment of the performance of Meta’s Audit and Risk Oversight Committee. The committee assists the company’s board with risk management related to public safety and the public interest. Proposal 14 claims that Meta has “regularly” broken pledges to remove COVID-19 misinformation as well as other harmful content, such as content promoting alcohol and weight loss drugs targeted at adolescents as young as 13 years old.
“We can scream into the void, or we can do something like this,” said Dr. Rob Davidson, one of the letter’s signatories and executive director of the Committee to Protect Health Care, a national advocacy organisation.
The proposal calls for the independent review to provide the committee with mitigation measures to promote a culture of risk monitoring and accountability. These measures include: more access to internal and external experts on issues of significant societal risk and impact; a way for employees to anonymously report issues to the committee; and additional training to assess social impacts and risks.
Davidson told Gizmodo in a phone call that many in the medical community are deeply frustrated with the misinformation being spread on Meta’s platforms.
“What can you do? I don’t think we can call up Mark Zuckerberg and make him change this practice, but if you can do something within the governance of the corporation to make an impact… and help stop the spread of misinformation, maybe we can save a few lives down the road,” he said.
In its response to Proposal 14, Meta’s board of directors said it was “committed” to keeping people safe and preventing harm. The board also said that employees can already confidentially submit concerns to the committee on matters related to accounting, auditing, and other risks.
However, the board said that it did not feel that an independent assessment would result in any change and recommended shareholders vote against the proposal.
“Our board of directors believes that the preparation of the report contemplated by this proposal is unnecessary and not beneficial to our shareholders,” the board wrote in its response.
Gizmodo reached out to Meta for comment on the open letter but did not receive a response by the time of publication.