Sustainalytics Insight: Meta Content Moderation – Risk Considerations for Investors
Meta’s recent changes to its content moderation and fact-checking policies may pose material risks for investors, according to new insights from global sustainable investment data, risk ratings and research provider Morningstar Sustainalytics.
In a recent post, ESG Research Director Jennifer Vieno digs into the potential impacts of the changes to Meta’s content moderation policies for its Facebook, Instagram and Threads platforms, announced earlier this year. Notably, Meta will transition its fact-checking program toward a “community notes” approach (a user-generated system), loosen some of its hate speech rules, and will no longer demote content that it has fact-checked and deemed false.
Should investors be concerned? There are two resulting risks that investors should consider, according to the new report:
- Risk One – Societal: Decreasing the level of content moderation on social media platforms can increase the potential for disinformation and misinformation. This can in turn increase the potential regulatory, reputational and legal risks for companies such as Meta that get pulled into these issues.
- Risk Two – Financial: Meta generates nearly all of its revenue from advertising. If advertisers – or potential advertisers – move away from Meta to protect their brands from appearing next to inappropriate content, it could hurt the company’s bottom line.
Jennifer Vieno - ESG Research Director, Technology Media & Telecommunications, Morningstar Sustainalytics
“Morningstar Sustainalytics rates Meta’s unmanaged product governance risks as high. These risks are likely to increase with Meta’s changes to its content moderation approach. Over the next year, we’ll continue to review the implications of this approach for Meta within our ESG Risk assessment of the company.”
To speak in more detail with Jennifer, reach out to Tim Benedict at tim.benedict@morningstar.com or (203) 339-1912.