Misinformation covered by the new policy includes content related to dangerous medical treatments, lies about Covid-19 vaccines, falsehoods “promoted by conspiracy networks tied to violence,” content that “undermines the integrity of a civic or political process” — including lies about election fraud — and content that could harm people during emergencies like wildfires and shootings.
The policy will also apply to Russian state-controlled media channels that spread misinformation, Twitch said, adding that it had found only one such channel so far, with very little activity.
Twitch generally has had stricter rules than other social media platforms about what views its users can express. But in 2020, after platforms like YouTube and Twitter clamped down on far-right conspiracy theorists promoting false theories about the presidential election, Twitch saw an uptick in such streamers, who used it as a new place to earn money and spread lies.
Followers of the baseless QAnon conspiracy theory — which posits that former President Donald J. Trump is fighting a cabal of Democratic pedophiles — were particularly well represented among this group of several dozen Twitch users.
In April, Twitch told The New York Times that it was developing a misinformation policy. It said it would “take action against users that violate our community policies against harmful content that encourages or incites self-destructive behavior, harassment, or attempts or threatens to physically harm others, including through misinformation.”
Also last year, the company announced a policy that would allow it to suspend the accounts of people who have committed crimes or severe offenses in real life or on other online platforms, including those who engaged in violent extremism or were members of a hate group.
QAnon content, though, was still allowed; Twitch said at the time that it did not consider QAnon a hate group. A Twitch spokeswoman said the new policy covered QAnon as a conspiracy theory that promoted violence.