TikTok has recently announced a new set of Community Guidelines that includes a specific section about sharing misinformation on its platform.
TikTok’s explanation:
“The Community Guidelines we’ve published today give users far more detail than previous versions. […] Users will also notice that we’ve grouped violations into 10 distinct categories, each of which includes an explanation of the rationale and several detailed bullet points to clarify what type of misbehaviour would fall into that category. These changes offer clarity around how we define harmful or unsafe content that is not permitted on the platform. It’s important that users have insight into the philosophy behind our moderation decisions and the framework for making such judgements.”

TikTok says it will remove content that spreads misinformation, including content that:
- Incites hate, fear, or prejudice
- Could harm someone’s health, such as misinformation about medical treatments
- Involves phishing attempts or “manipulated content meant to cause harm”
- Misleads users about politics or civic processes