Our Community Guidelines provide guidance on what is and what isn't allowed on TikTok to help foster a welcoming, safe, and entertaining experience.
We've developed tools and technology to identify and remove harmful content and behavior that goes against our Community Guidelines. These tools help us advance the safety of our community and maintain the integrity of our platform.
How content moderation works on TikTok
Content on TikTok first goes through technology that identifies and flags potential policy violations, such as adult nudity and violent and graphic content. In the areas where our technology is the most accurate, the content will be automatically removed.
In other areas, content will be flagged for additional review by our Safety team.
• If the content violates our Community Guidelines, we'll remove the video and notify you of the reason. We'll also give you the opportunity to appeal the removal directly from the app.
• If we don't identify a violation, the video will be posted to TikTok. Keep in mind, if the video is reported or flagged in the future, it could still be removed for violating guidelines.
What happens if your content is flagged or reported
Our violation system tracks the number of violations on your account and weighs both their severity and frequency. If you receive a violation, we'll notify you of the consequences in the Account Updates section of your Inbox, where you can also see a record of your violations.
Repeated violations result in escalating penalties, and you'll be notified in different parts of the app. Here’s how it works:
• We'll send a warning in the app the first time your content violates our Community Guidelines.
• If the violation falls under a zero-tolerance policy, it will result in an automatic ban. We may also block a device to help prevent future accounts from being created.
After the first violation
After your first violation, we can take one or more of the following actions.
• We may suspend your account's ability to upload a video, comment, or edit your profile for a period of time (typically between 24 and 48 hours), depending on the severity of the violation and any previous violations.
• We may restrict your account to a view-only experience (typically from 72 hours up to one week). This means your account can’t post or engage with content during that time.
• After several violations, we'll notify you that your account is at risk of being permanently banned. If the behavior persists, the account will be permanently banned.
Important things to know about violations:
• Violations of our zero-tolerance policies, such as posting child sexual abuse material, automatically result in your account being banned. We may also block a device to help prevent future accounts from being created.
• Accrued violations will expire from your record over time.
What you can do if your content is flagged or reported
If you think something shouldn't have been removed, it's important that you appeal the decision. Learn how to appeal directly from the app.
If we determine that your content or account should be restored:
• The content or account will be reinstated (unless you've already deleted the account or content).
• The penalty will be erased and will not impact the account going forward.