Overview
Our Community Guidelines are designed to ensure our community stays protected. They set out what’s allowed and not allowed on YouTube, and apply to all types of content on our platform, including videos, comments, links, and thumbnails.
You'll find a full list of our Community Guidelines below:
Spam & deceptive practices
Sensitive content
Violent or dangerous content
Regulated goods
How do we develop new policies and update existing ones?
Our policies are carefully thought through so that they are consistent, well informed, and can be applied to content from around the world. They’re developed in partnership with a wide range of external industry and policy experts, as well as YouTube Creators. New policies go through multiple rounds of testing before they go live to ensure our global team of content reviewers can apply them accurately and consistently.
This work is never finished, and we are always evaluating our policies to understand how we can better strike a balance between keeping the YouTube community protected and giving everyone a voice.
How does YouTube identify content that violates Community Guidelines?
With hundreds of hours of new content uploaded to YouTube every minute, we use a combination of people and machine learning to detect problematic content at scale. Machine learning is well suited to detecting patterns, which helps us find content similar to content we’ve already removed, even before it’s viewed.
We also recognize that the best way to quickly remove content is to anticipate problems before they emerge. Our Intelligence Desk monitors the news, social media, and user reports to detect new trends surrounding inappropriate content, and works to make sure our teams are prepared to address them before they can become a larger issue.
Is there a way for the broader community to flag harmful content?
Though we are determined to continue reducing exposure to videos that violate our policies and have tasked over 10,000 people with detecting, reviewing, and removing content that violates our guidelines, the YouTube community also plays an important role in flagging content they think is inappropriate.
- If you see content that you think violates Community Guidelines, you can use our flagging feature to submit content for review.
- We developed the YouTube Trusted Flagger program to provide additional tools to non-governmental organizations (NGOs) with expertise in a policy area, government agencies, and individuals with high flagging accuracy rates. Participants receive training on YouTube policies and have a direct path of communication with our Trust & Safety specialists. Videos flagged by Trusted Flaggers are not automatically removed. They are subject to the same human review as videos flagged by any other user, but review by our teams may be expedited.
What action does YouTube take for content that violates Community Guidelines?
YouTube takes action on flagged videos after review by our trained human reviewers. They assess whether the content does indeed violate our policies, and protect content that has an educational, documentary, scientific, or artistic purpose. Our reviewer teams remove content that violates our policies and age-restrict content that may not be appropriate for all audiences. Our automated flagging systems also help us identify and remove spam automatically, as well as re-uploads of content we’ve already reviewed and determined violates our policies.
Community Guidelines Strikes
If our reviewers determine that content violates our Community Guidelines, we remove the content and send a notice to the Creator. The first time a Creator violates our Community Guidelines, the Creator receives a warning with no penalty to the channel. After one warning, we’ll issue a Community Guidelines strike to the channel and the account will have temporary restrictions. Channels that receive three strikes within a 90-day period will be terminated. Channels that are dedicated to violating our policies or that have a single case of severe abuse of the platform will bypass our strikes system and be terminated. All strikes and terminations can be appealed if the Creator believes there was an error, and our teams will re-review the decision.
Age-Restricting Content
Sometimes content doesn't violate our Community Guidelines but may not be appropriate for viewers under 18 years of age. In these cases, our review team will place an age restriction on the video so it will not be visible to viewers under 18 years of age, to logged-out users, or to those who have Restricted Mode enabled. Creators can also choose to age-restrict their own content at upload if they think it’s not suitable for younger audiences.