YouTube announced this week that it will implement a series of policy changes to improve how the platform tackles harassment.
“We systematically review all our policies to make sure the line between what we remove and what we allow is drawn in the right place and recognized earlier this year that for harassment, there is more we can do to protect our creators and community,” YouTube said in an official blog post.
“We remain committed to our openness as a platform and to ensure that spirited debate and vigorous exchange of ideas continue to thrive here. However, we will not tolerate harassment.”
YouTube says it will now remove content containing more subtle forms of threats, rather than only content that poses an obvious danger to viewers.
“We’ve always removed videos that explicitly threaten someone, reveal confidential personal information, or encourage people to harass someone else. Moving forward, our policies will go a step further and not only prohibit explicit threats but also veiled or implied threats. This includes content simulating violence toward an individual or language suggesting physical violence may occur.”
And beyond threats, YouTube says it will also crack down on hurtful language that goes too far: "We will no longer allow content that maliciously insults someone based on protected attributes such as their race, gender expression, or sexual orientation. This applies to everyone, from private individuals to YouTube creators to public officials."
And while removing demeaning or threatening videos is a step toward a more constructive and inclusive platform, YouTube has acknowledged that videos are not the only place where users face harassment.
“We know that the comment section is an important place for fans to engage with creators and each other. At the same time, we heard feedback that comments are often where creators and viewers encounter harassment. This behaviour not only impacts the person targeted by the harassment but can also have a chilling effect on the entire conversation.”
Not only will the above policy updates apply to the comment section, but content creators will also have more control over which comments are published and which are trashed.
“When we’re not sure a comment violates our policies, but it seems potentially inappropriate, we give creators the option to review it before it’s posted on their channel.”
The results of this tool have been promising: YouTube reports a 75% reduction in user-flagged comments.
YouTube says, "As we make these changes, it's vitally important that YouTube remains a place where people can express a broad range of ideas, and we'll continue to protect discussion on matters of public interest and artistic expression."