Friday, 4 December 2020

YouTube starts warning users before they post offensive comments, supporting diverse communities and respectful interactions

YouTube has started to roll out a new feature that automatically sends potentially offensive comments for review so that channel owners do not have to go through them. This filter is meant to shield creators from hurtful and abusive comments.

Social media and streaming platforms are a great way to interact with one another; however, some users treat these mediums in a toxic and hateful manner. Creators receive harmful and abusive comments on a daily basis, which is demotivating and insulting and could discourage them from making videos, as the space no longer feels 'safe' to them. To counter this, YouTube plans to roll out an automatic filter that warns users posting hateful comments before the comments go live.

Most of these comments are demeaning toward a particular community, gender, sexual orientation, nationality, or other identity. YouTube is also studying the patterns in which such comments are posted so that harassment, hatred, and discrimination can be curbed.

YouTube announced the update in its latest blog post, saying it aims to close gaps in its existing policies and make the platform an inclusive space for every community.

The feature will roll out through YouTube Studio and will show users a pop-up if their comment is potentially hurtful. YouTube will also notify users after a comment has been posted if it has been flagged as inappropriate.
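Conceptually, the flow resembles a simple pre-submission check. The sketch below is purely illustrative: classify_toxicity() is a hypothetical stand-in for YouTube's internal model, which is not public, and the threshold and word list are made up for the example.

    # A minimal, hypothetical sketch of a "warn before posting" flow.
    # classify_toxicity() is a placeholder; YouTube's real classifier is not public.

    def classify_toxicity(comment: str) -> float:
        """Placeholder scorer: returns a toxicity score between 0 and 1."""
        hurtful_terms = {"hateful", "stupid", "disgusting"}  # illustrative only
        words = comment.lower().split()
        hits = sum(1 for w in words if w in hurtful_terms)
        return min(1.0, hits / max(len(words), 1) * 5)

    def submit_comment(comment: str, threshold: float = 0.5) -> str:
        """Warn the user first; a confirmed comment may still be held for review."""
        if classify_toxicity(comment) >= threshold:
            choice = input("This comment may be hurtful. Post anyway? (y/n) ")
            if choice.strip().lower() != "y":
                return "Comment discarded; user chose to edit."
            return "Comment posted, but held for channel-owner review."
        return "Comment posted."

    print(submit_comment("What a great video!"))

The key design point this illustrates is that the warning is advisory: the user can still post after the prompt, and a separate review queue catches anything the up-front check lets through.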
