Twitter today disbanded the Trust & Safety Council, an advisory group of roughly 100 independent researchers and human rights activists. Formed in 2016, the group gave the social network input on content and human rights issues such as the removal of Child Sexual Abuse Material (CSAM), suicide prevention, and online safety. Because the council drew on experts from around the world, its dissolution could have implications for Twitter’s global content moderation.
This development comes after three key members of the Trust & Safety Council resigned last week. In a letter, the members said Elon Musk had ignored the group despite claiming to focus on user safety on the platform.
Twitter has dissolved the Trust & Safety Council pic.twitter.com/R2wS9BsqA2
— Anthony DeRosa (@Anthony) December 13, 2022
Twitter is going to move towards a more ‘automated’, algorithm-based content moderation system. However, these systems are NOT a replacement for human content review, as demonstrated by the issues and shortfalls of such systems at YouTube, Facebook, Tumblr, and others. Yet it is hard to have human content review when the staff have been fired or have quit. I can foresee two major impacts on Twitter from this news. First, already-skittish advertisers are going to be unwilling to advertise on Twitter; all the grand incentives Twitter is offering to lure advertisers back mean nothing if there is a chance their businesses could unintentionally become associated with, or be believed to be supporting, hateful content. Second, this could lead to a ban of Twitter in the EU, something regulators there had already threatened earlier this month.
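To see why purely automated moderation falls short, here is a minimal, purely hypothetical sketch in Python. This is not Twitter’s actual system; the blocklist and example tweets are made up for illustration. A naive keyword filter flags benign text while missing abuse phrased without the blocked words, which is exactly the gap human reviewers exist to cover.

```python
# Hypothetical sketch of naive keyword-based moderation -- NOT Twitter's system.
# It illustrates the context blindness that human review is needed to catch.

BLOCKED_TERMS = {"kill", "attack"}  # made-up blocklist for illustration only

def flag_tweet(text: str) -> bool:
    """Flag a tweet if it contains any blocked term, ignoring case and punctuation."""
    words = {word.strip(".,!?").lower() for word in text.split()}
    return not BLOCKED_TERMS.isdisjoint(words)

# False positive: harmless gaming chatter is flagged.
print(flag_tweet("Took me ten tries to kill that boss"))    # True

# False negative: a veiled threat with no blocked words slips through.
print(flag_tweet("People like you should watch your back"))  # False
```

Real systems use machine-learning classifiers rather than keyword lists, but the same trade-off applies: false positives alienate users and advertisers, while false negatives invite exactly the regulatory scrutiny the EU has been threatening.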
via TechCrunch