The prompts that Twitter is rolling out may come as a surprise to some users. The update will ask users to reconsider a tweet before posting it if the tweet appears racist or hateful. On Wednesday, Twitter rolled out these changes to English-speaking users on Android and iOS, encouraging them to “pause and reconsider” a tweet before they post it.
How will hateful terms and phrases be filtered out?
Twitter started testing the idea last year, running limited trials of the prompts for iOS users. The approach relies on an algorithm that detects hateful phrases or terms, including “insults, strong language, hateful comments” or language that is plainly “offensive”.
If a tweet contains any of these, Twitter offers the user a chance to reconsider it before it goes live on the platform. With the prompt, the user can edit or delete the offensive or hateful tweet they are about to post.
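The detect-then-prompt flow described above can be sketched in a few lines. This is only an illustration, not Twitter's actual classifier (which is not public): the term list, function name, and matching logic here are all hypothetical stand-ins for a far more sophisticated system.

```python
# Hypothetical term list; Twitter's real model and vocabulary are not public.
OFFENSIVE_TERMS = {"idiot", "trash", "loser"}

def needs_reconsider_prompt(tweet: str) -> bool:
    """Return True if the tweet contains a flagged term (naive word match)."""
    words = {w.strip(".,!?").lower() for w in tweet.split()}
    return not words.isdisjoint(OFFENSIVE_TERMS)

print(needs_reconsider_prompt("You are such an idiot!"))  # True -> show prompt
print(needs_reconsider_prompt("Lovely weather today"))    # False -> post as-is
```

A naive word match like this is exactly the kind of approach that struggles with context, which is the problem the next section describes.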
These early tests raised alarms by flagging everyday friendly conversations, since friendly banter often includes offensive or sarcastic language. Twitter writes that these systems “often didn’t differentiate between potentially offensive language, sarcasm, and friendly banter”.
About the testing algorithm:
The test rolled out in 2021 takes those errors into account, along with feedback from users, which matters because the feature is still being run as an experiment. One of the applied fixes, mentioned in the company’s blog, looks at how often the tweet’s author and the account they are replying to talk back and forth.
The reasoning is that if two people follow each other and frequently chat publicly on the platform, there is a good chance they understand each other, so it is unlikely they are trading such language with a random stranger.
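The interaction-frequency fix above could be sketched as a simple heuristic: suppress the prompt when two accounts follow each other and reply to one another often. This is purely illustrative; the function name, inputs, and threshold are assumptions, not Twitter's actual implementation.

```python
def should_suppress_prompt(mutual_follow: bool, reply_count: int,
                           min_replies: int = 10) -> bool:
    """Skip the reconsider prompt for accounts with an established rapport.

    mutual_follow: whether the two accounts follow each other.
    reply_count: how often they have replied back and forth (hypothetical metric).
    min_replies: arbitrary threshold chosen for illustration.
    """
    return mutual_follow and reply_count >= min_replies

print(should_suppress_prompt(True, 25))   # True: frequent mutual banter
print(should_suppress_prompt(False, 25))  # False: no mutual follow
print(should_suppress_prompt(True, 2))    # False: barely interact
```

The design idea is that an established relationship is a signal the language is banter rather than abuse, which is why sarcasm between friends would no longer trigger the prompt.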