Under pressure for some time to combat the growing problems of bullying and harassment on its platform, Facebook has announced that it will begin allowing users to block comments on their posts by selecting keywords they don’t want to see.
Instagram and Twitter have both offered keyword blocking since 2016, introduced in response to similar problems on their platforms.
Improving The User Experience
This move comes in the wake of Facebook’s recent commitment to improving the experience of users on its site, and to focusing not just on the amount of activity on the site, but also on the quality of the interactions.
Facebook CEO Mark Zuckerberg said earlier this year that he wanted to encourage “meaningful” conversations, and “time well spent” on the site, instead of just “time spent.”
In addition to keyword blocking, Facebook will allow users to report bullying on behalf of a friend or family member. It will also provide an appeal function (if you don’t think your comment should have been blocked) and a review function (if the commenter’s appeal is accepted, but you don’t think it should have been).
The Pros And Cons
The effectiveness of keyword blocking is still largely unproven. Despite its implementation on other platforms, I’m not aware that Twitter, for example, has seen a marked decrease in bullying or harassment since it was put in place. Apart from anything else, it may simply encourage a more refined or subtle kind of harassment.
Its impact on advertising on these platforms is also not yet known. Certainly, while blocking so-called “hate” ads would be beneficial, it’s possible that keyword blocking could block perfectly normal advertising. (Although it should theoretically be possible, I’m not aware of anybody having used the function for this purpose, or of any advertiser having felt an impact from it.)
On the plus side, people are apparently using it to effectively prevent themselves from seeing spoilers for movies and TV shows, so that’s probably a good thing.
At the end of the day, I’m largely in favour of tools that give us the ability to manage what we consume online. However, we can’t deny the potential pitfalls of the filter bubble either: more and more, people are served only the content they want to see and believe in, creating echo chambers that reinforce our differences rather than celebrate our similarities.