In recent years, the growth of sinister behaviour on the internet has caused many people to re-evaluate how they interact with news and media outlets. The political lines of left- and right-leaning publications have blurred as consumers have become more sophisticated.
Though this may have been the case all along, it has become more pronounced as easy-to-access information has proliferated.
According to recent Data & Society research, 47% of people online have experienced some form of abuse, leading 27% of internet users to censor what they say online for fear of becoming a target themselves. Furthermore, 70% said they had witnessed online abuse.
Trolls are people who continually attack individuals or organisations with nasty comments, with a level of impunity that leaves their targets confused, hurt and helpless. Enter Jigsaw.
Perspective by Jigsaw
Using machine learning, Perspective allows publishers to use its API to set levels of control over comments, balancing free speech against abuse.
Perspective identifies whether a comment could be perceived as “toxic” to a discussion. Toxic is defined as disrespectful or harmful.
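To make this concrete, here is a minimal sketch of how a publisher might work with the API's request and response shapes. The field names follow the publicly documented AnalyzeComment format, but treat the exact schema, threshold value, and helper names below as illustrative assumptions rather than a definitive integration.

```python
import json

# Sketch of a Perspective-style request body: the caller submits a comment
# and asks for a TOXICITY score. (Schema based on Jigsaw's documented
# AnalyzeComment format; verify against current docs before relying on it.)
def build_request(comment_text):
    return {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }

# Pull the summary toxicity probability (0.0 to 1.0) out of a response.
def toxicity_score(response):
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# A publisher could hide, flag, or queue for review any comment scoring
# above a threshold of their choosing (0.8 here is an arbitrary example).
def is_toxic(response, threshold=0.8):
    return toxicity_score(response) >= threshold

# Hypothetical response payload shaped like the API's documented output.
sample_response = {
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.92, "type": "PROBABILITY"}}
    }
}

print(json.dumps(build_request("example comment")))
print(is_toxic(sample_response))
```

The point of the design is that Perspective returns a probability, not a verdict: each publisher decides where on the 0-to-1 scale their own free-speech/abuse trade-off sits.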
A number of organisations have already signed up:
- The Wikimedia Foundation is researching ways to detect personal attacks against volunteer editors on Wikipedia;
- The New York Times is building an open-source moderation tool to expand community discussion;
- The Economist is reworking its comments platform; and
- The Guardian is researching how best to moderate comment forums and host online discussions between readers and journalists.
Watch this space as more tools emerge from the Perspective API. And trolls: know you are being watched.