Global: Twitter still fails to protect women online – new report


Twitter is still not doing enough to protect women and non-binary users from online violence and abuse, according to a new analysis from Amnesty International.

The Twitter Scorecard assesses the social media company’s record in implementing a series of recommendations to tackle abuse against women and non-binary people on the platform.

Twitter still fails to deliver on promises to protect users at increased risk of online abuse

Michael Kleinman, Director of Technology and Human Rights at Amnesty International USA

Despite some welcome progress stemming from the recommendations made in Amnesty’s 2020 scorecard, Twitter needs to do much more to tackle online abuse against women and marginalized groups. The company has fully implemented only one of the report’s ten recommendations, with limited progress in improving transparency around the content moderation and appeals process.

“Despite our repeated calls to improve their platform, Twitter still falls short of its promises to protect users at increased risk of online abuse,” said Michael Kleinman, director of technology and human rights at Amnesty International USA.

“For a company whose mission is to ’empower everyone to create and share ideas instantly without barriers’, it has become very clear that women and marginalized groups disproportionately face threats to their lives and their online security.”

A survey commissioned by Amnesty International also shows that women who are more active on the platform were more likely to report experiencing online abuse than those who are less active: 40 percent of women who use the platform more than once a day reported being abused, compared to 13 percent of those who use it less than once a week.

Amnesty International also asked the women who chose not to report the abuse why they did not. Notably, 100 percent of women who use the platform several times a week and who did not report abuse responded that it was “not worth it”.

While Twitter has made some progress, it is nowhere near enough. The company has increased the amount of information available through its help center and transparency reports, launched new public awareness campaigns, broadened the scope of its hateful conduct policy, and improved its reporting mechanisms and its privacy and security features. Although these are important steps, the problem remains.

In response to this report, Twitter shared with Amnesty International: “We are committed to publicly experimenting with product solutions that help solve fundamental problems our users face and empower them to take control of their own experience. While many of these changes are not directly reflected in your dashboard, we believe these improvements will ultimately enable our most vulnerable communities to better engage in freedom of expression without fear, a goal that we share with Amnesty.”

Yet Twitter needs to do more to ensure that women and non-binary people – as well as all users, in all languages – can use the platform without fear of abuse. As a business, Twitter has a corporate responsibility and moral obligation to take concrete steps to avoid causing or contributing to human rights violations, including providing an effective remedy for any adverse impact it may have inflicted on its users.

“We have seen time and time again that Twitter has still failed to deliver effective remedies for the real harm and impact that its platform has caused to women and marginalized groups,” added Kleinman.

“As our world has become increasingly dependent on digital spaces during the COVID-19 pandemic, it is essential that Twitter meets this moment with a demonstrated commitment to improving the online experience for all users, regardless of their identity.”


This scorecard summarizes all of the recommendations Amnesty International has made to Twitter since 2018 and distills them into ten key recommendations on which to assess the company. These ten recommendations fall into four high-level categories: transparency, reporting mechanisms, the review process for abuse reports, and privacy and security features. The analysis focuses on these four categories of change because of the positive impact each can have on women’s experiences on Twitter.

Each recommendation comprises one to four distinct sub-indicators. Amnesty International then determined whether Twitter had made progress against each sub-indicator, classifying each as not implemented, in progress, or implemented. For the ongoing public awareness campaigns, Amnesty International checked whether the campaigns addressed all of the issues we have raised, and whether the campaigns and related material are available in languages other than English.

Before releasing the scorecard, Amnesty International wrote to Twitter requesting an update on the progress of implementing our recommendations; the company’s response is reflected throughout the report.

