Twitch releases transparency report outlining policy enforcement

Aiden Strawhun, Wednesday, March 3rd, 2021 4:32 pm

Twitch has released a “transparency report” detailing its data on how it enforces policies platform-wide, not just with regard to hateful conduct and harassment.

An updated version of Twitch’s original harassment policies has been in effect since Jan. 22, 2021. Now, just over a month later, the platform has released new data, but not on how those new policies have been enforced. Instead, the report covers January through December 2020.

We’ll be looking at this report as it relates to community and harassment, though it covers a breadth of areas, including safety, definitions, spam reports, and more.

“Creating a Transparency Report is an important measure of our accountability — it requires being honest about the obstacles we face and how we are working to resolve them to improve safety on Twitch,” the company said in a press release. “Moving forward, we’ll be releasing two transparency reports a year so we can track our progress as a community. We also have a responsibility to help you understand the work we’re doing to make Twitch a safe, inclusive place for our diverse global community.”

The highlights from the release applaud Twitch’s growth in 2020, noting the platform saw a 40% increase in new channels from the first half of the year to the second, though it does not give the number of new channels. Twitch did end the year with 33.4 million unique streaming channels in total.

According to the report, 24.4 billion chat messages were sent on Twitch in the first half of the year and 32.6 billion in the second, for a total of 57 billion messages for the year.

Of those messages, 160 million were removed via blocked terms and AutoMod, and 47.4 million were removed manually by channel moderators.

Twitch attributes those numbers to the platform’s growth and to the launch of the ModView dashboard in March 2020.

Twitch’s community and harassment policies, both the original and the updated 2021 version, make it clear that the company expects streamers themselves to cull bad behavior within their communities. We’ve broken down what the new policies look like; however, Twitch’s policy page now links only to the updated policy, not to the policies that were in place during the period the report covers. We’ve pulled the original policies via the Wayback Machine.

According to the transparency report, in 2020, streamers and their mods enforced 6.2 million permanent channel bans and 7.7 million temporary ones.

Regarding hateful conduct and harassment, user reports of violations increased by 19% from the first half of the year to the second. The report does not detail the specific number of reports it received from users in this category.

The report does detail the number of enforcement actions regarding hateful conduct and harassment: the platform acted on 19,532 violations in the first half of 2020 and 61,235 in the second.

By the end of the year, Twitch acted on 80,785 violations total. That’s roughly a 214% increase in actions taken from the first half of the year to the second.
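As a quick sanity check of that figure, here’s a minimal back-of-the-envelope calculation in Python, using the half-year enforcement counts above (the variable names are ours, for illustration):

    # Check the half-over-half increase in enforcement actions cited above.
    h1_actions = 19_532  # hateful conduct/harassment enforcements, first half of 2020
    h2_actions = 61_235  # same category, second half of 2020

    increase = (h2_actions - h1_actions) / h1_actions
    print(f"H1-to-H2 increase: {increase:.1%}")  # 213.5%, i.e. roughly 214%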

Twitch attributes this increase to improvements made in its reporting tools for users. From May to August, the company also “increased its capacity” to review reports, improving its ability to respond more quickly.

Using the data above, streamers and their moderators issued 13.9 million permanent and temporary channel bans in total in 2020. Twitch’s total of 80,785 violation enforcements (which include permanent and temporary bans, among other platform-wide actions) amounts to less than one percent of what users are doing themselves.
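The same quick arithmetic shows where that share actually lands (again, a rough sketch with illustrative variable names):

    # Compare channel-level moderation with Twitch's own enforcement actions.
    channel_bans = 6_200_000 + 7_700_000  # permanent + temporary channel bans, 2020
    platform_enforcements = 80_785        # Twitch's enforcement actions, 2020

    share = platform_enforcements / channel_bans
    print(f"Platform share of enforcement: {share:.2%}")  # 0.58%, under one percent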

What’s important to keep in mind is how Twitch actually handles enforcement when reports go beyond channel-level intervention. The report describes the company’s enforcement strategy this way: “Twitch is a live-streaming service, and the vast majority of the content on Twitch is ephemeral. For this reason, we do not focus on ‘content removal’ as the primary means of enforcing streamer adherence to our Community Guidelines.”

The report explains that most of the platform’s violation reports come from users or machine detection. A report then goes to Twitch’s team of in-house moderators, who may issue an “enforcement,” such as a temporary ban or a warning. If a record of the reported content remains and it violates the platform’s rules, Twitch’s moderation team will remove that content.

The “but” is this: “Most enforcements do not require content removal, because apart from the report, there is no longer a record of the violation – the live, violative content is already gone.”

Excluding channel-level moderation, Twitch’s own moderators’ total enforcement actions increased from 788,000 in the first half of 2020 to 1.1 million in the second, for 1.8 million actions total. Twitch’s data also shows a correlation between the number of reports it receives and the number of hours users spend watching content on the platform.

Relatedly, the platform has stepped up its action on child sexual exploitation. In 2020, it sent 2,158 tips to the National Center for Missing & Exploited Children, a 66% increase from 812 reports in the first half of the year to 1,346 in the second. Twitch attributes this increase to its improved investigation processes and internal teams that identify patterns of behavior.

As for escalation beyond the platform, Twitch reported 38 cases of “credible threats of violence” to the appropriate law enforcement agencies. Twenty-two of these cases came in the first half of the year, and the remaining 16 in the second. Twitch attributes the second-half decline to COVID-19, saying there were “fewer places and events for people to direct violent threats toward.”
