New Ofcom rules could mean fines for TikTok and Twitch
New measures laid out by Ofcom could mean fines for video-sharing platforms (VSPs) like TikTok and Twitch.
The broadcasting watchdog said that one-third of users have seen hateful content on such sites. The new rules state that VSPs must take “appropriate measures” to protect users from content related to terrorism, child sexual abuse and racism. This would mean the platforms must:
- provide and effectively enforce clear rules for uploading content.
- make the reporting and complaints process easier.
- restrict access to adult sites with robust age-verification.
Ofcom stated that the progress made by the eighteen VSPs in question would be published in a report next year.
Incidents of antisemitism have been reported on both TikTok and Twitch.
In July, we reported that, according to a new study, antisemitic content on the social media platform TikTok had increased by 912%. According to research by Dr Gabriel Weimann of the University of Haifa and Natalie Masri of IDC Herzliya’s Institute for Counter-Terrorism, antisemitic comments on TikTok grew 912%, from 41 in 2020 to 415 in 2021, and the platform saw 61 antisemitic postings so far this year, compared to 43 last year. Antisemitic tropes and imagery used in video content included Nazi salutes, diminishing the impact of the Holocaust, and propagating caricatures of Jews with long, hooked noses.
In August, Twitch, the world’s biggest streaming site for watching video games, announced that it would introduce new measures to prevent “hate raids”, which have included antisemitic abuse, images of swastikas, and other racist or homophobic abuse. The move follows complaints from users in minority groups who were subjected to high levels of such abuse in recent months.
Campaign Against Antisemitism has long called for tougher regulation of social media sites and for social networks to proactively search for and remove hate speech from their platforms.