Abstract

Illegal and (lawful but) harmful content, most notably hate speech and fake news, but also violent videos, copyright infringement, and child pornography, is a crucial problem on digital platforms such as Facebook, YouTube, TikTok and Twitter. The EU's 2022 Digital Services Act aims to tackle this problem by introducing an updated horizontal framework for all categories of content and activities on intermediary services. This raises several questions. How far do national and European free speech guarantees extend? If hate speech can be banned to protect victims' rights, how can a prohibition of fake news be justified? How much leeway do platforms retain for private content moderation? Who is responsible for fighting and taking down illegal content? How can the victims of de-platforming, content takedowns or shadow banning assert their right to freedom of opinion? Finally, how will these legal responsibilities be enforced? These questions are addressed in the articles of this edited volume, which proceeds from the 2022 Annual Conference of the Institute for Digital Law Trier (IRDT).