The Frankfurt Higher Regional Court ruled that once Meta becomes aware of defamatory content, it must delete it, along with posts that carry the same meaning.
The case goes back to 2015, when Renate Künast, a Green Party member of parliament, brought proceedings before the German courts against Meta, accusing it of circulating a false quote on Facebook and attributing it to her. The regional court ordered Meta to “refrain from making identical or essentially similar content publicly accessible on the platform.” It further ruled that the false quote constitutes an unlawful interference with the plaintiff’s right to her own words.
In reaching this decision, the court considered the EU’s e-Commerce Directive. It emphasized that concrete knowledge of a violation of the law obliges the defendant to prevent such disruptions in the future. The court also stated that the defendant’s responsibility (in this case, Meta’s) is limited to determining whether differences in presentation, as perceived by an average recipient, clearly indicate a misquote. This manual assessment complements automated procedures for recognizing uploaded content, and AI systems can be employed for further automatic pre-filtering. The court allowed an appeal, as the decision is not yet final and raises the fundamental question of a host provider’s obligation to check for and handle similar content.
Why does it matter?
While Meta could appeal the case, this decision could set a new precedent under which intermediaries hold independent responsibility for preventing the dissemination of ‘similar content.’ The open question, however, is what constitutes similar defamatory content, as such an obligation carries a risk of misuse.