Many private equity- and venture capital-backed companies operating in Europe have had a busy first quarter: not only have they had to worry about the impending implementation of sweeping new data protection laws, but they have also had to get to grips with an anti-hate speech law passed by German legislators last year. The aims of the new law are laudable, although some are concerned about its effects on free speech. Either way, the logistical challenges are considerable.
The new law, the Network Enforcement Law (Netzwerkdurchsetzungsgesetz, or NetzDG), imposes a general reporting obligation and fines of up to €50 million for failing to react swiftly to reports of illegal content or hate speech. It is a particular concern for companies that operate domestic or international social media platforms, but it also affects hosting services and a range of other media companies. The law came into force in October last year, but companies were given until the end of 2017 to get their houses in order (and the first reports will be due in June).
The NetzDG applies to “telemedia service providers” – a term that is defined very broadly and would cover any content provider, as well as those hosting content provided by others – operating social networks with more than two million users in Germany. The precise scope of the rules is unclear, and it is concerning that no clear guidance has been provided to help companies determine whether they are covered. It is clear that social networks such as Facebook, Twitter and similar platforms are caught. It is equally clear that the NetzDG does not apply to internet service providers in general, and that online newspapers and retailers are explicitly excluded. In between lies a significant grey area.
For those businesses that are – or may be – covered, wherever in the world they are based, it is important to establish appropriate procedures to ensure that they can comply with the law’s strict requirements. That means putting in place an effective and transparent complaint management system, as well as the ability to remove or block any obviously illegal content, including content amounting to “criminal offences against the democratic constitutional state, sexual self-determination and personal honour”. Covered companies are obliged to remove or block offending content within 24 hours of receiving a notification or complaint. There are concerns that companies will err on the side of caution, blocking legitimate content rather than risking penalties, and that this will curtail freedom of expression.
Indeed, the price of non-compliance may be significant: companies that persistently fail to address complaints by taking too long to delete or block illegal content commit an administrative offence and face fines of between €5 million and €50 million. In addition, companies must report twice a year on the number of complaints received, the details of those complaints and how they were handled.
Compliance with these new rules is proving a headache for some international businesses and, as other jurisdictions pass similar rules, navigating the various national systems for policing illegal content will be an increasing burden. Pan-European standards, or even global rules, would certainly make life easier.
European Funds Comment will take an Easter break and will return in April.