We want to keep the memory of the victims of Nazi persecution alive and promote tolerance, diversity, and democracy. Unfortunately, the content we publish on social media sometimes elicits a negative response from people who are unwilling to take it seriously. These people frequently trivialize Nazi crimes, deny the Holocaust, or want to dictate how we talk about history. In this interview, Oliver Saal from the Amadeu Antonio Foundation explains what lies behind these types of online hate speech and what we can do in response.

What are the latest developments in hate speech on the web?

To answer your question about current developments and explain what has changed during the pandemic, I think it would be helpful to start by telling you a little about the status quo that was in place for a long time. Some topics have always provoked a lot of hatred. For example, anyone who mentions refugees and forced displacement on social media or expresses opposition to extreme right-wing parties can expect to receive a hostile response. This can take the form of comments, hate mail, or even death threats. But some people don’t even have to speak out about a specific topic to provoke a response; they are attacked simply for being who they are. Those who are particularly affected include Black people, people of color, Jews, LGBTIQA+ people, and women.


What impact has the coronavirus pandemic had on the situation?

We have seen a significant overall increase in hate speech on social media during the COVID-19 pandemic. Asian people and people who were perceived as Asian were the first in Germany to be particularly affected. They were stigmatized as supposed carriers of the virus.

Then we saw an increase in antisemitism too. Partly it took the form of open antisemitism: Jewish communities and people who showed solidarity with them increasingly found themselves the target of so-called Zoom bombings. Partly it took the form of structural antisemitism: The pandemic has sparked a boom in conspiracy narratives. As usual, they focus on the supposedly sinister machinations of a secret elite, but the antisemitic narratives have been adapted to the new reality. The storming of the US Capitol and, a few months before that, the attempt to storm the Reichstag in Germany showed that agitation, disinformation, and conspiracy theories on social media platforms can have very real consequences.

Another change we have seen during the pandemic is that Facebook, YouTube, and the like have now been joined by Telegram, a messaging service that provides particularly fertile ground for spreading conspiracy myths and denouncing political enemies. What makes this hybrid medium special is that the operators hardly ever intervene: there is no moderation, hardly any bans, and nothing gets deleted. Supporters of the extreme right and conspiracy ideologues publish the addresses and contact details of their political opponents on channels that can have over 100,000 subscribers. Attila Hildmann’s channel, for example, has attracted attention for open antisemitism and threats of violence on numerous occasions.

»The users of social networks should adhere to the following general principle: If you see something – do something. If you see any content that you think might be against the law, report it and press charges.«

Oliver Saal, social media consultant, Civic.net, Amadeu Antonio Stiftung (Photo: Viktor Schanz)

Why has whataboutism become so prevalent?

This is the practice of deflecting attention from a topic by asking a counter-question that has nothing to do with the original topic. Most people who have ever been involved in discussions on political topics on social networks – even those who just read such discussions without participating – will almost certainly be familiar with it. For example, if people are talking about right-wing extremist violence and its victims, it often takes less than five minutes before someone comes along and posts a comment along these lines: “But what about left-wing violence? Why don’t you condemn that?”

This conversational strategy is not exclusive to the far right, of course, but right-wing extremists are particularly fond of it. Allegations that distract from the matter at hand and accuse the other party of double standards disrupt conversations and can even bring them to a halt, because it is extremely difficult to find an effective response. The thing is, people who use arguments like this, which brook no rebuttal, are not really interested in discussing things at all. They use this type of argument deliberately to distract attention from the topic at hand while posing as skeptics.

How can people deal with this type of argument? I would say you should try not to fall for the attempt to divert the conversation. Expose any such attempt to disrupt and derail a discussion as a deliberate tactic, call it by its name, reject it, and then return to the original topic. Otherwise, it simply won’t be possible to conduct any kind of reasonable discussion.


What are toxic narratives, who uses them, and to what end?

At the Amadeu Antonio Foundation, we use the term “toxic narratives” to refer to arguments, content, and images that right-wing extremists disseminate on social networks. These narratives are rooted in conspiracy ideologies, and we call them toxic because they poison the social climate by demonizing certain groups within society. Society is divided into friends and foes, and aggression is directed at those who are seen as foes.

One such narrative is the assumption, widespread among right-wing extremists, that the country and “the Germans” are threatened with extinction. Supporters of the far right use social media accounts and their own media to spread and reinforce this narrative. They exaggerate events and invent causal relationships where none exist.


What can institutions and other users do to counter this type of hate?

In our view, the most important thing is for everyone who participates in public discourse in any way to recognize digital violence as a real form of violence and not dismiss it with “it’s only the internet.” Everyone should take responsibility for protecting those who are targeted as well as for democracy as a whole, and then do whatever is in their power to improve the situation:

The state should take seriously the threat posed by far-right terrorists who become radicalized online, for example by taking a proactive stance and looking at the relevant channels on Telegram. The state also needs to facilitate criminal prosecution and should provide better funding and a secure footing for counseling services for the victims of online hate.

Social networks should enforce their community standards more strictly and ban hate groups, but also disinformation, about vaccination for example. If you look at right-wing extremist groups on social media, the operators always claim that it is not in their interest to give these groups a platform on their networks. Still, it often takes years of pressure from civil society before anything gets done and the groups finally lose their accounts.


What concrete steps can be taken?

We need more resources for professional social media work, for example for moderation on major media sites, for campaigns for democracy and human rights, and for civil society and academia to monitor right-wing extremist groups. As employees of organizations, we should report comments that clearly violate the law. Organizations like HateAid or the “Hass melden” platform in Germany are there to help and can provide assistance with legal proceedings.

The users of social networks should adhere to the following general principle: If you see something – do something. If you see any content that you think might be against the law, report it and press charges. Always report disinformation to the network immediately. When you see people being attacked or insulted, act exactly as you would if you saw the same thing happen on a bus. Intervene. Get involved. Protect the person who is being attacked. Look for allies. And don’t just rant about how bad the conversation culture is on social networks; do something about it yourself by sharing good content and by participating in positive campaigns or using positive hashtags.

Thank you, Oliver, for giving us this interview.
