
Julia Ebner: The Extremism Researcher on Bot Armies and Online Hate

Digital ✍️ Anna Berger 🕒 2026-03-20 05:57
Julia Ebner warns of digital manipulation and online hate

An invisible war is raging out there in the digital sphere. While we scroll through our feeds, "like" a photo from a long-forgotten acquaintance, or get annoyed by angry comments, they are already hard at work: armies of bots, controlled by extremists, trolls, and political strategists. No one else in Europe has scrutinised this phenomenon in recent years as meticulously as Julia Ebner. The Austrian extremism researcher, who works at the Institute for Strategic Dialogue in London, has been warning for years about the systematic infiltration of our social networks. And her latest analyses are more alarming than ever.

The Method: How Bots Take Over Our Minds

It would be too simplistic to assume that every hate campaign is simply the work of a few angry individuals. What Julia Ebner and her team uncover through undercover research is highly professional, organised manipulation. It's no longer about individual trolls, but about bot armies controlling thousands of accounts simultaneously. They don't just post radical slogans; they interact, they amplify each other, giving extreme minorities an artificial reach they would never have in the real world. The tactic is always similar: in the comment sections under posts about refugees, vaccinations, or elections, identical narratives suddenly appear en masse. For Julia Ebner, this is a clear pattern: "What looks like a spontaneous outburst of public sentiment is often the result of carefully planned digital attacks," she summarises the findings of her undercover investigations. What's particularly insidious is that the bots are learning. They mimic human behaviour, first posting harmless cat pictures to build trust, and then they strike.

The Lethal Impact of Likes and Shares

Many still underestimate the explosive power of this digital manipulation. Yet in her books "Going Dark" and "The Rage," Julia Ebner has compellingly documented how online agitation translates into real-world violence. She shows how terrorist organisations and far-right groups use the same algorithms to recruit vulnerable young people. The platforms themselves become accomplices, because their algorithms reward outrage and radicalism: they push the worst content to the top of timelines simply because it generates the most interaction. A particularly worrying example is the rise of deepfakes. In a world where video and audio evidence can soon no longer be trusted, Julia Ebner sees a new dimension of disinformation heading our way. "We are facing a serious test for democracy," she warns. Because when facts no longer matter, only the loudest and most unscrupulous voices win out in the end.

What Can We Do? The Expert Has Clear Demands

But Julia Ebner would not be one of the most distinguished researchers in this field if she offered only grim predictions. She is calling on tech companies to finally implement radical transparency. It is not enough to delete a few obvious hate posts; the algorithms themselves must be changed so that they no longer reward the spread of extremism. In addition, we need:

  • More digital literacy among the public: We need to learn to recognise manipulative content and question it critically.
  • Independent research: So far, platforms like Facebook or X (formerly Twitter) grant far too little access to their data.
  • International cooperation: Digital manipulation doesn't stop at borders. Only if countries such as Ireland and the UK act together with the EU can we stop these virtual mercenaries.

Julia Ebner's work is an indispensable compass in these chaotic times. She dives into the darkest corners of the internet to show the rest of us what is brewing down there. We should take her warnings seriously, because the battle for control over our minds has long since begun. And we are all in the middle of it, whether we like it or not.