Nabeelah Shabbir on how to deal with gendered online violence

26/01/2024     1 min read     By Ahlem Khattab

Nabeelah Shabbir (by Dominik Osvald)

The co-author of “The Chilling: A Global Study On Online Violence Against Women Journalists” chats with us about the research behind the study and what it all means for journalists and journalism.

It’s heavy but necessary work. Since 2019, the International Center for Journalists (ICFJ) has been studying how online violence against women journalists occurs and operates, with the aim of ultimately coming up with solutions and systems that help prevent escalations that might lead to offline harm.

In 2022, the ICFJ published The Chilling, a report presenting their findings so far, case studies, and more than a hundred practical recommendations for action. Nabeelah Shabbir is one of its two co-authors. She is our guest in a new episode of our podcast Peer-to-Peer.

“We are grateful as researchers that those who are experiencing [online violence] and have managed are reporting it, and talking about it publicly. Because it’s not their problem alone. It’s a shared problem. It’s one which is very global in scale, and which we’re trying to show patterns for. (…) We’ve got to make a global thread because people are stronger when they know others have gone through it.” — Nabeelah Shabbir in Peer-to-Peer, season 2, episode 2

Before becoming a researcher, Nabeelah worked as a journalist for a decade in Rawalpindi, Cologne, Brussels, Paris, London, Oxford and Amsterdam for different news outlets. She has notably reported on climate change for The Guardian, and she and the “Keep it in the Ground” team won a British Journalism Award in 2015.

In this conversation, she shares her thoughts on the evolution of online hate and what we can learn from what journalists like Nobel Peace Prize winner Maria Ressa have been through. She also highlights the importance of collaboration and solidarity in the face of online violence.

Content warning: Mentions of physical attacks and violent threats.


Ahlem Khattab (In the Balance) – A big part of our public conversations today takes place on social media platforms, and we’ve gotten used to online hate just being a part of the online experience. But in recent years, it has become a real problem for many journalists, especially women. How did we get here?

Nabeelah Shabbir – I was following a discussion about deep fakes just this week from the organization Equality Now. And there’s a couple of documentary makers who’ve actually made a film about victims of deep fake online violence, right? In the course of that conversation, they were like, the first person who ever reported being a victim of deep fake violence was a woman in Perth in Australia 10 years ago. And at that time, there was not a name for what that was.

And I think part of this work that we’re doing in online violence – which, by the way, there’s so many incredible organizations around the world, and I’m just going to give an early shout out to the Coalition Against Online Violence, which has grouped together, I think, something like over 70 organizations who are doing some kind of work of advocacy or research or so on, mainly in the US, but also in other countries around the world. And it’s with this group that we’re always thinking about what the terminology is for what’s happening, what this phenomenon is, how do we understand it, actually, because maybe it was always there, and we never really understood it.

The take that we’ve had at ICFJ was to look at how it impacted women journalists. But you can arguably look at politicians, too; there’s been so much work done on the abuse you’ll get as a woman politician for being in the public space.

And I’ll just say from a personal perspective, just speaking to a British journalist, Carole Cadwalladr. We had an event in London at the Frontline Club, which is like an investigative journalism place where you can hang out, a members’ club. And we’d put her in a room with Rana Ayyub, who’s an Indian Muslim journalist who also experienced a lot of online abuse. We had Caoilfhionn Gallagher there, who’s the lawyer for the family of Daphne Caruana Galizia, a Maltese journalist who was assassinated in 2017. And Carole Cadwalladr said she appreciated the work that ICFJ had done, because it was like a framework of understanding what’s happened to you. And it helps to hear people speaking about their experiences, because people will have approached it in a different way.

So I’ve gone a really long way around answering your original question, which is, what is this? How has it happened? You know, where’s it come from? But I think the thing is that it’s just constantly evolving. We’re still understanding it. And there’s different cultural reactions to what that question is.

Because for The Chilling, we had a group of academic researchers around the world go in depth in 15 countries. One of those countries was the United States. The journalists there sometimes might say, “Well, it’s in the Constitution, it’s freedom of expression.” You know, it’s like, what does it mean, actually? Like, what is the line? How do you understand this? And how do you accept this happening to you?

Of course, it’s not okay. And men get abused, too, especially if you’re a journalist. But women get it more. And I think that’s also a massive part of the conversation.

How does it get from hate comments or somebody expressing their opinion in a corner to something more violent with actual real-life consequences?

It’s about impunity. It’s about the fact that nothing is going to happen to you if you target that journalist, go after her, coordinate a campaign. If you’re a state-paid troll, as we’ve seen in certain countries, if you’re a member of partisan media, even if you’re a member of, I don’t know, a blog, if you’re part of some deep, dark web forum where you feel like someone is fair game, you can get away with it. And I would say that’s the problem, right?

And increasingly what we found in our data is that some of those trolls or people who go after you and attack you for even breathing online, as it were, are people who actually use their real names. And this is something, again, I’m going to bring up Rana Ayyub, something that she found astonishing in her case: that people are sending her rape threats, but they are using their real names and even their profiles. And sometimes that profile might have a flag of India in it or, you know, just… Very blatant criminal, I would say, actions to threaten someone with death or rape, which is one of the most common ways that you can be harassed online. We’ve identified these like 15 escalating points, indicators of what those are. Death and rape threats are the first.

And it’s impunity. It’s knowing that you can get away with it. And it’s where we have a frustration with the tech platforms who excuse themselves from all of that. You know, the content moderation is very light. It doesn’t even exist in different languages. A common thing that you hear from women journalists is that they reported – or anyone indeed who’s being harassed online, they’ve reported it and been told, “This doesn’t violate Facebook’s community standards.” That’s just the stuff that you can report, let alone if you’re being sworn at in another language.

With the computer scientists we work with at the University of Sheffield, we’re able to work in Hindi, in Spanish. We’ve used Mexican Spanish. We’ve worked in Russian, English and Maltese now. And a lot of that stuff doesn’t even go through the filter because it’s in another language, right? Let alone memes, let alone deep fakes; let alone image-based abuse. It’s like… the list is long.

And even bilingual text, I imagine. Like the mix of languages, it must be difficult to detect for algorithms.

Yeah. In Maria Ressa’s case, it was in, what did we call it? “Taglish.” Like it’s a mix of Tagalog and English. And it’s harder to detect those.

Of course, her media company is called Rappler. They’ve been around for over a decade, and they do incredible work at sort of flipping the journalism back on that harassment. They built something called Sharktank. They had somehow got access to some of that data where they were being abused online, and they were able to analyze and see that all of that was just coming from a few small accounts. You know, it was coordinated, it was networked. And that’s a very insidious part of this online harassment, and it’s something that’s another escalation point that we put when we were looking at what are the indicators which show that you’re going to be more at risk. You’ve got a death threat, and it’s also coordinated by just a few accounts. Can we track who those accounts are? But then you’re asking so much of the journalists. Like we’re not asking them to self-report everything.

Those guidelines that we produced with the escalators are for newsrooms, but they’re also for intergovernmental organizations to know about, they’re also for states to know about, it’s also for civil society organizations who already do too much of the heavy lifting in these cases. But we’re trying to look at the data behind it. And so, ICFJ has published big data case studies. They’ve looked at Maria Ressa, Carole Cadwalladr, Ghada Oueiss in Lebanon, Carmen Aristegui in Mexico, Rana Ayyub in India. Emblematic women journalists. Who’s going after them? And our computer scientist colleagues at the University of Sheffield are investigating that data where you can get it.

And there are patterns. Like actually, you could boil it down to just a few points of how women are being harassed as journalists around the world. At least according to our research. But another key problem it brings up is that researchers don’t have access to that data. Up until Elon Musk took over Twitter, we were able to access the API, and that’s just become incredibly difficult. With Facebook, you could never do it. You’d actually get sued if researchers tried to go in there. And having access to the data sets is one thing that we asked for to be able to do our work, which the tech platforms, we think, should be doing ultimately because these are the vectors of where the online violence is happening.

You just mentioned the changes that have been happening on Twitter (now X), and with all these rules on what can be accessed online by journalists, by researchers, how do you manage to do this work to scrape data sets?

So yeah, no scraping allowed (laughs). We don’t do that, I’m just gonna say that out there for legal reasons as well.

With Daphne Caruana Galizia, which is a very poignant case that we’re doing right now, there was a public inquiry into her death because it was connected to a state actor. It was coordinated abuse. She herself, as a journalist in Malta who worked on the Panama Papers, wasn’t on social media platforms, she blogged. But they did collect together a bit of Facebook data in presenting that public inquiry into her death to prove that, you know, she was hounded to her death, basically.

And that has been very helpful to be able to see a collection of Facebook groups, which were private at the time, but where you can see people who honestly… rejoiced in her death. They were the people who were trolling her when she was alive, stalking her online, all of those insidious things. You know, doxing, which is releasing your private information to the public domain: your address, your commuting patterns… She’s walking around a farm on a weekend, people are taking pictures. And in that case, we’ve got that data because it’s been collected by her colleagues at another media organization called The Shift.

Our work is difficult because it might be image-based abuse. It might be fake profiles. It might be bots. You know, it’s frightening. You know, it’s just like there are so many facets of it. But certainly, up until a certain point, we were able to access the Twitter API, so we do have that data. And that is how our computer scientist colleagues were doing the work, and ultimately generating word clouds where you could see the most common insults, which is revealing in itself. So in the case of Maria Ressa, liar was one of the biggest ones. That’s one of the worst things you can tell a journalist. Very gendered, very misogynistic, and so much more if you come from an intersectional background, right?

And I think for me, it was speaking to an ethnic minority background journalist at The New York Times in the U.S. who was doing the exact same reporting, she said, on COVID-19 as her male colleagues. But she always got it far worse than them. Gendered, but also against her ethnic minority background. And they even tweeted and supported her and spoke up. They said the reporting we do is practically the same. She’s scrutinized far more. She’s not trusted. She’s not believed. And she’s stuck at it, right? But she’s had emails telling her she’s going to be beheaded.

(Silence) It’s really chilling.

It’s chilling. They’re chilled out of the profession, and it’s chilling. It’s just, yeah, it’s really sad because we need those voices. We need those people to keep doing their work, and we’re asking them to shoulder that load.

And I think that initial research ICFJ did, which started way back in 2019-2020 and resulted in that sort of landmark report, really was a global picture of what online violence looked like.

We then were lucky to get some funding from the U.K. FCDO, the Foreign, Commonwealth and Development Office, to carry on with our big data case studies, and see if we could eventually come up with some kind of experimental – because it is experimental at this stage. It’s like an online violence alert and response system, which is something that we’d like newsrooms to put into place, where you can protect your journalists from this happening to them before it happens to them. That’s what we’re working on right now with our computer scientist colleagues, who are all women, by the way, which is amazing.

So obviously there are different levels here, but what can journalists who are listening to this actually put into place and do at their own level to better protect themselves, even if it’s short term? 

And of course, aside from taking a break from social media. It’s really hard for me to speak about personally, because I have been a journalist all my professional life, and I personally don’t tweet my opinions or even my work sometimes because I don’t want to get blowback. It’s happened to my colleague, the lead researcher of all of this work (Julie Posetti). She’s been targeted in conjunction with supporting a journalist, right? And so what you’re asking is what they can do in the short term. It’s a difficult thing. You know, I wonder if–

It’s a two-sided situation because so many journalists have told us they need to stay online because that’s where their community talks to them. It’s where they get their sources. It’s where they’ll get information. It’s where they can keep a line on what’s happening. And I think it’s almost why a lot of people accept that this is part of the job. I think what we’ve been saying in our research is that it’s not acceptable. And we’re looking for collaborative ways with people outside of the industry, like these machine learning experts, to try to collaborate with us, to find ways for us to build systems where maybe your data can be plugged into something if it’s possible, if the tech platforms allow it, and we can stop you from ever seeing this.

And we have identified lightning rod topics. If you’re interested in learning more about this, if you write about gender, if you write about politics, if you write about these certain things, you’re certainly likely to get targeted.

So many people do take social media breaks. If you work in a newsroom which has a digital security desk, they might hand over your account to someone else to monitor for a while. Your information will be blocked online, of course, but it’s not easy. And I don’t see how it’s going to look in the future with more generative AI ways of, you know, developing that abuse. I can’t imagine what that looks like in the future. So it is a difficult question.

And I guess at the moment, we are grateful as researchers that those who are experiencing it and who have managed are reporting it and talking about it publicly because it’s not their problem alone. It’s a shared problem. It’s one which is very global in scale and which we’re trying to show patterns for. And that hopefully gives a bit of solidarity to the fact that it’s happening, and we don’t really know how to stop it happening at the moment.

So contrary to what we’ve always been taught about hate, you’re saying don’t ignore it, report it.

Report it and help us understand it. And it’s not easy because you might not want to be associated with that story. And there are people who, you know, are fighting for their lives while doing their reporting. I think about Taylor Lorenz, who’s a technology reporter. She’s now at the Washington Post, but she’s gone through so much. And she’s a target of everything online, I feel like. And she’s been public about it. But speaking about your mental health as well, is something which makes you a target, too. But you should be able to do your job.

Daphne Caruana Galizia should be alive today. Why is she dead? She reported for the Panama Papers. You know, you just think, what is this all a part of? And the online harassment that she got is something else. The offline harassment, too. I mean, her dogs were killed. Her house was set on fire while she was in it with her family sleeping. You know, but she was also cyberstalked and she was harassed online. There were all these Facebook groups rooting for her to, you know, shut up. And she was abused in the street.

It’s like all these pictures of the journalists who do die, where online violence has been an element of their harassment and led to their deaths. We just don’t want that happening anymore. And it’s like, what can you do to stop it? You’ve got to put all these cases out, and we’ve got to make a global thread because people are stronger when they know others have gone through it.

And that takes me back to my very first point that I said, organizing this little panel at the Frontline Club where we talked about online violence and hearing people say, “it’s helped me to hear you talk about being burnt out because I didn’t think of it like that. I just thought it was part of the job.” You know, so many people have been told that that is the way it should be. But gendered disinformation is not okay. It’s not okay when it’s networked. We should put names to what we’re seeing out there. It’s so much bigger. I feel like we’re just at the foot of the mountain.

And for journalists listening to us right now, what would you hope this conversation brings them?

I hope solidarity, not feeling like they’re alone, and understanding that it’s part of a wider, systemic problem. And also, they’ve got rights under international human rights law. They are protected as journalists internationally.

You know, so many White House reporters we spoke to when President Trump was in office, by the way, were targeted. And that’s not normal. Don’t normalize this. You’re protected. And it’s also why at the International Center for Journalists, we have brought more of a human rights framework into the research that we do behind what’s happening. Behind the coordinated accounts. Behind all of this. So that they can see that, you know, someone is trying to connect the dots. And hopefully it inspires the actual places where that’s happening, the vectors of online violence, for those people to come forward and say, “We can help, too.”

One thing I didn’t say, which I should say, is that there are journalists like Carmen Aristegui in Mexico who continue doing hard-hitting investigative journalism. In her case, of course, she’s got 30 years’ experience as a journalist. She might be a bit more cynical about online. She might be a bit more private. Hasn’t stopped her son from being targeted in connection to her reporting on the White House in Mexico. And some of them are just hitting back by continuing to do their work. And I think that is just the most incredible thing about it for me. Of course, there are those who are chilled out of the profession who you might be seeing reporting on something different now or moving into a tangential sort of sidebar place.

But there are so many people who are still hitting it where it hurts. You know, the goal, I think, of harassing a woman online, is like, “shut up.” You know, like you shouldn’t be talking. But she’s got important things to tell you. And I think that’s a very empowering thing also to see. I shouldn’t be completely doomsday about it because there are people out there who are continuing to serve with that hard-hitting journalism. So, more power to them.
