
Interview: Dr. Anja Kovacs of the Internet Democracy Project

Image: Anja Kovacs, wearing a blue cardigan, addressing the media.

Dr. Anja Kovacs directs the Internet Democracy Project, an initiative at Point of View. Its research and advocacy efforts focus on issues of the Internet and freedom of expression, gender rights, online abuse of women, digital surveillance, cyber security, and Internet governance. As Anja says, “Women have experienced surveillance for centuries, they are familiar with being surveilled. Surveillance is about policing norms. Gender is an axis along which everybody places themselves. So it becomes fairly easy for people to see the harms of surveillance as policing of norms when you use a gender lens.”

Shikha Aleya: Anja, thank you for talking to us about surveillance, sexuality, rights and the Internet. Please tell us something about how and why you decided to focus on this field of work.

Anja Kovacs: In October 2007, I googled Web 2.0. In the ’90s, if you wanted to make a website you had to know coding. Web 2.0 is where you have content-building platforms such as WordPress, Facebook, WhatsApp. This has shifted the way everybody can create content online. It has led to wonderful, but also horrible things, because it led to the systems of surveillance we are dealing with now. I was already a Facebook user in 2007, but my engagement with the Internet was limited. As activists, so many of us were not looking at this; we were simply not engaging with the Internet. But when I started to understand what Web 2.0 stood for, I thought it was problematic that we were not looking at this. It shifted something in my head. I remember that day so clearly because what I found really took my breath away. It made me think, this is changing the world.

Shikha Aleya: The Internet Democracy Project has presented research and case studies about gender and digital surveillance on the website Gendering Surveillance. Could you give us an idea of the contextual frame you are focusing on when you use the lens of gender on issues of digital surveillance?

Anja Kovacs: We’ve been working on Internet and human rights issues for six years. The issue of privacy has become increasingly important, especially since surveillance is expanding on the Internet. What is interesting is that even though concerns about surveillance have been growing – starting with the revelations from Snowden (about the US National Security Agency’s secret data mining and intelligence gathering from the computer networks of other countries) and progressing to the Indian government’s use of the Internet for government benefits and schemes – we still do not see the same level of concern about surveillance that we earlier saw about censorship. For example, when Section 66A (of the Information Technology Act, which allowed imprisonment for offensive messages online) was struck down, we saw a real movement against censorship.

And so we have been looking for ways to make issues of surveillance more concrete for people, ways by which people can see the harms of surveillance. We believe gender is an easy, relatable way of doing this. Women have experienced surveillance for centuries, they are familiar with being surveilled. Surveillance is about policing norms. Gender is an axis along which everybody places themselves. So it becomes fairly easy for people to see the harms of surveillance as policing of norms when you use a gender lens.

Shikha Aleya: Khap panchayat bans on the use of mobile phones by women and girls are a familiar, contemporary issue around the surveillance of women in India. You undertook research on mobile phone bans by khap panchayats. Did you begin with specific assumptions about possible findings? Are there new insights that can be used as inputs to address this issue of khap diktats on the use of mobile phones by women and girls?

Anja Kovacs: When we decided to look at surveillance from a gender perspective, somehow these bans on mobile phones were one of the first things that came to mind. What we were really interested in was knowing what actually happens once the bans are put in place. The media covers the bans – but what happens after a ban is passed? This was for us one of the big questions.

We found that the bans were generally not successful and did not lead to a complete ban on cell phone use. What we also found, though, was that the issues or concerns that led to those bans are not restricted to khap panchayats but are common to wider sections of society, including teachers and college principals. It was about the space for privacy that the mobile phone allowed girls, the space to have conversations with men without parental approval. When girls were talking about their families being concerned about their using phones independently, some said, “When my brother uses his phone, my parents tell him ‘do what you want’, but when I use the phone the whole izzat (honour) of the family depends on it.”

And so rather than a complete ban, there was fairly widespread concern around not allowing young girls to use mobile phones unless supervised. So they were allowed to use mobile phones at home but not on the road, which is ironic, because, well, it’s a mobile phone! This is especially true of school-going girls. For slightly older girls, college-going girls, there was more space. But some of the arguments used by college-going girls to justify their mobile phone use were ironically also about surveillance. They would say, “My college is at a distance, so if I have a phone you can check on me.” Overall, the concern that mobile phones allow women privacy and independence was very widespread and not restricted to the khaps. What the khaps did was a reflection of the values in the society around them.

We also found that it was harder for these kinds of bans to be implemented if families had greater aspirations of upward social mobility, especially if not linked to land. What I mean is that if the family dreamt of job opportunities for their children, then there was a recognition that being able to use technology was really important. Certain aspirations of modernity made things different. ‘Distance from the road’ was a phrase the girls used: if you live close to the road, to Delhi, people are more accepting; if you live further away, people are less accepting. This sounds like a metaphor, but they meant it literally, in terms of location. “My house is quite close to the road so my parents are ok with me using the mobile phone, but if I go to my village, which is inside and further away, people do not accept it,” as one said.

On the whole, we need to be more conscious of this when we talk of the digital gender gap in India – for example, the gender gap in mobile phone ownership stands at 36%, which is one of the highest in the world. What this research shows is that even if women and girls have access to mobile phones, the ways in which they are allowed to use them replicate offline inequalities in the digital age. The government is promoting Digital India in a forceful way, but if we do not allow girls to explore the potential of these technologies to the fullest in the same way we are doing with boys, then what kind of future are we setting ourselves up for? We are preventing them from the outset from becoming equal participants in a digital society.

Shikha Aleya: You have focused on issues of online abuse and sexual harassment of women. The phenomenon of trolling, particularly in the context of trolling women, is at the forefront of conversations about abuse, bullying and threats on the Internet. As an extension of social surveillance, harassment and abuse of women, what strategy or combination of strategies can work to change this?

Anja Kovacs: Three things come to mind. The first one is that I like the fact that you make a link between surveillance and harassment. Surveillance is about policing of norms and policing people who fall outside of that – whatever those doing the surveillance say is the acceptable norm. The sorts of abuse we see of women on Facebook or Twitter are about policing norms. They are ways of getting women out of the public sphere. It is not just about agreeing or disagreeing, it is about policing whose voice will be heard and who is accepted as a legitimate participant. Just as women are not automatically considered legitimate participants in the public sphere offline, the attempt is made online to replicate this.

The second thing that comes to mind is that what we call trolling is not really something that’s accidental – there is actually a structural component to this, and the people who are particularly vulnerable are people who are generally more vulnerable or marginalized in society. This is not to say that men don’t get abused, but the kind of abuse men get is qualitatively different from the abuse women get. So women very quickly get abused about their sexuality – even if the point being made has nothing to do with sexuality.

At the Internet Democracy Project we do not use the word trolling. Trolling makes it sound like an individual issue, but it is not an individual issue, it is a structural issue. By using the term abuse instead, we hope to bring out the structural aspect, the weight of the issue a little bit more.

The third point is about strategies. If you recognize that the problem is a structural one, then that somewhat shifts the way we look at solutions. So for some forms of verbal online abuse, for very severe forms, you may want to go to the police. But we need to look at non-legal forms to fight back as well.

Two things emerged from our research that I want to highlight. The first one is that it is so important to have a community online. This raises questions such as – what can we do to foster community? I would be really interested in having many more feminists think about what kind of collective organizing we can do online, such as is done by feminists in the offline space. Are there things we can learn from them?

Second, I also think we need to look at how we can use technology to create communities on the net that promote civility, the middle ground. What can we do to have the voices of the people in the middle be given more prominence? When we look to Facebook or Twitter for solutions, we often ask them to censor on our behalf. Censorship does not do well for women’s rights. We should ask them to give us tools to create civil communities, for healthy forms of interaction. In big online games with thousands of players, people have already started to experiment with strategies – giving players ways to indicate amongst themselves who amongst them is participating in a positive way and who isn’t. Something like Periscope (a live video streaming app) is doing this: while live video streaming is happening, you can flag it if there is inappropriate content or if someone is threatening harm to themselves or others. It might at the moment still be possible to rig this, but I think these experiments, looking at how you can empower users themselves, are important.

And there is one final point I want to emphasise: we need to be really careful that we focus on women’s empowerment when we discuss how to deal with online abuse. A lot of what goes in the name of women’s online safety is actually restricting – don’t do this, don’t do that, don’t put your pictures up online, don’t talk to strangers. I don’t know about you, but as a professional today, I end up talking to strangers online all the time. A lot of the advice that is given in the name of women’s safety is simply not helpful. You see the effect of such simplistic recommendations when you go to the police to report online abuse or harassment, when they ask, for example, why did you put up pictures? Especially as feminists we must be conscious of this: our solutions should be about expanding our space, not restricting our behaviour.

Shikha Aleya: Between surveillance and security is a vast territory, and the former does not automatically lead to the latter. Facebook has a ‘report abuse’ button that is easily used as a tool to abuse. What strategies would you suggest (to social media platforms / service providers) to take an effective gendered approach to monitoring surveillance and security on the Internet? For example, in the 2015 case of Preetha G, as reported in media accounts, she was harassed and abused for posting her views on Facebook. Her abusers ironically used the ‘report abuse’ button to have Facebook suspend her account and then went on to create a false page ‘Preetha the prostitute’.  

Anja Kovacs: If we ask for censorship by intermediaries, it will come back to haunt us. The case of Preetha is a good example. But Facebook’s policies censor gendered content really often, harming both women activists and people trying to promote women’s rights. When Point of View’s sexuality and disability website tries to post content on Facebook, it gets taken down repeatedly. Then each time it has to be explained to Facebook. This is what I tried to say earlier: if we are going to ask intermediaries to take decisions on what to censor on our behalf, it will harm us.

The reason why the report button ended up working against Preetha is that Facebook and every one of these platforms will be much more responsive when their bottom line is at stake than when women’s rights are at stake. Preetha was reported because she wasn’t using her real name – as an activist, she did not want to use her caste name. Facebook says the use of the real name increases security, but we have been part of a campaign petitioning Facebook to change the real name policy, as in practice it often hurts those who are already marginalised. (An analysis of Facebook’s response to the campaign can be found here.) If you give Facebook a really good reason, they will now allow you to not use your real name on the platform, though you still need to submit proof to the company that you are who you say you are. But if you use the real name that you use across a variety of platforms, it helps them build a better profile of you. It’s their bottom line that really drives this policy. We shouldn’t be surprised, or disappointed either, that that’s how Facebook works. We’re dealing with a business after all, and we know that making a profit will be their first goal. But for that reason, we also need to start asking for a different type of intervention from them, to think more carefully about what we want them to do – not to censor on our behalf, but to give us tools to create a different kind of community ourselves.

Shikha Aleya: India is currently in the midst of legal processes on the matter of privacy and rights with respect to Aadhaar, and of debate over mass surveillance projects including the Central Monitoring System (CMS), which is live in Delhi, New Delhi and Mumbai, DRDO NETRA and Lawful Intercept. What are the implications for a gendered frame of understanding the issues involved?

Anja Kovacs: Regarding CMS, NETRA etc., we have no evidence of misuse at the moment; in fact, we know very little about how these technologies are being used, but there are indications from Snowden that where data is collected very broadly, it is often used for purposes that really are overly broad.

And so, at the very minimum, what we need in place are very strong checks and balances. What should surveillance be allowed to do, and who should have insight into what surveillance is allowed to do? When we ask for greater transparency, the intelligence agencies will always say: well, we can’t tell everybody, that would defeat the purpose. But oversight doesn’t mean everyone needs to have full insight. I don’t need to know all the details of what data the intelligence agencies are collecting and how they are using it. In a democracy, parliament should at the very least surely be able to check this, though. So what we need is a strong system of checks and balances, and the problem is that we don’t have such strong oversight methods in India at the moment. The intelligence agencies are only under the oversight of the government, not of the parliament, for example.

What should these monitoring systems, such as the CMS, be allowed to do? With many of these, what they do is actually look at patterns of data and then try to see if there is a problem somewhere. This reverses the earlier process, where there had to be a suspicion about someone before law enforcement looked at their data. Now there is a de facto criminalization of everybody: as our data is constantly scrutinized, we are all already treated as potential criminals. Those who are doing such surveillance might be looking at patterns in an anonymised way – but even then, they can start looking, without any warrants, at who is saying what whenever they decide that that is what they want to do. In a democratic society we really need to debate whether that is the best way forward.

Where Aadhaar is concerned, we are starting to have more indications of possible gendered outcomes. For example, there have been news reports from Madhya Pradesh that the compulsory use of Aadhaar for people who want to access free medicines and medical check-ups under the government’s AIDS control scheme has actually stopped some people from trying to avail these benefits. As they are worried that their identity will be made public, and that they will consequently be stigmatized by society, they prefer to turn to private hospitals instead. There are certain populations that are just much more vulnerable, that are likely to be much more harmed by any data leakages. So linking, without giving proper guarantees for the protection of people’s data and rights, has grave consequences for people who are vulnerable. Also, for victims of sexual trafficking, so much struggle comes from social stigma. Imagine that the data in a database linked to rehabilitation is made public just as you are trying to find your feet again. The absence of privacy and data protection, in most cases, harms the people who are already vulnerable. So there is a reproduction of vulnerability in the digital age because rights are not properly protected.

I think the way surveillance of our behaviour and movements is spreading to all aspects of our lives, and is questioned less and less by people, is a replication of what happens traditionally with women’s bodies. The government saying people do not have an absolute right to their bodies links to this, that in practice women’s bodies are regulated, controlled and directed by other people all the time. Now we have digital mirrors or data mirrors of our bodies all over the Internet. The digital body or data body of an individual is the data of where your physical body was, what it was thinking, doing. In the way that offline surveillance is meant to track our physical bodies, in the digital age there is data to track as well. And so, in a way, the government has said: we have the right to surveil this too. For women, this is not new; this has happened to women historically all the time. What the government has now made explicit is that it is claiming the right to do this, and claiming this right over all people.

It’s a fundamental shift in the relationship between citizen and state – and a problematic shift, though you can only see this if you consider gendered social surveillance to be problematic as well. The government will have one data body that they think of as Anja – and Facebook will have another data body they think of as Anja – and they are taking decisions about Anja based on that data body. The problem is that when they are taking decisions based on what they see as our data body, that image is not complete or correct. This links to the online abuse of women: if you face a lot of abuse, then what you say may be affected by that, and there will be gaps and silences in the data available about you. So if that information is what is used to make a decision about who you are – it is partial or skewed information.

If others take decisions on our behalf based on what they see of our data body and we don’t know how they see us and have no idea of the decisions they are making about us, that’s dangerous. In a democracy these are things that we need to discuss and debate as citizens. We need to discuss when it is ok to do this, and how, and when and how it isn’t.

Cover Image: The Internet Democracy Project