Session
Dynamic Coalition on Gender and Internet Governance
Round Table - Circle - 90 Min
At IGF 2019, the DC GIG session served as a learning space where research on data, bodies, gender, and surveillance through a feminist lens was shared and discussed in relation to internet governance and policies. Given the very positive reception of the session and the high degree of engagement, we will use the DC GIG session as a learning space this year too.
Since the 2019 IGF, the world has changed drastically with the COVID-19 pandemic. Several governments and companies have released apps and used technology to track and monitor the spread of the disease in an attempt to control it. Though done in the name of stopping the pandemic, this data collection and surveillance proceeds with few or no policies and provisions to protect user privacy. With few exceptions, apps lack clear guidelines and regulations on data collection and retention.
This session, Future Unclear, is so titled for this very reason: we do not know what the future of data and surveillance looks like in the post-pandemic world. It is essential that this is understood and discussed in the internet governance and tech policy context so that we can advocate for stringent national, regional, and global policies. And this needs to be done with a strong gender and sexuality perspective and a feminist lens, as women, queer and trans persons, sex workers, and those belonging to oppressed communities are the most affected by increased data collection and surveillance. The session also aims to understand the impact of using apps, and of data collection through these apps, keeping in mind the wide digital gender gap that exists globally and the politics of meaningful access. A study by the OECD in 2018 found that, on average, women are 26% less likely than men to own a smartphone, and this gap is wider in South Asia and Africa. In this scenario, policies and relief measures based on data from these apps will leave out huge sections of the population. Any policies discussed and advocated for need to take this into account.
The session will be in the form of a roundtable. Speakers who are researchers on data and body, online surveillance, and gender and technology will present briefly on their findings from the COVID-19 apps, privacy policies around the same, and the concerns around human rights. A significant part of the session will be dedicated to hearing comments and inputs from the room to ensure that there is knowledge sharing and learning.
Internet governance plays a key role in shaping the evolution and use of the internet. Without women, queer, and trans persons at the table, this evolution will be neither inclusive nor evolution in the true sense of the word.
The DC GIG session is one of the key spaces at the Internet Governance Forum where gender is given priority, not just as something to increase diversity, but as something which is integral to technology development and to building an internet which protects the rights of the people at all times. In the understandable frenzy around the COVID-19 pandemic, it’s important that we do not sideline the right to privacy and other digital rights. For this, we need to discuss how the pandemic has changed internet governance and internet governance conversations on data policies, surveillance, facial recognition and other emerging issues.
The Data track examines how “the massive collection, transfer and processing of data through the application of data driven technologies by public as well as private entities pose challenges around privacy, freedom of expression and the exercise of other human rights.” The track also aims to identify best practices and approaches for the development of human-centric data governance frameworks. With so many apps and technologies being used right now in relation to COVID-19, not enough time and resources are being dedicated to ensuring data governance and privacy frameworks around them. There is a real fear that these technologies will be used to crack down on dissent by targeting activists, journalists, and those speaking up against the government.
After the pandemic, there is no telling how these systems will be used. From the past, we know that technologies and tech policies put in place as counter-terrorism measures were later made permanent and used for mass surveillance. This happened in the USA after 9/11 in 2001, and as recently as 2016 in Indonesia. With any mass surveillance and data collection system, the first ones affected are women, queer and trans persons, and those belonging to marginalised communities. It is essential that we have these conversations to enable a comprehensive discussion on data and on what policies we need to ensure a feminist, human rights approach.
Bishakha Datta - Point of View, India
Debarati Das - Point of View, India
Bishakha Datta, Point of View, India (Moderator)
Dr. Anja Kovacs, Internet Democracy Project, India (Speaker)
Joana Varon, Coding Rights, Brazil (Speaker)
Sadaf Khan, Media Matters for Democracy, Pakistan (Speaker)
Bishakha Datta, Point of View
Debarati Das, Point of View
GOAL 5: Gender Equality
Reference Document: https://bit.ly/2Vz5ki5
Report
1. Summary of Gender Report Cards (IGF 2019)
2. Data, privacy and boundary management
In her essay, speaker Dr. Anja Kovacs highlights the concept of 'privacy as boundary management' as central to whether and how we share our data, what happens with our data, what information about ourselves we want to share or not share, and so on. However, the concept of boundary management rarely figures in traditional conversations about privacy. In most dominant discourses, data is treated as a resource, separate from the medium that generates it. This has severe implications for a person's agency, privacy, and rights. Boundary management is important not only because it allows us to control what we share with others, but because this control is crucial to living a life of dignity.
3. Gendered implications of data collection by apps
In her essay, Sadaf Khan reflects on menstrual apps, how they track data, and the gendered implications around them. People's consent to data collection by apps is often not informed consent: they are not fully aware that the apps are 'authorised' to share their information, comments, and stories. What do these apps do with the data? Who handles the data? Who has access to it? How is it used? For how long is it stored?
4. Feminist values for building transfeminist futures
As Joana Varon reflects in her essay, what would the future look like if the algorithms that command our daily interactions were developed based on feminist values? What if the technologies we cherish were developed to dismantle, instead of maintain, the matrix of domination of capitalism, hetero-patriarchy, white supremacy, and colonisation? How can we build technologies based on feminist notions of consent? How can we use feminist frameworks and values to question, imagine, and design tech?
Consent must be placed at the centre of conversations around data. Data policies must treat consent and privacy not as individualistic matters but as collective ones. Policies must also take into account the economic structures that shape how tech is designed and marketed to men and women, impacting power dynamics.
Speakers:
Dr. Anja Kovacs, Internet Democracy Project, India
Joana Varon, Coding Rights, Brazil
Sadaf Khan, Media Matters for Democracy, Pakistan
Moderator:
Bishakha Datta, Point of View, India
The session directly engaged with gender, focusing on issues such as the gendered implications of data collection by technology, algorithmic biases impacting marginalised genders, the feminist principles of consent, etc.