Who CODEs the digital rules for gender justice?
- Nite Tanzarn
- Dec 31, 2025
- 7 min read

Can a digital tool be neutral? A government launches an online portal for reporting gender-based violence. The technical achievement is clear. Now consider the woman in a rural community with poor network coverage. She might share a mobile device with her husband or brother. Does she know the portal exists? Can she access it safely, without fear of someone seeing her search history in the phone’s log? Does the form require a stable internet connection she cannot maintain? Does it ask for details she is not prepared to recount in a digital form? Does her reported data lead to more police patrols in her area, or does it simply disappear into a ministerial dashboard to become a percentage in an annual report? The tool itself may be neutral, but its impact never is. Its design, deployment, and consequences are deeply political acts.
This example frames the central dilemma. Digital tools now define gender policy work across Africa. They present great opportunity and serious risk. Their ultimate function depends on who controls them. These tools can reinforce existing power structures or help dismantle them. The difference lies not in the code, but in the intention behind it. It lies in who was in the room when the requirements were drafted. It depends on which voices were considered primary users and which were statistical afterthoughts. A tool built for a government ministry to monitor trends is a different tool than one built for a survivor to secure safety. They may share a database, but they serve opposing masters. One serves the state’s need for knowledge. The other serves the individual’s need for agency. The collision between these two purposes reveals where power actually resides.
Data is not truth
The question of who controls these tools is the core of digital gender policy analysis. It forces us to look beyond the dashboard. When we map disparities with geographic information systems, we must ask who defined the categories on this map. What local realities did those categories miss? A district may be coded as having high school dropout rates for girls. The map shows a crisis area. But the underlying data may not capture why. It may not record the prevalence of early marriage driven by drought and economic hardship. It may not reflect the lack of sanitary facilities. It may not account for the long, unsafe walk to school. The map provides a location. It does not provide a cause. It offers a where, not a why. This is the inherent limitation of quantified analysis.
A machine learning algorithm can process thousands of policy documents to find biased language. But who trained the algorithm to recognise bias in the first place? What assumptions about gender, family, or work are embedded in its training data? If the algorithm learns from historical policy documents, it may simply learn to replicate their ingrained prejudices. It may flag explicitly discriminatory phrases while missing subtler structural inequities. The real analysis begins where the tool’s output ends. It begins when human judgement interrogates the pattern the machine found. The tool offers a suggestive correlation. The analyst must demand the causal link.
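To make that concern concrete, here is a minimal, purely illustrative sketch. The sentences, labels, and the supposed corpus of historical policy text are all invented; the point is only that a classifier trained on historically labelled documents can flag just the kinds of bias its labellers already recognised.

```python
# Minimal sketch: a classifier trained on historically labelled policy text
# learns whatever its labellers called "biased" and nothing more.
# All sentences and labels below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Historical training data: the labels encode the labellers' assumptions.
sentences = [
    "Married women may not open an account without a male guarantor",  # explicit -> labelled biased
    "Applicants must provide a national identity card",                # labelled neutral
    "Household heads are eligible for the input subsidy",              # structural exclusion, but labelled neutral
    "Girls who become pregnant shall be expelled from school",         # explicit -> labelled biased
]
labels = [1, 0, 0, 1]  # 1 = flagged as biased by the historical labellers

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(sentences, labels)

# A subtler structural clause was never labelled biased in the training set,
# so the model has no basis for flagging similar language in new documents.
new_policy = ["Only registered household heads may claim the land title"]
print(model.predict(new_policy))   # likely [0]: the inherited blind spot persists
```

The subtler clause about "household heads", never labelled as biased in the training data, passes through unflagged: exactly the blind spot described above.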
Digital analysis promises more data. We get real-time sentiment from social media monitoring. We get spatial patterns from mapping software. We get predictive trends from large data sets. This is valuable. It can show a stark gap between policy rhetoric and public experience. It can reveal that a national campaign against teen pregnancy is being mocked or ignored online. It can show service clusters are in urban centres while need is concentrated in rural regions. Yet data is not truth. It is a reflection of the choices made by the people who collect it. It reflects what they decided was important enough to measure. If we only measure what is easy to count, we will ignore what is hard to see. The lived experience of inequality often resides in the qualitative, the contextual, the narrative. The most robust analysis uses digital tools to find the pattern, and human insight to explain it. It uses surveys to get numbers, and focus groups to give those numbers meaning.
When the tool becomes the weapon
This digital shift has transformed advocacy. Research is no longer confined to academic libraries or institutional archives. It happens in collaborative online workspaces linking lawyers in Nairobi, researchers in Accra, and community organisers in Lagos. A policy brief can become a social media thread, an infographic, a voice note shared on WhatsApp groups. This multiplicity is powerful. It can amplify a marginalised voice into a national conversation. It can translate complex legal analysis into accessible formats for community dialogue. A woman in a remote village can hear a policy explained in her own language through a simple audio file. This is the democratisation of information.
But power invites a counter-reaction. Digital tools also enable sophisticated surveillance, coordinated online harassment, and targeted internet shutdowns. Governments and other actors can monitor digital rights campaigns. They can identify key organisers. Women advocates face particularly virulent forms of online abuse designed to silence them. This presents a stark question for every advocate. As our work moves online, how do we protect the most vulnerable from digital retaliation? How do we secure our communications, our sources, our data? Security is not a technical add-on. It is a prerequisite for ethical engagement.
We must build digital capacity with this dual reality in mind. Training cannot just be about how to use data visualisation software. It must be about digital citizenship, critical thinking, and personal safety. It must address the real gender gap in access, skills, and autonomy. Women across Africa are less likely to own a smartphone. They are more likely to have restricted access to family devices. They often face higher barriers to digital literacy. A digital analysis that only hears from those who are already online is not analysis. It is amplification of existing privilege. It mistakes the digitally visible population for the whole population. Effective advocacy must bridge this gap. It must design for low bandwidth. It must prioritise offline feedback mechanisms. It must recognise that the most important stakeholder may be the one least likely to appear in a dataset.
Who builds the capacity?
Digital capacity building is a foundational justice issue. In the African context, it requires more than a series of workshops in capital cities. It requires physical infrastructure, affordable data policies, and devices designed for community sharing. It requires content in local languages and interfaces that respect oral traditions. A text-heavy platform excludes those more comfortable with oral communication. An app that requires high-resolution video uploads excludes areas with weak networks. True innovation must be context-specific.
We must innovate around access. This means supporting community digital hubs run by local organisations. It means advocating for public data subsidies for essential civic information platforms. It means developing and promoting offline-first data collection apps that sync when a connection is found. It means valuing the community knowledge holder who may not use a keyboard but whose insights are vital. Capacity building must be reciprocal. The technologist from the city must learn from the community organiser in the village. Each possesses critical expertise. One understands the technology. The other understands the context. Neither is sufficient alone.
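To show what the offline-first idea mentioned above can look like in practice, here is a minimal sketch. The endpoint URL, table schema, and record fields are placeholders, not a real system: every record is written to a local SQLite queue first, and a sync step pushes whatever is queued only when a connection happens to be available.

```python
# Minimal offline-first pattern: store every record locally first,
# then push queued records whenever connectivity is available.
# The endpoint URL and record fields here are purely illustrative.
import json
import sqlite3
import urllib.request

DB_PATH = "field_records.db"
SYNC_URL = "https://example.org/api/records"  # placeholder endpoint

def init_db():
    con = sqlite3.connect(DB_PATH)
    con.execute("""CREATE TABLE IF NOT EXISTS queue (
                     id INTEGER PRIMARY KEY,
                     payload TEXT NOT NULL,
                     synced INTEGER NOT NULL DEFAULT 0)""")
    con.commit()
    return con

def save_record(con, record: dict):
    """Always succeeds locally, regardless of network state."""
    con.execute("INSERT INTO queue (payload) VALUES (?)", (json.dumps(record),))
    con.commit()

def try_sync(con):
    """Push unsynced records; failures simply leave them queued."""
    rows = con.execute("SELECT id, payload FROM queue WHERE synced = 0").fetchall()
    for row_id, payload in rows:
        try:
            req = urllib.request.Request(
                SYNC_URL, data=payload.encode(), method="POST",
                headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req, timeout=10)
            con.execute("UPDATE queue SET synced = 1 WHERE id = ?", (row_id,))
            con.commit()
        except OSError:
            break  # offline or server unreachable; keep the record and retry later

con = init_db()
save_record(con, {"district": "example", "indicator": "school_attendance", "value": 41})
try_sync(con)
```

The design choice matters: because saving and syncing are separate steps, a failed upload never loses a record, and the person collecting data does not depend on the network being up at the moment of collection.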
True collaboration uses digital platforms to dissolve borders, not to create new digital elites. A shared document between activists in different countries is powerful. But we must ask if our digital spaces are truly inclusive. Do they value the participation of the woman who can only join a strategy call via a shared phone for thirty minutes? Does our process honour her input as much as the polished written submission from a capital city-based NGO? Capacity is not just technical skill. It involves creating structures and cultures where diverse forms of expertise can thrive on their own terms. It is about designing processes that are flexible enough to accommodate different levels of access without compromising the quality of contribution.
The ethics of the data point
Therefore, the central constraint in digital gender policy is not technological. It is ethical. We handle sensitive personal data on violence, discrimination, and health. This demands more than password protection and encryption. It demands a radical respect for the people behind the data point. Informed consent must be a meaningful, ongoing conversation, not a tick box exercise on a form. People must understand how their information will be used, who will see it, and what risks they might face. They must have the right to withdraw their data without penalty.
Data anonymisation is a moral duty, but it is also a technical challenge. Removing names is not enough. Other details – location, age, specific circumstances – can combine to re-identify individuals in small communities. We must constantly ask: are we extracting information, or are we fostering a dialogue? Are we treating people as subjects of study or as partners in inquiry? The choice we make defines the legitimacy and justice of our entire work. Extractive data practices mirror extractive economic practices. They take value from communities and provide little in return.
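To see concretely why removing names is not enough, a rough k-anonymity check counts how many people share each combination of quasi-identifiers. The records below are invented and the district names are used only as examples: any combination that describes exactly one person is, in effect, a name.

```python
# Minimal re-identification check on quasi-identifiers (invented data).
# If a combination of district, age band and incident type describes
# exactly one person, the "anonymised" record still points straight at her.
import pandas as pd

records = pd.DataFrame({
    "district": ["Kisoro", "Kisoro", "Kisoro", "Moroto"],
    "age_band": ["20-29", "20-29", "30-39", "20-29"],
    "incident": ["economic", "economic", "physical", "physical"],
})

group_sizes = records.groupby(["district", "age_band", "incident"]).size()
k = group_sizes.min()
print(group_sizes)
print(f"smallest group size k = {k}")
if k < 2:
    print("At least one combination identifies a single individual; "
          "coarsen or suppress these fields before release.")
```

In practice the response is to coarsen the fields (wider age bands, larger geographic units) or to suppress the rarest combinations before any dataset leaves the organisation.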
The tangible risk is a digital colonialism of data: information flows in one direction, from communities to distant analysts in capitals or foreign institutions. No clear benefit returns to the community. No actionable insight is fed back. The data is used to write reports, secure further funding, or build academic careers, while the situation on the ground remains unchanged. This practice does not simply corrupt the data. It reaffirms an extractive relationship where communities provide value and receive none in return. To counter this, we must design for reciprocity from the outset.
What does the community get back from this analysis? Does it receive clearer, actionable insights about its own situation? Does it gain leverage in negotiations with local authorities? Does it build its own internal capacity to collect and use data? The tool must serve the people it measures. Its success metric should be their strengthened agency, not the number of data points collected.
What happens next?
The future will bring more complex tools. Artificial intelligence could model the potential impacts of different policy choices. Blockchain-based systems could track and verify government spending on gender commitments. Predictive analytics could try to identify areas at highest risk of gender-based violence. These tools will magnify our intentions. They will automate and entrench bias if we are not vigilant. They will create unprecedented transparency and accountability if we demand it and design for it. Our primary investment must remain in the people who steer this technology. We need a new generation of policy analysts who are technologically adept, ethically grounded, and politically astute. They must read code as critically as they read legislation.





Who CODEs the digital rules? Who is included? Who is excluded? Key questions to ask now, not to lament later.