Automating Justice, Designing Power: Reflections from the Commission on the Status of Women

By Mallory Knodel

This week I spoke at the 70th session of the United Nations Commission on the Status of Women, in a panel titled “Automating Justice: Can Artificial Intelligence Increase Women’s and Girls’ Access to Justice?” The recording is available on the UN’s WebTV. I also delivered a short intervention at an EU side event on Preventing and Combating All Forms of Cyber Violence Against Girls, my reflections on which you can read on GenderIT.

On the topic of AI, over the past several years I’ve written guidance for two main audiences: courts navigating AI systems in judicial contexts, and technology companies building the tools themselves. This talk was different: it was guidance for feminists, civil society leaders, technologists, and advocates working toward gender justice and equity. AI governance is no longer a niche technical issue; it is shaping the systems that structure everyday life.

I came to the discussion as a technologist with nearly two decades of experience building internet infrastructure and now working on AI systems. Through my work with the Social Web Foundation, I also focus on emerging digital platforms and the infrastructure decisions that determine who benefits from technology and who bears its risks. I shared three reflections.

First, tech governance starts long before regulation.

AI governance doesn’t begin when a system is deployed; it begins in design. We can and should question the intentions behind systems design, intervene in their development, and monitor their deployments.

In research I’ve done for US courts, one salient example is predictive policing. These tools promise efficiency by analyzing historical crime data. But historical data reflects historical inequality, and when systems optimize on the past, they risk reproducing its injustices. The intention was to make policing easier, which the technology might actually achieve, but it comes at the expense of justice itself. Another panelist, from Equality Now, said in very strong terms that bias deepens gender imbalance.

AI systems don’t “generate” new, more equitable realities. Generation is reproduction: it mirrors existing patterns in data from the past. If that data encodes discrimination, both the analysis and the generated outputs will scale it.

Bias is not a technical glitch at the margins. It can enter at every stage: design choices, training data, validation processes, and deployment contexts. Governance by design means addressing these risks before systems negatively impact people’s lives.
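To make that feedback loop concrete, here is a deliberately toy sketch in Python. The numbers, areas, and patrol rule are all hypothetical, not drawn from any real policing system; the point is only that a rule which “predicts” from skewed enforcement records reproduces the skew.

```python
# Hypothetical example: two areas with the same underlying offending
# rate, but area "A" was patrolled twice as heavily, so it has twice
# the recorded arrests.
history = [("A", 1)] * 200 + [("A", 0)] * 800 + \
          [("B", 1)] * 100 + [("B", 0)] * 900

def recorded_rate(records, area):
    """Share of records in `area` that ended in an arrest."""
    outcomes = [hit for a, hit in records if a == area]
    return sum(outcomes) / len(outcomes)

# A rule that allocates patrols in proportion to past arrest rates
# sends twice the patrols to A, which generates more arrests in A,
# which raises A's recorded rate further: a feedback loop that
# reproduces the original enforcement bias rather than predicting crime.
for area in ("A", "B"):
    print(area, recorded_rate(history, area))  # A 0.2, B 0.1
```

Real systems are vastly more complicated, but the loop is the same: optimization on the past locks in the past.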

Second, equitable tech democratization needs feminism.

We are living through a remarkable moment: extraordinarily powerful AI tools are widely accessible. They help people summarize information, draft documents, translate dense material, and navigate complex systems.

This creates real possibilities for expanding access to justice, but access alone is not empowerment. The room closed with comments from Estonia, a fully digitalized European state. Governments should hold their own use of AI systems to the highest possible standard. Estonia’s representative said that digitalization and AI are not about efficiency or economics alone; they are the point at which we must ask, “Who will benefit, and who is left out?” These are fundamentally questions of power, responsibility, and human rights, which feminism is well placed to answer.

We’ve seen this dynamic with social media platforms, which were celebrated as democratizing technologies that would ultimately lead to social change. Yet imperfect implementations of this idea, within the context of surveillance capitalism, meant that data generated by billions of users instead fueled advertising systems optimized for profit, often at the expense of privacy and freedom of expression.

AI risks repeating this pattern. In my research on privacy-preserving AI, such as in encrypted environments, I examine how technical architecture can protect sensitive conversations and ensure people seeking help are not exposed to new forms of surveillance or harm.

If AI systems are to support survivors, legal aid seekers, and marginalized communities, they must include strong technical guardrails: privacy protections, meaningful data minimization, secure system design, and limits on exploitative data extraction. Feminist principles can guide us toward a democratization of technology that does not exploit the marginalized.
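To ground what “meaningful data minimization” can mean in code, here is a minimal sketch, assuming a hypothetical intake flow for a legal-aid chatbot; the field names, the allowlist, and the redact() helper are illustrative, not any real system’s API.

```python
import re

# Hypothetical intake fields; a real deployment would define these
# with the community it serves.
ALLOWED_FIELDS = {"question", "jurisdiction"}

def minimize(request: dict) -> dict:
    """Forward only the fields the model needs; drop identifiers entirely."""
    return {k: v for k, v in request.items() if k in ALLOWED_FIELDS}

def redact(text: str) -> str:
    """Best-effort removal of obvious identifiers (emails, phone numbers)."""
    text = re.sub(r"\S+@\S+", "[email]", text)
    text = re.sub(r"\+?\d[\d\s().-]{7,}\d", "[phone]", text)
    return text

incoming = {
    "name": "Ada Example",       # never needed by the model
    "email": "ada@example.org",  # never needed by the model
    "jurisdiction": "Nairobi",
    "question": "Can I get a protection order? Call me at +254 700 000000.",
}

safe = minimize(incoming)
safe["question"] = redact(safe["question"])
print(safe)
# {'jurisdiction': 'Nairobi', 'question': 'Can I get a protection order? Call me at [phone].'}
```

The narrow allowlist is the design choice that matters: the default is to exclude, so new fields leak nothing until someone deliberately adds them.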

Third, governance happens through standardization and the setting of global norms.

AI governance is often framed as a regulatory problem, but many of the most consequential decisions happen earlier: in technical standards, system architectures, procurement rules, and infrastructure design.

The global internet offers an important lesson. A shared set of open technical protocols allowed networks around the world to interconnect. That technical coordination also produced a unique governance structure: multistakeholderism, in which a multitude of expert stakeholders debate norms and evolve standards.

However, AI lacks comparable interoperability. The largest AI foundation models rely on proprietary data ecosystems and compete with one another; the AI business model rewards enclosure. As I mentioned earlier, AI was largely built upon enclosed, walled-garden social media platforms. As a result, technical interoperability, and the governance structures that accompany it, remain drastically underdeveloped.

Recent work on human rights in standards by the Office of the High Commissioner for Human Rights has emphasized the governance instruments that shape how rights are realized in practice: standards bodies, procurement frameworks, and regulatory regimes, all of which must consider human rights from the start. Perhaps this is our opportunity for interoperability, and thus for multistakeholder governance of AI.

The future of AI is not predetermined. These technologies will reflect the values embedded in their design and governance. Ensuring they expand justice for women and girls requires vigilance, participation, and collective responsibility.