A new digital platform, Gender Rights in Tech (Grit), has launched in South Africa, offering an AI-powered chatbot and secure evidence storage to combat gender-based violence. Developed by Leonora Tima, the initiative aims to support survivors and help them preserve crucial evidence in a country with alarmingly high rates of violence against women.
Key Takeaways
- Grit offers a free mobile app with a rapid-response help button.
- The app includes a secure 'vault' for storing evidence of abuse.
- An AI chatbot, Zuzi, acts as a trusted 'aunt figure' for users.
- The initiative is an African solution, co-designed with local communities.
- Experts caution that AI should complement, not replace, human support.
Addressing a National Crisis
South Africa grapples with some of the world's highest levels of gender-based violence (GBV). Reports indicate a femicide rate five times the global average, and national police data show that between 2015 and 2020 an average of seven women were killed every day. Leonora Tima's personal tragedy, the 2020 killing of her 19-year-old relative, who was nine months pregnant, highlighted the urgent need for new solutions.
Tima observed that such violent deaths often go unreported in local news, reflecting a societal normalization of the issue. This silence became the driving force behind Grit, an app designed to empower victims and facilitate the reporting process.
The Scale of the Problem
Gender-based violence in South Africa is a pervasive issue. Many victims are hesitant to use traditional reporting methods due to fear, distrust, or the perception that the system fails them. Technology offers a new avenue for support and intervention.
Grit's Core Features and Community Design
Grit represents one of the first free AI tools developed by African creators to tackle gender-based violence. Tima emphasizes its local roots, stating, "This is an African solution co-designed with African communities." Her team engaged with over 800 individuals in Cape Town's townships to understand how they used technology and what barriers prevented them from seeking help.
The research revealed a significant distrust in traditional routes like the police. Some women who posted about abuse on social media even faced defamation lawsuits. This feedback shaped Grit's design, aiming to offer a safer, more confidential platform.
Rapid Response and Evidence Collection
The Grit app, which is free to download but requires mobile data, has garnered 13,000 users and received approximately 10,000 requests for help in September alone. It is built around three key features:
- Help Button: A prominent circular button on the home screen initiates a 20-second audio recording and sends an alert to a private rapid-response call center. Trained operators then contact the user and dispatch help if needed. Tima notes that while some users have tested the button out of curiosity, serious misuse has been minimal.
- The Vault: This secure, encrypted digital space allows users to store evidence of abuse, such as photos of injuries, screenshots of threatening messages, or voice recordings. The vault timestamps and protects this data against loss or tampering, which can be crucial in future legal proceedings (a sketch of this timestamp-and-hash technique follows this list).
- Zuzi Chatbot: Launched recently, Zuzi is an AI-powered chatbot that offers a listening ear and advice, and points users toward local community support services.
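
Grit has not published the vault's internals, so the following is only a minimal sketch of the general technique such a feature typically relies on: encrypt each piece of evidence and pair it with a timestamp and a cryptographic hash, so any later alteration can be detected. The function names (store_evidence, verify_evidence) and the use of the cryptography library's Fernet cipher are illustrative assumptions, not Grit's actual design.

```python
# Illustrative sketch of tamper-evident evidence storage; not Grit's actual code.
import hashlib
import json
from datetime import datetime, timezone

from cryptography.fernet import Fernet  # pip install cryptography


def store_evidence(raw_bytes: bytes, key: bytes) -> dict:
    """Encrypt evidence and attach a timestamped integrity hash."""
    return {
        "stored_at": datetime.now(timezone.utc).isoformat(),
        # Hash of the plaintext: recomputing it after decryption
        # reveals whether the evidence was altered after storage.
        "sha256": hashlib.sha256(raw_bytes).hexdigest(),
        "ciphertext": Fernet(key).encrypt(raw_bytes).decode("ascii"),
    }


def verify_evidence(record: dict, key: bytes) -> bool:
    """Decrypt the record and check the stored hash still matches."""
    raw = Fernet(key).decrypt(record["ciphertext"].encode("ascii"))
    return hashlib.sha256(raw).hexdigest() == record["sha256"]


if __name__ == "__main__":
    key = Fernet.generate_key()
    record = store_evidence(b"screenshot of a threatening message", key)
    print(json.dumps({k: record[k] for k in ("stored_at", "sha256")}, indent=2))
    assert verify_evidence(record, key)
```

Keeping the hash of the original content alongside the recorded time means that whatever is decrypted later can be confirmed as byte-for-byte identical to what was captured, which is the property that matters for legal proceedings.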
"We need to earn people's trust. These are communities that are often ignored. We are asking a lot from people when it comes to sharing data." – Leonora Tima
Zuzi: The AI 'Aunt Figure'
The development of the Zuzi chatbot involved extensive community consultation. Users said they wanted Zuzi to be an "aunt figure": someone warm, trustworthy, and non-judgmental, whom they could confide in without fear of judgment.
Interestingly, Zuzi has also seen use from men. Some perpetrators have engaged the chatbot seeking help for anger management issues, while other men who are victims of violence have found Zuzi a safe space to discuss their experiences. "People like talking to AI because they don't feel judged by it," Tima explains. "It's not a human."
AI and Empathy
The design of Zuzi as an 'aunt figure' highlights a key aspect of AI development for sensitive topics: creating a persona that encourages trust and open communication, even if it cannot fully replicate human empathy.
International Recognition and Expert Cautions
Grit's innovative approach has garnered international attention. Leonora Tima and her team presented the app at the Feminist Foreign Policy Conference in Paris, hosted by the French government. The event saw 31 countries pledge to prioritize tackling gender-based violence.
Despite the growing interest, experts like Lisa Vetten, a specialist in gender-based violence in South Africa, urge caution regarding AI in trauma-centered care. She points out that large language models primarily perform "linguistic analysis and prediction" rather than exhibiting true intelligence. Vetten acknowledges AI's potential but warns of its limitations, citing instances where chatbots have given incorrect legal advice.
"I worry when they give women very confident answers to their legal problems. Chatbots can provide helpful information but they are incapable of dealing with complex, multi-faceted difficulties. Most importantly, they are not a substitute for human counselling. People who have been harmed need to be helped to trust and feel safe with other human beings." – Lisa Vetten, GBV Specialist
Vetten's concerns underscore the need for AI tools to complement, rather than replace, the critical human element of support and empathy that trained professionals provide. Survivors of abuse often require deep emotional connection and understanding.
The Future of AI and Gender Equality
The debate around AI's role in addressing gender inequality extends to its development. Lyric Thompson, founder of the Feminist Foreign Policy Collaborative, notes that attempts to raise gender considerations in AI discussions are often met with resistance or dismissal.
Heather Hurlburt, an associate fellow at Chatham House specializing in AI, agrees that AI has "enormous potential either to help identify and redress gender discrimination and gender-based violence, or to entrench misogyny and inequity." The direction, she states, is "very much up to us."
Leonora Tima believes the success of AI in combating gender-based violence hinges not just on engineering but on who designs the technology. A 2018 World Economic Forum report indicated that only 22% of AI professionals globally were women, a statistic still widely cited. Tima argues that current AI models are built on historical data that predominantly reflects the voices of men, particularly white men.
For technology to truly represent the realities of its users, Tima concludes, there must be more diverse creators: women, women of color, individuals from the global South, and those from less privileged socio-economic backgrounds. This inclusivity is vital for building AI that genuinely serves all communities.