
Safeguarding victims of domestic abuse


Researchers at Alfaisal University are working to develop new technologies to help support victims and those at risk

The Covid-19 pandemic has forced people in communities across the world to stay home and reduce social contact in order to prevent the spread of the virus. But for some, home is not a safe haven.

Reports published by the United Nations warn of an alarming upsurge in domestic violence and abuse since the pandemic began. The rise in cases has been recorded across the world, with travel restrictions and “stay at home” laws leaving victims at risk from abusive partners. 

At present, the UN estimates that less than 40 per cent of women who experience violence seek help or report the crime. With support from innovative technologies, researchers at Alfaisal University are working to develop safer methods of detecting abuse and supporting victims at home. One project is led by Dr Abd-Elhamid M. Taha, an assistant professor in electrical engineering at the institution.

Dr Abd-Elhamid is working to create an alert system that can be triggered during domestic conflicts without making the perpetrator aware or putting vulnerable household members at greater risk. “We are looking into the possibilities of using artificial intelligence to recognise conflict or instances of abuse based on some kind of passive identification process,” he explains.

To achieve this, Dr Abd-Elhamid is drawing on his background in the internet of things. Smart technologies such as voice-activated home hubs have provided new outlets for users to ask questions or seek help from external services. But it’s often difficult for domestic abuse victims who live with their abuser to use such services safely. Similarly, those at risk might not have their own mobile phone or personal internet access. “The big question for our research is: ‘How can we create a way of doing all of this safely without creating further risk of harm?’” says Dr Abd-Elhamid.

His approach to the research focuses on “affective sensing” – a method that brings together computing and social influences, such as emotion. “Simply put, it’s having emotion as an input to the computing process – what we sometimes refer to as emotional artificial intelligence, or emotional AI,” Dr Abd-Elhamid explains.

The vision is to create an AI program that can pick up on potential conflict subtly, learning from previous examples of dangerous domestic situations gathered across a range of different households. Crucially, Dr Abd-Elhamid aims to develop a non-intrusive system, one that cannot be exploited to allow third parties to spy on users by listening for particular words or recording conversations. 

As a senior member of the Institute of Electrical and Electronics Engineers (the world’s largest technical professional organisation dedicated to advancing technology for the benefit of humanity), Dr Abd-Elhamid is part of an international working group on emulated empathy in autonomous and intelligent systems. He is thus well versed in the ethics surrounding AI and voice-recognition software, and he acknowledges that this project has its challenges.

“The data privacy concerns linked to this kind of technology are quite valid,” he says. “We have to balance the objective – which, in this case, is making sure that the vulnerable members of society are looked after – against the fears we have around a ‘Big Brother’ society where too much personal information is given away. In these circumstances, the way that we break it down is to first make sure that the technology is achieving what it’s supposed to, then move forwards and tackle the privacy concerns that would arise if such a technology were rolled out.”

His team has already trialled voice-recognition software that “is not listening in on you all the time but picks up when certain activities begin to take place”. Using a publicly available dataset of parliamentary exchanges, one of these trials analysed conversations to assess when they became heated. The software was triggered, for example, by consistently loud or distressing noises. “In that situation, the technology begins to understand there’s some negative activity taking place and it goes into an active identification mode. From there, it could reach a new step where it can recognise whether this is a benign or a conflict activity.

“We were able to mark up different aspects of the language exchange without actually having to understand what was being said, which is particularly interesting because a lot of the time when we’re doing things in machine learning and AI we tend to get stuck with a model that works in English but not in Arabic, or vice versa,” he adds. “It shows there are features we can work with that translate across different cultures.” 
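The two-stage behaviour described above – a passive mode that inspects only content-agnostic signals such as loudness, handing off to an active identification mode when negative activity is sustained – can be sketched in a few lines. This is a minimal illustration, not the team's actual system: the function names, the RMS threshold, and the number of sustained frames are all assumptions chosen for the example.

```python
import math

# Assumed, illustrative parameters -- not values from the research.
LOUDNESS_THRESHOLD = 0.5   # RMS level above which a frame counts as "loud"
SUSTAIN_FRAMES = 3         # consecutive loud frames needed to trigger

def frame_rms(frame):
    """Root-mean-square energy of one audio frame (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def scan(frames):
    """Stay 'passive' until loudness is sustained, then switch to 'active'.

    Only frame-level energy is inspected -- no words are recognised or
    stored, mirroring the content-agnostic design described in the article.
    Returns the final mode and the index of the triggering frame (or None).
    """
    consecutive_loud = 0
    for i, frame in enumerate(frames):
        if frame_rms(frame) >= LOUDNESS_THRESHOLD:
            consecutive_loud += 1
            if consecutive_loud >= SUSTAIN_FRAMES:
                return "active", i   # hand off to active identification here
        else:
            consecutive_loud = 0
    return "passive", None

# Example: quiet speech followed by a sustained loud exchange.
quiet = [0.05, -0.04, 0.06, -0.05] * 4
loud = [0.7, -0.8, 0.75, -0.7] * 4
mode, triggered_at = scan([quiet, quiet, loud, loud, loud, quiet])
```

Because the trigger depends only on energy, not vocabulary, the same sketch works regardless of the language being spoken – the property the team highlights when noting that their features translate across cultures.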

Dr Abd-Elhamid believes the software could one day be incorporated into the operating system of mobile phones. “It makes sense when your phone can already measure your heart rate, VO2 and other physical attributes; why not mental well-being and safety, too? The hope is that eventually all these things will run by themselves through an ordinary app and become normalised in that way,” Dr Abd-Elhamid says.

The key to creating a product that works both safely and ethically is collaboration with colleagues across other subjects and disciplines. Members of the engineering team have worked closely with psychiatrists and social scientists with expertise in victim and perpetrator profiles. The work is further supported by Alfaisal’s designated office of research and graduate studies, which provides financing for projects such as Dr Abd-Elhamid’s through its Covid-related research fund. The office also assists researchers with technology transfer and external contacts to help further the research during and after publication. “The campus has the positive atmosphere of a start-up in many ways,” Dr Abd-Elhamid says.

Dr Abd-Elhamid is realistic in his expectations, explaining that “one piece of technology cannot solve the global tragedy that is domestic violence and abuse”. But the more we learn about domestic abuse and the more technology and AI algorithms advance, “the more useful and varied tools we have at our disposal to help the most vulnerable members of our community”, he says.

Find out more about Alfaisal University.
