
UC Berkeley professor using artificial intelligence to help identify domestic violence


Artificial intelligence, or AI, is one of the hottest topics going.

AI has also been described as revolutionary with the promise to do everything from curing cancer to changing how we drive.

Now, a UC Berkeley study suggests the tech may be able to help predict whether a patient is being abused by their partner.

“This is actually an example of machine learning being used for things that humans don’t do really well right now,” said UC Berkeley assistant professor Irene Chen.

Chen says her team used data from past patients involved in abusive relationships to create an algorithm that identifies potential high-risk victims from radiology reports and medical records.

The result is an artificial intelligence tool that may be able to flag a potential victim years before the most visible signs of domestic violence.

“Maybe there is a clinical history of a lot of risk factors, like you get into a lot of accidents, or you have substance abuse or you have mental health issues that could all put you at higher risk for domestic violence, or the pattern of your injuries as seen from your x-rays,” she said.
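For illustration only, here is a minimal sketch of how a risk model could be trained on coded clinical features like those Chen describes. The feature names, synthetic data, and logistic-regression model are assumptions made for demonstration, not details of the UC Berkeley study.

```python
# Illustrative sketch only: a simple risk classifier trained on coded clinical
# features. Feature names, synthetic data, and the model choice are assumptions
# for demonstration; they are not details of the UC Berkeley study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000

# Hypothetical per-patient features:
# [prior injury-related visits, substance abuse flag,
#  mental health flag, injury-pattern score from radiology reports]
X = np.column_stack([
    rng.poisson(2, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.random(n),
])

# Synthetic labels: risk increases with the features above.
logits = 0.6 * X[:, 0] + 1.2 * X[:, 1] + 0.8 * X[:, 2] + 2.0 * X[:, 3] - 3.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Flag patients whose predicted risk exceeds a screening threshold.
risk = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, risk), 3))
print("Flagged for follow-up:", int((risk > 0.5).sum()), "of", len(risk))
```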

Chen added the algorithm has already been successfully tested at UCSF Hospital. The goal is to give doctors and advocates a tool that can help them identify and help victims sooner.

With a shortage of trained domestic violence counselors, the tech could be a game changer.

“Learning from those types of patients, we can better understand the patterns of how domestic violence victims manifest in the healthcare system,” she said.

Next Door Solutions to Domestic Violence in San Jose works to get abuse survivors help.

While the organization thinks the technology could be helpful, Next Door Solutions manager Erica Villa worries the data could produce false positives and have unintended consequences.

“My worry is using this technology to identify demographics, identify ethnic groups, and will that then create a stereotype to identify victims of domestic violence,” she said. “Providing the resource to everyone so that when they are ready to seek help, they know where to go.”

They’re questions that Chen says she has thought about and is attempting to address. She admits no technology is perfect but hopes the tech can offer one more tool to save lives.

“What my research focuses on is how can we make sure AI is used to improve healthcare for everyone?” she said.
