Smart speakers could play a crucial role in stopping domestic violence by detecting screams and shouts and providing real-time information about what is happening inside homes, according to new research.
One in four Australian homes has a smart speaker with artificial intelligence (AI), such as an Amazon Echo, Apple HomePod, or Google Nest.
Monash University researcher Robert Sparrow noted that Google engineers are considering a future where smart speakers, equipped with infrared detectors, microphones, and cameras, could gather detailed information about a home and use it to predict domestic violence.
“Gunshots, screams, shouting, crying, crashing and thumping noises, or particular combinations thereof, might be taken to indicate that a violent incident is in progress,” he said.
“We suspect that detecting a significant percentage of assaults-in-progress will be well within the capability of smart speaker systems in the not-too-distant future.”
He said a smart speaker, upon detecting violence, could alert the police and social services, warn the victim of potential danger, and provide information on options and support.
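To illustrate the kind of sound-event detection being described, the sketch below uses Google’s openly released YAMNet audio classifier (trained on the AudioSet dataset) to flag violence-associated sounds in an audio clip. The chosen class list, the confidence threshold, and the escalation hook mentioned at the end are illustrative assumptions only, not details from the research.

```python
# Minimal sketch of violence-associated sound detection, assuming a
# pretrained audio event classifier (Google's open-source YAMNet model).
import csv

import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

# Load the pretrained classifier from TensorFlow Hub.
model = hub.load("https://tfhub.dev/google/yamnet/1")

# Read the human-readable class names that ship with the model.
class_map_path = model.class_map_path().numpy().decode("utf-8")
with tf.io.gfile.GFile(class_map_path) as f:
    class_names = [row["display_name"] for row in csv.DictReader(f)]

# AudioSet classes loosely matching the sounds quoted in the article;
# a real system would need to choose and validate these carefully.
ALERT_CLASSES = {"Screaming", "Shout", "Crying, sobbing",
                 "Gunshot, gunfire", "Smash, crash"}
THRESHOLD = 0.3  # assumed confidence cut-off, untuned

def flag_violent_sounds(waveform: np.ndarray) -> list[str]:
    """waveform: mono float32 audio at 16 kHz, values in [-1.0, 1.0]."""
    scores, _embeddings, _spectrogram = model(waveform)
    mean_scores = scores.numpy().mean(axis=0)  # average over time frames
    return [name for name, score in zip(class_names, mean_scores)
            if name in ALERT_CLASSES and score >= THRESHOLD]

# Hypothetical usage: if flag_violent_sounds(clip) returns any classes,
# the device could call an (assumed) notify_services() escalation hook.
```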
Notably, he added that smart speakers could also be trained to recognise emotional abuse.
“It would be possible for smart speakers to detect patterns of household behaviour that constitute ‘coercive control’ or are highly gendered to the point of gender injustice: there is compelling evidence that the former can be associated with fatal violence,” he explained.
Further, he said that smart speaker data, including recordings, could be used as evidence to obtain restraining orders or pursue criminal charges.
It comes as one in five Australian adults (4.2 million people) has experienced partner violence, emotional abuse, or financial abuse, according to the Australian Bureau of Statistics.
Women are more likely than men to have experienced partner abuse since the age of 15.
ABS Head of Crime and Justice Will Milne stated that just over one in four women and one in seven men experienced partner violence or abuse.
Good Intentions May Turn Into More Control
However, Mr. Sparrow cautioned that relying on AI diminishes political responsibility to address violence and shifts the onus onto women to manage their own safety.
“Developing smart speakers to detect intimate partner violence could represent a privatisation of policy responses towards intimate partner violence,” he said.
Further, he pointed out that abusive partners could seize control of the devices, making it unlikely women would rely on them for protection.
He added that any technology designed to detect domestic violence would need input from survivors, whose lived experience offers valuable insight.
“If it is judged that the moral urgency of intimate partner violence justifies exploring what might be possible by developing this technology, then it will be imperative that victim-survivors from a range of demographics ... are engaged in shaping this technology and the legislation and policies needed to regulate it,” he said.
Smart speakers are also not a silver bullet for tackling intimate partner violence; they need “to exist alongside initiatives that address the socio-economic structures that drive violence against women,” Mr. Sparrow said.
However, criminology lecturer Robin Fitzgerald noted that increasing surveillance of offenders also increases surveillance of victims.
“Victims do not always want police to intervene in their lives. In some cases, this form of proactive policing might feel like an extension of control rather than help,” she said.
“What happens when police visit and discover a high-risk perpetrator and victim are living together again?
“Victims may fear child protection authorities will get involved and feel obliged to cover up the fact they are still with the perpetrator. And once a victim has been pressured to lie, they may be reluctant to call the police the next time they do need police intervention.”
Queensland Police Service representative Ben Martain said police have considered reevaluating their involvement in domestic violence situations, suggesting social workers could instead play a more significant role.
He clarified that police cannot charge a person they door-knock over a suspected future offence flagged by AI.
Concern Over Rising Tech-Based Abuse
Meanwhile, eSafety Commissioner Julie Inman Grant said technology had enabled new forms of abuse, including barrages of harassing texts, GPS tracking, spyware installed on iPads given to children, and abusive messages embedded in the transaction references of online child support payments.
“At eSafety, we’ve seen things like manipulation of home thermostats, lighting and smart TV systems, drones monitoring safe houses, and cars programmed to stall the moment they drive one kilometre beyond their home,” she warned.
She argued addressing misogyny was key to tackling abuse.
“Exercised through technology, these kinds of attacks are often deeply rooted in misogyny and driven by the same behaviours and attitudes that are fuelling gendered online abuse and a rise in threatening and sexualised attacks on women and girls, particularly those in public life who risk being the victim of deliberately misleading slurs or information,” she said.
eSafety added that digital disruptor tools, anti-harassment software, perpetrator intervention schemes, and a major national awareness campaign are among the projects aimed at preventing abuse.
1800RESPECT 1800 737 732
Lifeline 13 11 14
Isabella Rayner is a reporter based in Melbourne, Australia. She is an author and editor for WellBeing, WILD, and EatWell Magazines.