One in four Australian children aged eight to 14 experiences online bullying. Yet most victims and witnesses stay silent, compounding the impact on mental health. Now, artificial intelligence (AI) experts are teaming up with community and industry organisations to tackle the problem at every stage.
Xue Li’s youngest daughter was just 10 years old when he walked in on her typing intently on the family computer; she quickly covered the screen.
“I was a little surprised because I used to know everything my daughter was doing – I took her to school. Suddenly I felt a bit strange,” says Xue, a Professor in the School of Information Technology and Electrical Engineering at the University of Queensland.
On learning his daughter was using social media, Xue’s first concern was that she might be talking to strangers. Yet, when she told him she was chatting with her school friends, his unease remained. Xue began searching Google for information on cyberbullying.
“There were dozens of photos of children, some as young as my daughter, who had committed suicide because of cyberbullying. Most of them were bullied by their classmates, their school mates, their close friends, actually.”
Xue resolved to do something to protect children from such “bad, ugly” experiences. So he set out to develop software to detect, and ultimately prevent, cyberbullying behaviour.
To begin with, one of Xue’s PhD students compiled a data set, manually tagging what did and didn’t constitute cyberbullying and suicidal ideation. This labelled information would be used to train a machine to recognise the behaviours. But Xue needed sponsorship to continue the project, so he approached researchers at UTS to collaborate in attracting an industry partner.
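To make that first step concrete: a hand-labelled data set of this kind is typically used to train a text classifier. The sketch below shows the general idea with a simple scikit-learn pipeline; the example messages, labels and model choice are invented for illustration and are not the team’s actual data or method.

```python
# Minimal sketch of training a text classifier on hand-labelled examples.
# The messages and labels below are invented for illustration only; they are
# not the project's data set, and the real model is far more sophisticated.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each example is tagged 1 (bullying) or 0 (not bullying), mirroring the
# manual tagging a researcher does when building the training set.
messages = [
    "nobody at school likes you, just leave",
    "great goal at practice today!",
    "you're so dumb, everyone is laughing at you",
    "want to work on the maths homework together?",
]
labels = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: a common, simple baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# The trained model can then score new, unseen messages.
print(model.predict_proba(["everyone thinks you're a loser"])[:, 1])
```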
“UTS has a very strong research team in machine learning – one of the best in Australia,” says Xue. “And this is basically a machine learning, AI project.”
Fortuitously, when Xue made contact in 2014, researchers at the UTS Centre for Artificial Intelligence (CAI) were already “exploring how our research could be applied to social networks to benefit society,” says Senior Lecturer Guodong Long.
Led by CAI’s Professor Ivor Tsang, along with Xue, Guodong and Adjunct Professor Dacheng Tao, the team proposed to change existing cyberbullying prevention services from reactive keyword filtering to proactive early detection.
In 2015, they secured $550,000 through the Australian Research Council’s Linkage scheme, which supports research partnerships with business, industry and community organisations. This enabled the researchers to collaborate with the Australian Research Alliance for Children and Youth (ARACY) and the Global Business College of Australia (GBCA), bringing both additional expertise and partner funding to the project.
And the research evolved as a result. In addition to detecting cyberbullying, the team is now developing an automation tool to assist social workers and other support people to conduct effective conversations about cyberbullying with youth.
Director of Collaboration and Engagement at ARACY Kristy Noble says, “Because this pervasive digital world we live in is such a new thing, we don’t know much about how these technologies impact child and youth mental health.”
While public discourse often focuses on the negative effects of digital technologies on young people, Kristy says we need to ask, “How can we utilise them as a positive tool, such as through the delivery of online cognitive behavioural therapy?”
By leveraging ARACY’s expertise in evidence-based approaches to child health and wellbeing, the team are looking to develop practical, preventative solutions.
“Sometimes it is hard for social workers and parents to convince children not to bully or to help them if they have been bullied,” explains Guodong. “We are trying to develop a new technology that can help you rank your sentence when you say something or type something.
“This software can tell you if your words are suitable for the situation or how much impact they can have.”
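As a rough illustration of what “ranking your sentence” could look like in practice, the sketch below scores a draft message and suggests a rephrase when the score is high. The word list, weights and threshold are made up for this example; the team’s tool would rely on a trained model rather than a hand-written list.

```python
# Toy illustration of "ranking" a draft message before it is sent.
# The word weights are invented for this sketch; a real tool would use a
# trained model, not a hand-written list.
HARSH_WORDS = {"stupid": 0.8, "loser": 0.9, "ugly": 0.7, "hate": 0.6}

def impact_score(message: str) -> float:
    """Return a crude 0-1 estimate of how hurtful the message might be."""
    words = message.lower().split()
    if not words:
        return 0.0
    return min(1.0, sum(HARSH_WORDS.get(w.strip(".,!?"), 0.0) for w in words))

def feedback(message: str) -> str:
    """Tell the writer whether the draft seems suitable for the situation."""
    score = impact_score(message)
    if score >= 0.5:
        return f"This message could hurt someone (score {score:.1f}). Rephrase it?"
    return "Looks okay to send."

print(feedback("you are such a loser, I hate you"))
print(feedback("see you at training tomorrow"))
```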
Having used publicly available data to inform the detection model, the team are now grappling with the personal privacy aspect of the research. Guodong explains, “We want people to feel safe and comfortable that when they use our software their personal information will not be available to others as part of a big data set.”
Their solution was to develop a new piece of software that uses small data (accessible, easy-to-understand information) to preserve privacy. “This software can be deployed to your mobile and the computations will happen ‘within’ your phone without extracting your data. So your personal information is protected.”
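One way to picture the on-device approach: the scoring logic and a compact set of model parameters live on the phone, so the message text itself is never uploaded. Everything in the sketch below (the class name, the weights, the scores) is hypothetical; it only illustrates that the computation can stay local.

```python
# Sketch of the on-device idea: model parameters are delivered to the phone,
# scoring happens locally, and only feedback is shown to the user. The raw
# message text never leaves the device. All names here are hypothetical.

class OnDeviceScorer:
    """Runs entirely on the phone; holds a small set of model weights."""

    def __init__(self, weights: dict[str, float]):
        # In practice the weights would be shipped with an app update,
        # not hard-coded; this dictionary stands in for a compact model.
        self.weights = weights

    def score(self, message: str) -> float:
        # Local computation only: nothing in this method touches the network.
        words = message.lower().split()
        if not words:
            return 0.0
        return min(1.0, sum(self.weights.get(w, 0.0) for w in words))

# Hypothetical compact weights delivered to the device.
scorer = OnDeviceScorer({"loser": 0.9, "stupid": 0.8})
print(scorer.score("don't be such a loser"))  # scored on the device itself
```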
It’s a complex project, requiring diverse expertise in cyberbullying, machine learning and AI, mental health and youth wellbeing. Multidisciplinary, multi-sector collaboration is therefore essential.
“We live in a world that is so full of expertise that no single person or organisation can really have the ability to provide the best of every aspect,” says Kristy. “By collaborating, we bring out expertise in all areas to facilitate the best possible outcomes.”
From the researchers’ perspective, the knowledge, networks and access to children’s and parents’ views that ARACY and GBCA bring have been crucial in turning their ideas into workable solutions.
“Imagination is where it begins,” says Guodong, “but to have comments from them helps us adapt our technology for the real world.”