Exposure to virtual reality (VR) or augmented reality (AR) environments can lead to cybersickness, a form of motion sickness that causes dizziness and nausea. Current efforts to alleviate the symptoms often rely on a one-size-fits-all approach.
Khaza Anuarul Hoque, an assistant professor at the University of Missouri, and his team of researchers aim to develop a personalised method for detecting cybersickness, one that addresses its root causes, which can vary from person to person.
“Cybersickness is not generic. For instance, one simulation could trigger cybersickness in me while the same simulation may not cause cybersickness for someone else,” said Hoque.
“One of the problems people typically face when wearing virtual reality or augmented reality headsets is that the user experience can get bad after some time, including symptoms of nausea and vomiting,” Hoque continued, “especially if the user is immersed in a virtual environment where a lot of motion is involved. It can depend on many factors, including a person’s gender, age and experience.”
Explainable AI to the rescue
Hoque plans to leverage explainable AI as a new approach, one he believes has the potential to transform the AR and VR industry.
“Explainable AI is a great tool to help with this because typically machine learning or deep learning algorithms can tell you what the prediction and the decision may be, whereas explainable AI can also tell the user how and why the AI made the decision,” Hoque said.
“So, instead of imposing a static mitigation technique for all users, it will be more effective if we know why a particular person is developing cybersickness and give that person the right mitigation that they need. Explainable AI can help us do that without hindering the user experience.”
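The sketch below illustrates the distinction Hoque draws: a model emits a black-box risk score, and a simple leave-one-feature-out probe then attributes one user’s score to individual factors. Everything here is invented for illustration, including the synthetic data and feature names such as scene_motion and head_rotation_dps; real systems would use established explainability methods such as SHAP or LIME rather than this crude approximation.

# Minimal sketch: a black-box prediction vs. a per-user explanation.
# All feature names and data are hypothetical; this leave-one-feature-out
# probe is a simplistic stand-in for methods such as SHAP or LIME.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-session features.
X = pd.DataFrame({
    "scene_motion": rng.uniform(0, 1, n),         # how much the scene moves
    "head_rotation_dps": rng.uniform(0, 180, n),  # head rotation, deg/s
    "session_minutes": rng.uniform(1, 60, n),
    "age": rng.integers(18, 70, n).astype(float),
})
# Synthetic label: 1 = user reported cybersickness.
y = (3.0 * X["scene_motion"] + 0.01 * X["head_rotation_dps"]
     + 0.02 * X["session_minutes"] + rng.normal(0, 0.5, n)) > 2.5

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

user = X.iloc[[0]]
p = model.predict_proba(user)[0, 1]
print(f"black-box output: P(cybersickness) = {p:.2f}")

# "Why?" — replace one feature at a time with its population mean and
# see how the predicted risk shifts; big shifts mark personal triggers.
for col in X.columns:
    tweaked = user.copy()
    tweaked[col] = X[col].mean()
    delta = p - model.predict_proba(tweaked)[0, 1]
    print(f"{col:>18}: contribution ~ {delta:+.2f}")

The output pairs the opaque probability with a per-user breakdown, which is the shape of explanation that makes a targeted, personalised mitigation possible.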
Hoque observed that his students and the wider industry have been struggling with cybersickness for five to seven years, and that current detection methods mainly involve data-driven techniques such as machine learning and deep learning.
“Such approaches are often ‘black box,’ and thus, they lack explainability,” Hoque said. “I also realised that the explainability of the DL cybersickness models can significantly improve the model’s understanding and provide insight into why and how these AI models arrived at a specific decision.”
Hoque said that using explainable AI can help software developers pinpoint the crucial features for training AI to recognise cybersickness symptoms, particularly for users wearing standalone VR headsets.
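To make that feature-selection point concrete, the snippet below ranks (again hypothetical) features by permutation importance, a standard model-agnostic explainability technique available in scikit-learn: each feature is shuffled in turn, and the features whose shuffling degrades accuracy most are the ones the model actually relies on. This is an illustrative sketch over invented data, not the team’s published method.

# Sketch: ranking the features a cybersickness model relies on, using
# permutation importance. Data and feature names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(1)
n = 500
X = pd.DataFrame({
    "scene_motion": rng.uniform(0, 1, n),
    "head_rotation_dps": rng.uniform(0, 180, n),
    "session_minutes": rng.uniform(1, 60, n),
    "age": rng.integers(18, 70, n).astype(float),
})
y = (3.0 * X["scene_motion"] + 0.02 * X["session_minutes"]
     + rng.normal(0, 0.5, n)) > 2.0

model = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)

# Shuffle each feature and measure how much accuracy degrades: features
# whose shuffling hurts most are the ones the model depends on, and thus
# the signals most worth capturing on standalone headsets.
result = permutation_importance(model, X, y, n_repeats=20, random_state=1)
for idx in result.importances_mean.argsort()[::-1]:
    print(f"{X.columns[idx]:>18}: {result.importances_mean[idx]:.3f}")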
Isa Muhammad is a writer and video game journalist covering many aspects of entertainment media including the film industry. He's steadily writing his way to the sharp end of journalism and enjoys staying informed. If he's not reading, playing video games or catching up on his favourite TV series, then he's probably writing about them.