Joaquin De Losada

Talk Title: Loud and Clear: Improving Accessibility for Low Vision Players in Cosmonious High

In effect: How to make VR games for low-vision players

Year of Talk: 2024


When starting to design Cosmonious High, the team wanted to allow as many people with vision loss as possible to enjoy the game.


The team first needed to determine how to communicate information to the player through means other than visuals. Most VR games rely heavily on players seeing what is around them and what they are holding.


Communicating information to visually impaired players:

An initial solution was Text-to-Speech (TTS), a tool already familiar from phones and from menu narration in some games. It was implemented using the Accessibility Object Model (AOM), which is normally used to make websites more accessible.


The first interactions the team hooked up to TTS included grabbing and dropping items, pointing at items, and hovering over UI elements. This gave them a baseline covering the most common tasks players perform during gameplay and need the most help with. Shortly after, the team realized that many items that are not interactable would also need to be hooked up to the TTS system so players could know what is around them.
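
As a rough illustration, the hookup might look something like the sketch below. The callback names and the speak() stub are assumptions for illustration, not Owlchemy's actual code.

```python
# Hypothetical sketch: routing interaction events to a TTS backend.
# The callback names and speak() stub are assumptions, not the
# actual Cosmonious High implementation.

def speak(text: str) -> None:
    """Stand-in for a platform TTS request."""
    print(f"[TTS] {text}")

class TTSAnnouncer:
    def on_grab(self, item_name: str) -> None:
        speak(f"Grabbed {item_name}")

    def on_drop(self, item_name: str) -> None:
        speak(f"Dropped {item_name}")

    def on_point(self, item_name: str) -> None:
        # Pointing also covers non-interactable scenery, so players
        # can learn what is around them, not just what they can grab.
        speak(item_name)

    def on_ui_hover(self, label: str) -> None:
        speak(f"Button: {label}")

announcer = TTSAnnouncer()
announcer.on_grab("paint sprayer")
announcer.on_ui_hover("Settings")
```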


The next tasks were tutorials (especially pop-up tutorials) and location descriptions to help players first navigate the world space. Many tutorials showed no text at all, only an image with a basic animation, which had to be translated into speech somehow. For both, many new lines had to be written to explain the tasks, along with descriptions of the key areas around the player before and after teleporting.
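
A minimal sketch of the location-description idea, assuming a simple zone-to-text mapping; the zone names and lines are invented:

```python
# Hypothetical sketch: area descriptions keyed to map zones and
# announced around a teleport. Zone names and lines are invented.

AREA_DESCRIPTIONS = {
    "lobby": "The school lobby. Reception desk ahead, hallway to the left.",
    "art_room": "The art room. Paint station on your right, easels ahead.",
}

def on_teleport(zone_before: str, zone_after: str, speak) -> None:
    # Announce the zone being left and describe the arrival zone,
    # so players can re-orient without visual landmarks.
    if zone_before != zone_after:
        speak(f"Leaving the {zone_before.replace('_', ' ')}.")
    speak(AREA_DESCRIPTIONS.get(zone_after, "An undescribed area."))

on_teleport("lobby", "art_room", print)
```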


When testing the location descriptions with external testers, the devs realized the testers had no experience with the map layout, forcing them to rewrite the lines with much richer descriptions and to add many more markers around the map.


Problem: Finding the objects

Since the game required a lot of interaction with items, the team needed ways to communicate which items could be interacted with and which couldn't.


Conveying the same information through multiple channels drastically improved the likelihood that players would understand what they could interact with.


A feature unique to 3D games is auditory feedback, specifically spatial audio, where sounds come from the locations of objects instead of sitting in the general ambiance. Many people can use audio to determine the general location and direction of an item and start approaching it.
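
The talk doesn't include implementation details, but the intuition behind the cue can be sketched as direction-based panning plus distance attenuation; real engines and VR runtimes do full HRTF spatialization, so treat this as the idea only:

```python
# Hypothetical sketch of a spatial audio cue: derive left/right
# gains from the object's direction so a beacon sound appears to
# come from the object. Values and formulas are illustrative.
import math

def stereo_gains(listener_pos, listener_yaw, object_pos):
    dx = object_pos[0] - listener_pos[0]
    dz = object_pos[1] - listener_pos[1]
    # Angle of the object relative to where the listener is facing.
    azimuth = math.atan2(dx, dz) - listener_yaw
    pan = math.sin(azimuth)              # -1 = hard left, +1 = hard right
    distance = math.hypot(dx, dz)
    attenuation = 1.0 / (1.0 + distance)  # quieter when farther away
    # Equal-power pan law splits the signal between the two ears.
    left = attenuation * math.cos((pan + 1) * math.pi / 4)
    right = attenuation * math.sin((pan + 1) * math.pi / 4)
    return left, right

# Object two meters to the player's right: right channel dominates.
print(stereo_gains((0.0, 0.0), 0.0, (2.0, 0.0)))
```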


VR denies users the ability to feel the texture, size, and weight of items, so the devs used the controllers' haptic feedback to tell players whether an item could be interacted with. Combined with TTS, this drastically helps players understand what they have in their hands or are about to grab. Haptic feedback when the player tried to leave a room they couldn't also gave them a better sense of how big the room was.
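
A hedged sketch of the haptics idea, with pulse() standing in for the VR runtime's vibration call and purely illustrative amplitudes:

```python
# Hypothetical sketch: haptic cues for interactability and room
# bounds. pulse() is a stand-in for the runtime's haptic output;
# the amplitudes and durations are invented.

def pulse(controller: str, amplitude: float, duration_s: float) -> None:
    print(f"{controller}: amplitude={amplitude}, duration={duration_s}s")

def on_hover(controller: str, is_interactable: bool) -> None:
    # A light tick for interactable items and nothing for scenery,
    # so the hand can "feel" what can be grabbed.
    if is_interactable:
        pulse(controller, amplitude=0.3, duration_s=0.05)

def on_hit_room_boundary(controller: str) -> None:
    # A firmer buzz when reaching a wall the player can't pass,
    # which also conveys the size of the room over time.
    pulse(controller, amplitude=0.8, duration_s=0.15)

on_hover("right", is_interactable=True)
on_hit_room_boundary("right")
```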


Another important feature was improved visual feedback, especially high-contrast colors. Since the goal was to help people with a range of visual impairments (including color blindness), the team highlighted many items to distinguish them from background objects.
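
The talk doesn't specify how contrast was validated; one standard way is the WCAG contrast-ratio formula, sketched here as a general technique rather than Owlchemy's method:

```python
# Sketch: checking highlight/background contrast with the WCAG
# contrast-ratio formula. This is a common validation technique,
# not confirmed to be what the Cosmonious High team used.

def relative_luminance(rgb):
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb_a, rgb_b):
    la, lb = relative_luminance(rgb_a), relative_luminance(rgb_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

# A yellow highlight on a dark purple wall: well above the 4.5:1
# threshold commonly used for readable contrast.
print(round(contrast_ratio((255, 230, 0), (40, 20, 60)), 1))
```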


Problem: Constant, Unintentional “Assistance”

An unintended issue was that every item in the world could easily trigger TTS. This was annoying and disorienting, since players could have multiple TTS lines overlapping at once, and there was no way to control what played or how many tracks played simultaneously.


The solution was a system of toggles that let players decide how much help they needed, plus a way to trigger and cancel TTS. The team also lowered the game-world audio while TTS was playing, which tells players what to concentrate on while still letting them hear their surroundings.
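
A minimal sketch of such a system, assuming a per-frame update loop; the class, numbers, and example lines are invented to show the queueing, cancel, and ducking behavior:

```python
# Hypothetical sketch: serialize TTS so lines never overlap, offer a
# cancel, and duck world audio while speech plays. The structure is
# an assumption; the talk describes the behavior, not the code.
from collections import deque

class WorldAudio:
    """Stand-in for the game's mixer; only tracks a volume level."""
    def set_volume(self, v: float) -> None:
        print(f"world volume -> {v}")

class TTSQueue:
    def __init__(self, world_audio: WorldAudio):
        self.queue = deque()
        self.world_audio = world_audio

    def request(self, text: str, interrupt: bool = False) -> None:
        # A player-triggered cancel clears anything still pending.
        if interrupt:
            self.queue.clear()
        self.queue.append(text)

    def update(self, speak) -> None:
        # Called once per frame: play at most one line at a time and
        # duck (not mute) world audio so speech is clearly the focus
        # while the surroundings stay audible.
        if self.queue:
            self.world_audio.set_volume(0.3)
            speak(self.queue.popleft())
        else:
            self.world_audio.set_volume(1.0)

tts = TTSQueue(WorldAudio())
tts.request("Grabbed paint sprayer")
tts.request("Entering the art room", interrupt=True)  # cancels the first line
tts.update(print)
```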


Problem: Inconsistent Value of Information

Although many players enjoyed TTS and the other features that helped them understand the world, some didn't use them because they didn't provide enough value. Players sometimes found it hard to know which items' descriptions they actually needed and didn't want to spend time listening to every item. A player who got distracted while TTS was playing might also have to replay it, spending extra time on information they had already heard.


Clarifying the information and making it more concise let players get it quickly and move on with the game. The information was also rearranged so the most important thing came first, letting players hear it and act on it immediately. Standardizing the text so the key details were read up front let the devs guarantee the important info always came first.


Players were also given an option to strip the comedic text from descriptions, which cleared up confusion for many of them: much of the flavor text wouldn't make sense to players who hadn't yet picked up the game's tone.
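
A sketch of what a standardized, importance-first description with a flavor toggle might look like; the fields and example lines are invented:

```python
# Hypothetical sketch: standardized descriptions that put the most
# important information first, with optional flavor text. The field
# names and example strings are invented for illustration.

def describe(name: str, purpose: str, flavor: str = "",
             include_flavor: bool = True) -> str:
    # Name and purpose lead, so players can act as soon as they hear
    # them; the joke (if any) comes last and can be disabled.
    parts = [name, purpose]
    if include_flavor and flavor:
        parts.append(flavor)
    return ". ".join(parts) + "."

line = describe("Water sprayer", "Fills objects with water",
                "Slightly judgmental about your aim",
                include_flavor=False)
print(line)  # "Water sprayer. Fills objects with water."
```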


Problem: Accessing Assistance Required Assistance

A major problem the team encountered was that players needed assistance just to activate the assistance, defeating part of its purpose. Early on the team used a hotkey to activate it, but that normally wouldn't work on consoles or standalone VR systems.


One idea the team used relies on a sense called proprioception, one's ability to tell where their body parts are relative to one another. Players could be asked to perform a specific physical gesture to turn assistance on, and making it a double-tap meant people wouldn't trigger it by accident during normal play. In the game, the devs tracked the approximate location of the player's ear, and double-tapping in that area activates the assistance.
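
A sketch of how the double-tap-by-the-ear gesture could be detected, assuming head and controller positions from the tracking system; the offsets and timings are invented:

```python
# Hypothetical sketch: detect a double tap near the player's ear.
# Check whether the controller is within a small radius of the
# estimated ear position, and whether two such taps land within a
# short window. All constants are illustrative assumptions.
import math

EAR_OFFSET = (0.08, -0.05, 0.0)  # right of and below head center, meters
TAP_RADIUS = 0.12                # how close counts as "at the ear"
DOUBLE_TAP_WINDOW = 0.6          # max seconds between the two taps

class EarTapDetector:
    def __init__(self):
        self.last_tap_time = None

    def on_tap(self, time_s, head_pos, controller_pos) -> bool:
        ear = tuple(h + o for h, o in zip(head_pos, EAR_OFFSET))
        if math.dist(controller_pos, ear) > TAP_RADIUS:
            return False
        if (self.last_tap_time is not None
                and time_s - self.last_tap_time <= DOUBLE_TAP_WINDOW):
            self.last_tap_time = None
            return True              # double tap: toggle assistance
        self.last_tap_time = time_s
        return False

det = EarTapDetector()
det.on_tap(0.0, (0, 1.6, 0), (0.07, 1.56, 0.02))
print(det.on_tap(0.4, (0, 1.6, 0), (0.07, 1.56, 0.02)))  # True
```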


People at the fully blind end of the spectrum would still need help setting up the VR headset and launching the game, but these features still drastically improved the game's accessibility for many people who normally have challenges with vision.


According to data collected on the Quest 2, there were about 9.6 million TTS requests between March 14, 2023, and September 25, 2023. That span is roughly 6.4 months, which works out to about 1.5 million requests per month, so the feature can be considered a success.

