Augmented Personal Experience - Run Time Video Game Accessibility

Taylor Ripke and Tony Morelli

NOTE: This paper was selected by the program committee as a Meaningful Play 2016 Top Paper. It has been submitted to the Meaningful Play 2016 Special Issue of the International Journal of Gaming and Computer-Mediated Simulations (IJGCMS). Due to the copyright requirements of the journal, only the abstract is available in the conference proceedings.


As technology has progressed, video games have grown increasingly complex: strategy games and fast-paced first-person shooters demand close attention to fine details along with the ability to react to unpredictable situations without hesitation. These games coincided with advances in input devices that gave players more control and allowed game mechanics to demand substantial thought and memorization. However, this complexity presents significant challenges for some players, as many modern games lack the accessibility features needed for everyone to play; most offer neither subtitles in multiple languages nor in-game options for extra assistance. Nevertheless, preexisting games without built-in accessibility features can be made more accessible through responsive programming. This paper presents a system for augmenting existing video games with assistive visual and audio cues at run time. Image processing algorithms give players active feedback on the current state of the game and offer advice on how to complete a specific task, with the additional information overlaid on the existing video game presentation.
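One way such run-time augmentation could work is to detect a known on-screen element in a captured game frame via template matching and then draw a visual cue over it. The sketch below, in plain NumPy, is an illustrative assumption of that idea (the function names, the toy frame, and the sum-of-squared-differences matcher are ours, not the paper's implementation):

```python
import numpy as np

def locate_template(frame, template):
    """Naive sum-of-squared-differences template match.

    Returns the (row, col) of the best-matching top-left corner.
    """
    fh, fw = frame.shape
    th, tw = template.shape
    best, best_pos = None, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = np.sum((frame[r:r + th, c:c + tw] - template) ** 2)
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def add_cue(frame, pos, size, value=255):
    """Draw a rectangular highlight (the 'visual cue') on a copy of the frame."""
    out = frame.copy()
    r, c = pos
    th, tw = size
    out[r, c:c + tw] = value          # top edge
    out[r + th - 1, c:c + tw] = value  # bottom edge
    out[r:r + th, c] = value           # left edge
    out[r:r + th, c + tw - 1] = value  # right edge
    return out

# Hypothetical usage: find a known on-screen element and highlight it.
frame = np.zeros((20, 20), dtype=np.int64)
frame[5:8, 9:12] = 7                       # pretend this patch is a game UI element
template = np.full((3, 3), 7, dtype=np.int64)
pos = locate_template(frame, template)     # → (5, 9)
cued = add_cue(frame, pos, template.shape)
```

A real system would capture frames from the running game and use an optimized matcher (e.g. OpenCV's `cv2.matchTemplate`) rather than this exhaustive loop, but the principle of locating game state visually and annotating the presentation is the same.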