Hard to use your fingers? Hard to use certain limbs? Have the urge to play Mario and other real-time games instead of slow, boring, turn-based RPGs? Then the Assistive Input Controller is for you! It's a customizable input controller driven by body movement and sound, built for a small platforming game created from scratch in Unity. Setup is relatively simple, and presets can be saved to swap between body-part gestures. The hardware includes a Microsoft Kinect and audio input devices such as a contact microphone and a clip-on microphone. We used Unity and Max, with UDP and OSC protocols, to let the Kinect and two computers communicate over the local network. The character can be controlled with any body part (a secondary body part serves as a position reference). Other integrated input methods include yelling into the clip-on microphone, tapping the contact mic like a button, and making a fist with one hand.
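To give a sense of how the Unity side can pick up control data sent from Max over the local network, here is a minimal sketch of a UDP receiver. This is not the project's actual code: the port number and the simple comma-separated message format are assumptions, and a real setup would more likely parse full OSC messages with a library such as extOSC or OscJack.

```csharp
// Minimal sketch of a Unity-side UDP receiver for gesture/sound control data.
// Assumption: Max sends "axisX,axisY,jump" strings over UDP on port 9000.
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

public class GestureInputReceiver : MonoBehaviour
{
    const int Port = 9000;              // assumed port; must match the Max patch
    UdpClient client;
    Thread listenThread;
    volatile float axisX, axisY;        // latest gesture-derived axes
    volatile bool jumpPressed;          // latest "tap" / "yell" trigger

    void Start()
    {
        client = new UdpClient(Port);
        listenThread = new Thread(Listen) { IsBackground = true };
        listenThread.Start();
    }

    void Listen()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] data;
            try { data = client.Receive(ref remote); }   // blocking read
            catch (SocketException) { break; }           // socket closed on shutdown

            string[] parts = Encoding.ASCII.GetString(data).Split(',');
            if (parts.Length >= 3)
            {
                if (float.TryParse(parts[0], out float x)) axisX = x;
                if (float.TryParse(parts[1], out float y)) axisY = y;
                jumpPressed = parts[2].Trim() == "1";
            }
        }
    }

    void Update()
    {
        // Feed the decoded values into the character controller here, e.g.
        // transform.Translate(axisX * speed * Time.deltaTime, 0f, 0f);
    }

    void OnDestroy()
    {
        client?.Close();                // unblocks Receive so the thread exits
    }
}
```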
Some of the challenges we overcame during the hackathon were figuring out how to develop a game in Unity, getting UDP and OSC communication working between Unity and Max, and using skeleton-tracking software with the Kinect and turning its output into meaningful control data.
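One way to turn raw skeleton-tracking output into control data is to measure a chosen body part's offset from a reference body part and normalize it into a joystick-style axis. The sketch below illustrates that idea only; the joint choices, range, and dead-zone values are assumptions rather than the mapping we actually used.

```csharp
// Hedged illustration: map a tracked joint's offset from a reference joint
// into a -1..1 axis with a dead zone to suppress jitter.
using UnityEngine;

public static class GestureMapping
{
    // primary:   tracked position of the chosen body part (e.g. a hand)
    // reference: position of the reference body part (e.g. the shoulder)
    // range:     distance (meters) that maps to full deflection (assumed)
    // deadZone:  ignore movements smaller than this (assumed)
    public static float OffsetToAxis(Vector3 primary, Vector3 reference,
                                     float range = 0.3f, float deadZone = 0.05f)
    {
        float offset = primary.x - reference.x;          // horizontal offset
        if (Mathf.Abs(offset) < deadZone) return 0f;     // inside dead zone
        return Mathf.Clamp(offset / range, -1f, 1f);     // normalized axis
    }
}
```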
We'd love to continue developing this tool for a broader audience, so input can be customized to be more accessible to specific groups of people. We'd also love to integrate our control scheme with more conventional control schemes, such as mouse and keyboard input.