Colour Ball Throwers

Colour Ball Throwers is a VR game for Meta Quest 2 that uses voice recognition for object manipulation. Players speak colour commands to recolour throwable spheres, then solve puzzles by throwing them at matching doors. The project emphasises immersive voice interaction, informed by player feedback and human-computer interaction principles, while exploring the potential of voice-first game design in VR.

Exploring interaction in VR using voice-controlled object manipulation

Overview

Colour Ball Throwers is a VR game created in Unity for the Meta Quest 2 that experiments with voice recognition and colour-based object interaction. Using Wit.AI for natural language processing, the game lets players speak colour commands to change the colour of throwable spheres and interact with the world, for example matching a ball's colour to a door in order to pass through it.

Initially imagined as a spellcasting simulator, the game evolved into a more playful and engaging experience focused on throwing objects and solving simple logic puzzles through voice interaction. This project explores how natural language input and VR hand-tracking can combine to create novel modes of immersion and accessibility.

Design Philosophy

Inspired by commercial VR experiences such as Waltz of the Wizard and Half-Life: Alyx, the game began as a voice-powered fantasy sandbox. Scope constraints and player feedback, however, led to a focused pivot, emphasising tactile throwing, speech-triggered colour changes, and environmental interaction.

Design decisions were informed by human-computer interaction principles, including iterative testing, accessible voice input, and embodied interaction. Player feedback guided key UX improvements, such as snap turning and teleportation to reduce motion sickness.

Key Gameplay Features

Voice-Activated Colour Commands

  • Uses Wit.AI to detect voice input and recognise colour and shape combinations
  • Example: “Turn the sphere red” changes the sphere’s colour to red
  • Supports synonyms and flexible phrasing (for example, “make it blue”, “red ball”)
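Once a command is recognised, the colour name has to be applied to the target object. The following is a minimal, language-agnostic sketch of that step (the project itself does this in Unity C# by setting a material colour; the dictionary and function names here are illustrative, not taken from the game's scripts):

```python
# Illustrative lookup from canonical colour names to RGB values.
COLOUR_RGB = {
    "red":    (1.0, 0.0, 0.0),
    "blue":   (0.0, 0.0, 1.0),
    "yellow": (1.0, 1.0, 0.0),
}

# A plain dict stands in for the throwable sphere.
sphere = {"colour_name": "white", "rgb": (1.0, 1.0, 1.0)}

def apply_colour_command(obj, colour_name):
    """Recolour obj if the requested colour is known; otherwise ignore it."""
    if colour_name in COLOUR_RGB:
        obj["colour_name"] = colour_name
        obj["rgb"] = COLOUR_RGB[colour_name]

apply_colour_command(sphere, "red")   # e.g. after hearing "Turn the sphere red"
print(sphere["colour_name"])  # red
```

Ignoring unknown colours, rather than raising an error, mirrors the game's forgiving approach to voice input.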

Throwing-Based Puzzle Solving

  • Players pick up and throw coloured spheres at matching doors
  • If a ball’s colour matches the door, the door deactivates
  • Colour mismatch has no effect, encouraging experimentation
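The match rule above can be sketched in a few lines. This is a hypothetical illustration of the logic, not the project's Unity collision code; the `Door` class and `on_ball_hit` handler are invented names:

```python
from dataclasses import dataclass

@dataclass
class Door:
    colour: str
    active: bool = True

def on_ball_hit(door: Door, ball_colour: str) -> None:
    # A matching colour deactivates the door; a mismatch does nothing,
    # leaving the player free to experiment with other colours.
    if ball_colour == door.colour:
        door.active = False

door = Door(colour="red")
on_ball_hit(door, "blue")
print(door.active)  # True  (mismatch: no effect)
on_ball_hit(door, "red")
print(door.active)  # False (match: door deactivates)
```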

Voice Entity Recognition

  • Wit.AI trained on custom utterances to distinguish:
    • Colour (red, blue, yellow, etc.)
    • Shape (ball, sphere)
  • Dynamic runtime interpretation via Unity scripts
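As a rough sketch of the runtime interpretation step, the snippet below pulls the colour and shape entities out of a Wit.AI-style response. The JSON payload is hand-written to resemble Wit.AI's `entity:role` keying, not a captured response, and `first_entity` is a hypothetical helper (the game performs the equivalent in C#):

```python
import json

# Hand-written example payload in the shape of a Wit.AI /message response,
# where entities are keyed as "name:role".
response = json.loads("""
{
  "text": "turn the ball blue",
  "entities": {
    "colour:colour": [{"value": "blue", "confidence": 0.97}],
    "shape:shape":  [{"value": "ball", "confidence": 0.95}]
  }
}
""")

def first_entity(resp, name, threshold=0.8):
    """Return the top-ranked value for an entity if its confidence clears the threshold."""
    matches = resp.get("entities", {}).get(f"{name}:{name}", [])
    if matches and matches[0]["confidence"] >= threshold:
        return matches[0]["value"]
    return None

print(first_entity(response, "colour"), first_entity(response, "shape"))  # blue ball
```

A confidence threshold like this is one way to reject low-quality recognitions rather than acting on a misheard command.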

VR-Specific UX

  • Designed for Meta Quest 2
  • Implements snap turning and teleportation to mitigate motion sickness
  • Explorable environment with interactable objects
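Snap turning replaces continuous thumbstick rotation (a common motion-sickness trigger) with discrete yaw jumps. The sketch below illustrates the general technique in plain Python; the angle, deadzone, and function name are illustrative assumptions rather than values from the project:

```python
SNAP_ANGLE = 45.0   # degrees rotated per snap (illustrative value)
DEADZONE = 0.7      # thumbstick deflection required before a snap fires

def snap_turn(current_yaw: float, stick_x: float, armed: bool):
    """Return (new_yaw, armed).

    'armed' latches the input so holding the stick produces one snap,
    not a continuous spin; it re-arms when the stick returns to centre.
    """
    if armed and abs(stick_x) >= DEADZONE:
        direction = 1 if stick_x > 0 else -1
        return (current_yaw + SNAP_ANGLE * direction) % 360, False
    if abs(stick_x) < DEADZONE:
        armed = True
    return current_yaw, armed

yaw, armed = snap_turn(0.0, 0.9, True)   # stick pushed right: snap to 45°
yaw, armed = snap_turn(yaw, 0.9, armed)  # stick still held: no repeat snap
print(yaw)  # 45.0
```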

Technical Highlights

  • Built using Unity and the Meta XR All-in-One SDK for VR
  • Integrated the Wit.AI voice API via the App Voice Experience component
  • Modular architecture for colour-changing objects and dynamic doors
  • Custom scripts for intent matching, collision detection, and RGB value comparison
  • Visual feedback when utterances succeed or fail
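The RGB comparison mentioned above can be sketched as a per-channel tolerance check, which is more robust than exact float equality when colours come from materials or lighting. The tolerance value and function name here are illustrative, not taken from the project's scripts:

```python
def colours_match(a, b, tolerance=0.01):
    """a and b are (r, g, b) tuples with channels in [0, 1].

    Two colours match when every channel differs by at most `tolerance`.
    """
    return all(abs(ca - cb) <= tolerance for ca, cb in zip(a, b))

RED = (1.0, 0.0, 0.0)
print(colours_match((0.999, 0.0, 0.0), RED))  # True  (within tolerance)
print(colours_match((0.0, 0.0, 1.0), RED))    # False (different colour)
```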

What I Learned

This project taught me the challenges and potential of voice-first game design, especially in VR. I learned how to build a working voice recognition pipeline using external APIs and train those systems to recognise variable player input. Additionally, I gained experience in managing project scope, user-centred iteration, and reconciling technical limitations with creative goals.