Harmonic Hustle

Demo: YouTube

Goals

  • Six-member team project for CPSC 427: Video Game Programming.
  • Build a game from scratch (i.e. without a game engine) to better understand what game engines handle behind the scenes.
  • Participate in the complete design of a project from start to finish, with full creative control.
  • Simulate the software development life cycle by deploying a working version of the game at each of four biweekly milestones (M1, M2, M3, M4).

Features

Particle system

Description

Rhythm games often have very flashy visuals to make the gameplay experience more satisfying. As soon as I learned what particle systems were, I became obsessed with including particles in our game. LearnOpenGL provided a solid foundation for implementing trail particles, which follow notes as they travel down the screen. From there, I prepared the system for extension by factoring out shared functionality, and used instanced rendering to minimize the impact on framerate (FPS). I then implemented more particle types, ultimately ending up with flames, sparks, and puffs of smoke flying across the screen.
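
To illustrate the shared-base / derived-behaviour split described here (and in the roadmap below), here is a minimal C++ sketch. The class names, the GLM vector types, and the specific motion rules are assumptions for illustration, not our actual code.

```cpp
// Minimal sketch: shared particle bookkeeping in a base class,
// with only the per-type behaviour overridden in derived classes.
#include <glm/glm.hpp>
#include <vector>

struct Particle {
    glm::vec2 position{0.f};
    glm::vec2 velocity{0.f};
    glm::vec4 color{1.f};
    float life = 0.f; // seconds remaining; <= 0 means the slot is free
};

class ParticleSystem {
public:
    virtual ~ParticleSystem() = default;

    // Shared bookkeeping: age every live particle, then apply type-specific motion.
    void step(float elapsed_ms) {
        for (Particle& p : particles) {
            if (p.life <= 0.f) continue;
            p.life -= elapsed_ms / 1000.f;
            updateParticle(p, elapsed_ms);
        }
    }

protected:
    // Derived types override only the behaviour, not the bookkeeping.
    virtual void updateParticle(Particle& p, float elapsed_ms) = 0;

    std::vector<Particle> particles;
};

// Trail particles follow their velocity and fade out behind a note.
class TrailParticles : public ParticleSystem {
protected:
    void updateParticle(Particle& p, float elapsed_ms) override {
        p.position += p.velocity * (elapsed_ms / 1000.f);
        p.color.a = p.life; // fade as remaining life runs down
    }
};

// Smoke puffs drift upwards instead of falling down.
class SmokeParticles : public ParticleSystem {
protected:
    void updateParticle(Particle& p, float elapsed_ms) override {
        p.position.y -= 20.f * (elapsed_ms / 1000.f); // screen-space "up"
        p.color.a = 0.5f * p.life;
    }
};
```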

Roadmap
  • M3: Learned about particle systems and ported code for 'trail' particles from LearnOpenGL as a starting point.

    • The solution was functional, but would not be easily extensible in its current state.
  • M3: Applied the instanced rendering technique to improve performance and enable more particles in the future.

    • In a nutshell: instanced rendering updates the data (color, position, etc.) for all particles first, then draws them all at once (see the sketch after this list).
    • Reduced thousands of draw calls per frame to a single-digit number of calls.
  • M3: Split particle functionality into base and derived classes, allowing the system to easily support more particle types.

    • Derived classes only need to override particle behaviour (e.g. particles that drift upwards instead of falling down).
    • Wrote documentation outlining the steps to add more particle types.
    • Implemented an additional particle type to verify and refine documentation.
  • M4: Pushed the limits of the system by implementing two additional particle types.

    • Refactored variables to allow a higher or lower maximum number of particles per type.
    • Polished particle type behaviours: position, trajectory, and fade-out duration.
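
For reference, here is a minimal sketch of the instanced-rendering idea from M3: all per-particle data is uploaded once, then a single instanced draw call renders every particle. The function, buffer, and loader names (glad, quad_vao, instance_vbo, ParticleInstance) are assumptions for illustration, and the VAO is assumed to already have its instance attributes configured with glVertexAttribDivisor.

```cpp
// Minimal sketch of instanced particle rendering (hypothetical names).
// Assumes instance_vbo was pre-allocated with glBufferData to hold the
// maximum particle count, and quad_vao describes a unit quad plus
// per-instance attributes.
#include <glad/glad.h>
#include <glm/glm.hpp>
#include <vector>

struct ParticleInstance {
    glm::vec2 position;
    glm::vec4 color;
    float scale;
};

void drawParticles(GLuint quad_vao, GLuint instance_vbo,
                   const std::vector<ParticleInstance>& instances) {
    if (instances.empty()) return;

    // 1) Update the data for all live particles in a single upload.
    glBindBuffer(GL_ARRAY_BUFFER, instance_vbo);
    glBufferSubData(GL_ARRAY_BUFFER, 0,
                    instances.size() * sizeof(ParticleInstance),
                    instances.data());

    // 2) Draw every particle with one call instead of thousands.
    glBindVertexArray(quad_vao);
    glDrawArraysInstanced(GL_TRIANGLE_STRIP, 0, 4,
                          static_cast<GLsizei>(instances.size()));
    glBindVertexArray(0);
}
```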

Audio-visual synchronization

Description

A rhythm game requires precise timing between audio and visuals so that pressing the correct keys at the correct time feels intuitive and satisfying to the player. By leveraging my knowledge of music arithmetic, then researching and iterating over consecutive milestones, I arrived at a reasonably accurate and scalable solution that allows players to achieve perfect scores (without trivializing the scoring system entirely).
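
The music arithmetic involved boils down to converting beats into seconds via the track's tempo. A minimal sketch, with hypothetical helper names and example values:

```cpp
// At 120 BPM, one beat lasts 60 / 120 = 0.5 s, so a note placed on beat 8
// should reach the judgment line 4.0 s into the song.
float secondsPerBeat(float bpm) { return 60.f / bpm; }
float noteTimeSeconds(float beat, float bpm) { return beat * secondsPerBeat(bpm); }
```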

Roadmap
  • M2: Simple arithmetic using delta-time, i.e. accumulating how much real time has passed using standard library timing functions.

    • Synchronization was playable, but inaccuracies in timing would accumulate over time.
    • The solution would scale poorly as the audio length increases.
  • M3: Used linear interpolation to improve consistency of note positions.

    • Synchronization improved visually, but players felt that their score unfairly mismatched their performance.
    • Even though visuals matched audio better, gameplay still didn’t “click.”
  • M4: Applied the Conductor technique to correct inaccuracies from using delta-time (see the sketch after this list).

    • Updated SDL/SDL_mixer to newer versions to gain access to Mix_GetMusicPosition.
    • Querying the 'true' audio playback position directly improved long-term synchronization drastically.
    • The solution scales very well regardless of the audio length.
    • Combined with more forgiving scoring adjustments, players were finally able to achieve perfect scores.
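
Here is a minimal sketch of the Conductor approach, assuming SDL_mixer >= 2.6.0 for Mix_GetMusicPosition; the struct, field names, and constants are placeholders for illustration rather than our actual code.

```cpp
// Minimal sketch of a Conductor (hypothetical names and constants).
#include <SDL_mixer.h>

struct Conductor {
    Mix_Music* music = nullptr;
    float bpm = 120.f;            // assumed tempo of the current track
    float spawn_lead_beats = 4.f; // beats between a note spawning and its hit time

    // Query the 'true' playback position instead of accumulating delta-time,
    // so timing error cannot build up over a long song.
    double songPositionSeconds() const {
        return Mix_GetMusicPosition(music);
    }

    double songPositionBeats() const {
        return songPositionSeconds() * (bpm / 60.0);
    }

    // 0 when the note spawns, 1 when it should be hit at the judgment line;
    // used to interpolate the note's on-screen position between those points.
    float noteProgress(float note_beat) const {
        double beats_until_hit = note_beat - songPositionBeats();
        return 1.f - static_cast<float>(beats_until_hit / spawn_lead_beats);
    }
};
```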

Original music & SFX

  • Created exclusively original sound assets for the game!