Developer, Spoke

Spoke is an augmented reality music visualizer for iOS and HoloLens. It lets users browse their Spotify libraries and interact with music in a new way. Spoke creates an entity that responds differently to different kinds of music, much as we do. This is accomplished through Spotify's audio features API, which returns characteristics of individual tracks, such as danceability, mood, energy, and more.
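As an illustrative sketch only (the app itself is built in Unity, not Python), the idea of driving an entity from a track's audio features could look like this. The field names follow Spotify's audio-features response schema; `VisualParams` and the specific mappings are hypothetical, not Spoke's actual code:

```python
from dataclasses import dataclass

@dataclass
class VisualParams:
    # Hypothetical parameters driving the AR entity's behavior.
    speed: float   # how fast the swarm moves
    spread: float  # how widely particles scatter
    warmth: float  # color temperature, 0 = cool, 1 = warm

def params_from_features(features: dict) -> VisualParams:
    """Map Spotify audio features (danceability, energy, valence,
    each in [0, 1]) to visual parameters for the entity."""
    return VisualParams(
        speed=0.5 + features["energy"] * 1.5,  # energetic tracks move faster
        spread=features["danceability"],       # danceable tracks scatter more
        warmth=features["valence"],            # higher valence reads as warmer
    )

# Example fragment of an audio-features response for one track
sample = {"danceability": 0.8, "energy": 0.6, "valence": 0.9}
params = params_from_features(sample)
```

The real mappings live in shader parameters rather than a struct like this, but the shape of the pipeline is the same: one HTTP response per track, reduced to a handful of floats the visualizer consumes every frame.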

I created Spoke because I am excited to see how augmented reality changes our relationship with music. It is a first step toward bringing music more physically into our world.

Spoke started as a thesis project during my final year of college. I wanted to learn more about mobile graphics optimization and multiplatform AR development. I continue to develop the application and hope to finish this year.

The entity itself is adapted from Keijiro Takahashi's Swarm project, which was instrumental in my exploration of compute shaders.


  • HLSL
  • Mobile device limitations
  • Full stack development
  • REST APIs
  • Information visualization
  • HoloLens development
  • ARKit development
  • Unity-to-native plugin development

Screen recordings