I’m presuming this is a very quick pre-proof-of-concept: the synchronization offset between when beat one actually sounds and when the green bar finally gets displayed is very disconcerting.

However, at the risk of completely missing the point, this feels like an extremely small niche in the presented format, and the problem has really already been solved: a more useful and general-purpose manifestation would be the time-based-event model that is already pretty ubiquitous. It could easily be adapted to a concept like this: given the tempo, you transform measures into time-code (as any MIDI/recording software already does).
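To make that concrete, here’s a minimal sketch of the measure-to-time-code transformation, assuming a fixed tempo and time signature (a real score would need a tempo map with changes, which is exactly what MIDI sequencers keep):

```python
def beat_to_seconds(measure: int, beat: float,
                    bpm: float = 120.0,
                    beats_per_measure: int = 4) -> float:
    """Convert a 1-indexed measure/beat position to seconds from the start."""
    total_beats = (measure - 1) * beats_per_measure + (beat - 1)
    return total_beats * 60.0 / bpm

# Beat 1 of measure 3 at 120 BPM in 4/4 is 8 beats in -> 4.0 seconds
print(beat_to_seconds(3, 1))  # 4.0
```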

I’m thinking of SoundCloud annotations as an audio-file manifestation of this concept, which means you just replace the waveform display with a sheet-music display and voilà: this proof of concept.
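For illustration, a hypothetical sketch of what those time-anchored annotation events could look like once measures have been mapped to timestamps; the names here are made up for the sketch, not anything SoundCloud actually exposes:

```python
# Time-anchored annotations: each event carries a timestamp in the audio,
# and playback highlighting is just "find the latest event at or before now".
import bisect
from dataclasses import dataclass

@dataclass
class Annotation:
    time_sec: float  # anchor in the audio timeline (e.g. from measure -> time-code)
    payload: str     # e.g. "highlight measure 3", or a user comment

def current_annotation(events: list[Annotation], now: float) -> Annotation | None:
    """Return the latest annotation at or before `now`; `events` sorted by time."""
    i = bisect.bisect_right([e.time_sec for e in events], now)
    return events[i - 1] if i > 0 else None

# Measure starts at 120 BPM in 4/4: one measure every 2.0 seconds.
events = [Annotation(2.0 * (m - 1), f"measure {m}") for m in range(1, 5)]
print(current_annotation(events, 4.5).payload)  # -> "measure 3"
```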

Ok, so what am I missing?