If you are interested in procedurally generated music (music that interacts with what happens in-game), you should check out the indie game developer Nifflas.
He works in Unity nowadays and built his own synthesizer in it, so the music can react to what happens in-game. For example, his latest game, Knytt Underground, has a rhythm-based section where you can hear and see the music interact with the gameplay very clearly.
Here's an example from one of his earlier implementations:
https://youtu.be/HLA3AekhB3U?t=98 The character is forced to transform on every beat. In later puzzles, the puzzle elements themselves change state based on the beat or other musical events.
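The basic idea behind beat-synced events like that is simple: derive the current beat index from elapsed time and tempo, and fire the game event whenever that index changes. Here's a minimal sketch of the concept (the tempo, the function names, and the print stand-in are my own assumptions, not Nifflas' code):

```python
import time

BPM = 120                        # hypothetical tempo
SECONDS_PER_BEAT = 60.0 / BPM

def current_beat(start_time: float, now: float) -> int:
    """Index of the beat we're currently in, counted from start_time."""
    return int((now - start_time) // SECONDS_PER_BEAT)

def run_loop(duration: float = 2.0) -> None:
    start = time.monotonic()
    last_beat = -1
    while time.monotonic() - start < duration:
        beat = current_beat(start, time.monotonic())
        if beat != last_beat:
            last_beat = beat
            # In a real game, this is where you'd force the character
            # to transform or advance a puzzle's state.
            print(f"beat {beat}: transform character")
        time.sleep(0.01)         # stand-in for the game's frame tick

if __name__ == "__main__":
    run_loop()
```

Because every event is quantized to the same beat clock as the music, the player perceives the game and the soundtrack as one system.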
Here is an example of the synth he built for his upcoming project Ondskan:
https://www.youtube.com/watch?v=bU8HEQm0zX0. Everything you hear is generated by his synth, and the buttons you see him pressing could just as easily be triggered by in-game events. Entering a room could add a synth section or change a parameter; entering a cave, for example, could add a delay section to simulate reverb.
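To make that last point concrete, here is a small sketch (not Nifflas' actual synth, and all names and values are my own assumptions) of wiring a game event to a synth parameter: entering a cave switches on a simple feedback delay section, which reads like a crude reverb on the dry signal.

```python
from dataclasses import dataclass, field

@dataclass
class DelaySection:
    delay_samples: int = 4410        # ~100 ms at 44.1 kHz (hypothetical values)
    feedback: float = 0.5
    enabled: bool = False
    _buffer: list = field(default_factory=list)

    def process(self, sample: float) -> float:
        """Pass one sample through; when enabled, mix in a delayed copy."""
        if not self.enabled:
            return sample
        delayed = (self._buffer[-self.delay_samples]
                   if len(self._buffer) >= self.delay_samples else 0.0)
        out = sample + self.feedback * delayed
        self._buffer.append(out)
        return out

delay = DelaySection()

def on_enter_room(room: str) -> None:
    """Game event hook: entering a cave enables the delay section."""
    delay.enabled = (room == "cave")

# Usage: the game fires the event, and the audio render loop picks it up.
on_enter_room("cave")
dry = [1.0] + [0.0] * 10000          # an impulse as a stand-in for synth output
wet = [delay.process(s) for s in dry]
```

The point is just that the synth exposes its sections and parameters as ordinary state, so any game event can reach in and change them.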