Re: Dream Tracker
Posted: Sat Aug 19, 2023 6:07 pm
Great info!
Actually, this is only starting to scratch the surface of what I want to know about audio. I have been totally ignoring the "sound" aspect of the X16 and VERA up to this point because I didn't want to be involved in more "learning curves" than I can handle at a single moment. But as each learning curve is steamrollered flat, I can take on more.
Ok, so ZSM (which I've heard of, but thought only did sampled audio) sends data to the chips in real time, i.e. as a stream, just as it's needed. This seems to be the proper way to do this, as you can just open a file and stream it straight into a player that wakes up every frame and decides what to do that frame. But I get that you need the entire data set for development.
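Just to check my own understanding of the "wake up once per frame" idea, here's a rough C sketch of what I think a streaming player loop looks like. The command bytes, file name, and register-write function are all made up for illustration; this is not the real ZSM format, just the general shape of it:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical command bytes -- NOT the real ZSM opcodes, just to
   illustrate the "stream it one frame at a time" idea. */
#define CMD_WRITE_REG   0x01  /* followed by: register, value           */
#define CMD_END_FRAME   0x02  /* stop reading until the next vsync tick */
#define CMD_END_OF_SONG 0x03

/* Stand-in for poking an FM/PSG register on the sound hardware. */
static void write_chip_register(uint8_t reg, uint8_t value)
{
    printf("  write reg %02X = %02X\n", reg, value);
}

/* Called once per 60 Hz vsync tick: consume stream data for exactly
   one frame, then go back to sleep until the next tick. */
static int play_one_frame(FILE *song)
{
    for (;;) {
        int cmd = fgetc(song);
        if (cmd == EOF || cmd == CMD_END_OF_SONG)
            return 0;                      /* song finished           */
        if (cmd == CMD_END_FRAME)
            return 1;                      /* done until next frame   */
        if (cmd == CMD_WRITE_REG) {
            uint8_t reg = (uint8_t)fgetc(song);
            uint8_t val = (uint8_t)fgetc(song);
            write_chip_register(reg, val);
        }
    }
}

int main(void)
{
    FILE *song = fopen("SONG.BIN", "rb");   /* hypothetical file name  */
    if (!song)
        return 1;
    int frame = 0;
    while (play_one_frame(song))            /* in real life this runs  */
        printf("frame %d done\n", frame++); /* from the vsync handler  */
    fclose(song);
    return 0;
}
```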
Ok, so the diamonds mean the channel is active, not necessarily making actual sound at the moment. That makes sense.
MIDI and sync. I can see how this could be problematic. While I expect that a noob like me couldn't tell the difference, someone with an "ear" could probably tell if incoming MIDI data is artificially constrained to the 60 Hz rate of the vsync interrupt. To them it would probably sound like notes landing slightly off the beat, or something.
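And just to put numbers on what that 60 Hz constraint means: every incoming MIDI event has to wait for the next vsync tick, so it can land anywhere from 0 to roughly 16.7 ms late, and that delay varies from note to note. Here's my own back-of-the-envelope sketch (the event timestamps are invented, nothing from the actual player):

```c
#include <stdio.h>
#include <math.h>

/* If MIDI events can only be serviced on the 60 Hz vsync interrupt,
   each event slips to the start of the next frame.  The slip varies
   from 0 up to one full frame (~16.7 ms), which is the jitter a
   trained ear might notice. */
int main(void)
{
    const double frame_ms = 1000.0 / 60.0;              /* ~16.67 ms */
    const double events_ms[] = { 3.0, 21.5, 40.0, 58.9 };

    for (int i = 0; i < 4; i++) {
        double t = events_ms[i];
        double played = ceil(t / frame_ms) * frame_ms;  /* next tick */
        printf("event at %6.2f ms -> played at %6.2f ms (late by %5.2f ms)\n",
               t, played, played - t);
    }
    return 0;
}
```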
I'm definitely going to be following your efforts and your GitHub.