Tracker vs. piano roll (How to compose music on the X16?)
Posted: Sat Dec 19, 2020 4:12 am
1 hour ago, TomXP411 said:
The only difference between piano roll, tracker view, and an event list view is the way the data is presented.
On a piano roll, you are simply drawing a graph where X=Time and Y=Note. Other information, like velocity (volume), modulation, pitch bend, and other parameters are usually shown as a segment or line graph below the roll.
On a tracker view, you are listing the events on a grid where Y=Time and X=Channel or effect type.
No matter the display format, the underlying data structure is the same. Each event consists of a time index, an event type (note on, note off, patch change, controller change), and a data value (note number, controller value, patch number).
The rendering time is going to depend more on the size of the display than on the presentation method.
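(Aside: the shared event structure described in the quote could be sketched like this in Python. The names and field choices here are just illustrative, not any particular tracker's or sequencer's format.)

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative event types -- the same list could back a piano roll
# (plot time vs. note) or a tracker grid (group by channel, one row
# per time step). Only the *rendering* differs.
class EventType(Enum):
    NOTE_ON = 0
    NOTE_OFF = 1
    PATCH_CHANGE = 2
    CONTROLLER_CHANGE = 3

@dataclass
class Event:
    time: int        # time index (e.g. ticks or pattern rows)
    type: EventType  # what kind of event this is
    value: int       # note number, patch number, or controller value

events = [
    Event(0, EventType.NOTE_ON, 60),    # middle C starts at tick 0
    Event(16, EventType.NOTE_OFF, 60),  # and ends at tick 16
]
```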
It's a bit more complicated than that, though. With a piano roll, it's expected that you can write chords, and to me the fundamental question is whether you handle the music by instrument or by channel. Piano-roll software could do some trickery to cleverly reuse channels, I'd imagine, but then it's the software handling it. Trackers, by contrast, hand that over to the musician.
To elaborate on my previous example: say I have something like a SID chip, which only has 3 voices. I want to do more than just chords, so I would interleave other sounds into the gaps where the chords aren't playing. Channel 1 might play the bass note, but also a bass drum; channel 2 the middle note, with a melody interleaved in between; channel 3 might be doing some rhythmic stabs.
In a tracker, that's a common technique for squeezing more apparent channels out of the hardware than it actually has, and it also lets you control effects precisely while keeping channel usage in check. When I have more channels I tend to use fewer tricks, of course, but it's an interesting way to think, and sometimes I wish I did this more with modern music. For instance, if I'm using my Deepmind, I might want to map its voices to multiple patches and swap them around. I *CAN* do that already, but in a modern DAW it's a lot more work than just changing an instrument number in a tracker's pattern data, so I often don't think that way. Instead I'll record the synth to audio and add the other part in afterwards, which kind of breaks the creative flow.
Likewise, I have a MidiBox SID (actually a monster MB6582 I just finished) which has EIGHT SIDs in it. You'd think that would let me create some insane things, but interfacing with the chips via MIDI and a DAW, while it allows some incredibly complicated individual patches, is overall more limiting than controlling each SID voice directly. I can do that, but it's really complicated in a DAW, whereas it would be trivial in a tracker.
I'm not trying to speak ill of piano rolls mind you - actually I'm fond of them as well and I think they are much more approachable than a tracker interface, which can be quite intimidating at first. I just wanted to elaborate a bit on the power of trackers which shouldn't be overlooked.
Now, ALL that said (and it was a lot, sorry!) - with a rich sound implementation like the YM2151, there are enough channels that folks can write lovely music without resorting to full-on tracker trickery, to the point that a more musical interface could make sense. Likewise, the chip is built around the concept of "instruments", which lends itself to composing with that frame of mind. Tracker music often sounds different from, erhm, not-tracker music, for lack of a better term, and I think there's plenty of opportunity for both approaches to eventually exist on the X16. I'm eager to see what the final sound design is, though!