Questions about period and frequency

Started by wow25, April 30, 2021, 06:08:41


wow25

Hi! I've recently been working on making a simple mod player, and for the most part everything has been going well. However, translating the period of a note into its frequency has been giving me some clearly incorrect results. I've been using this page as my main reference: https://ftp.modland.com/pub/documents/format_documentation/Protracker%20effects%20(MODFIL12.TXT)%20(.mod).txt

It claims that the formula to find frequency is (7159090.0 / (2 * period)). But for most amiga period values, this produces a very high frequency. For example: C-1 is said to have a period of 856, which implies a frequency of 4181.7 Hz.

Since I'm not very good at reading others' code, I haven't been able to figure it out from other sources like libopenmpt. My current guess is that my player interprets frequency differently (it's not derived from an existing mod player). If it helps, my code assumes that a frequency of 2 Hz means any sample (regardless of size) should play twice a second, which admittedly seems a bit off.

Saga Musix

Quote: "C-1 is said to have a period of 856, which implies a frequency of 4181.7 Hz."
That sounds correct. I think your interpretation of what frequency means in this context is just a bit off. A frequency of 4181 Hz doesn't mean that the whole sample repeats 4181 times a second. The result of this formula is a sample rate, i.e. it's the frequency at which the sample position is incremented: 4181 sampling points will play per second. So if your sample is 4181 samples long, its duration will be exactly one second (if it were 8362 points long, it would take two seconds to play, and so on).
What OpenMPT does is first translate the period into a frequency (the same way you do), then convert this frequency into a ratio between this sample rate and the output mix rate, and use that ratio to determine how fast to advance through the sample. Let's choose simpler numbers for the sake of an example: suppose your output mix rate is 48000 Hz (a very common value; alternatively "CD quality" 44100 Hz), and the period-to-frequency conversion gave you a sample rate of 4800 Hz (somewhere between D-1 and D#1). The ratio between those two numbers is 4800/48000 = 0.1, i.e. for every sample in the mixer's output, you advance the offset at which you read from your sample data by 0.1.
Small example:

time  | offset
0     | 0.0
1     | 0.1
2     | 0.2
3     | 0.3
...
9     | 0.9
10    | 1.0
11    | 1.1
...
after one second...
47999 | 4799.9
48000 | 4800.0
48001 | 4800.1
...
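Saga Musix

To put the above in code: a minimal sketch of the period-to-frequency formula and the resulting per-output-sample increment (function and constant names are my own, not OpenMPT's):

```cpp
#include <cassert>
#include <cmath>

// Amiga clock constant as given in the MODFIL12 document linked above.
constexpr double kAmigaClock = 7159090.0;

// Convert an Amiga period to a sample rate in Hz.
double PeriodToFrequency(int period) {
    return kAmigaClock / (2.0 * period);
}

// How far to advance the sample read offset for each output sample.
double SampleIncrement(int period, double outputRate) {
    return PeriodToFrequency(period) / outputRate;
}
```

For C-1 (period 856) this gives roughly 4181.7 Hz, matching the number quoted earlier; dividing by a 48000 Hz output rate yields the per-sample step.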

Now the next thing you might be asking yourself is "how do I read sample data at offset 0.5?", and the answer is "it depends". An old-skool MOD player will simply truncate the result, i.e. round down. A more sophisticated player will interpolate the output if the offset is fractional, using at least the previous and next sample points (linear interpolation) or even more surrounding sample points (e.g. sinc interpolation). But you needn't worry about that for now; one step at a time. :)
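The two approaches could be sketched like this, assuming 8-bit sample data (a minimal illustration, not how any particular player does it):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Old-skool approach: truncate the fractional offset (round down).
int8_t ReadTruncated(const std::vector<int8_t>& sample, double offset) {
    return sample[static_cast<std::size_t>(offset)];
}

// Linear interpolation between the two surrounding sample points.
double ReadLinear(const std::vector<int8_t>& sample, double offset) {
    std::size_t i = static_cast<std::size_t>(offset);
    double frac = offset - static_cast<double>(i);
    double a = sample[i];
    // Guard against reading past the end at the very last point.
    double b = (i + 1 < sample.size()) ? sample[i + 1] : sample[i];
    return a + (b - a) * frac;
}
```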
» No support, bug reports, feature requests via private messages - they will not be answered. Use the forums and the issue tracker so that everyone can benefit from your post.

wow25

Ah, I was so close; it seems so obvious in retrospect. Thank you very much! Coincidentally, I recently learned about a couple of kinds of interpolation in university, so I've already implemented one (linear, at least)! On a related note, do you have a recommended way of storing sample data? I tried to use pointers and std::vectors, but had some weird issues. At the moment, I'm using the FastTracker 2 approach, which is just pointers to arrays, but I've heard that's unsafe.

Thank you as always!

Saga Musix

I'd just use vectors in a modern codebase. Currently OpenMPT uses raw pointers for historical reasons, but this will probably change in the future.
One thing that's slightly more complicated with vectors is handling 8-bit and 16-bit data: you cannot simply cast a vector<int8> to a vector<int16>, so you'd either have to cast the vector data each time you access it (which costs nothing performance-wise, but can lead to mistakes if not done carefully), or, for a more modern approach, use a std::variant<std::vector<int8>, std::vector<int16>> to distinguish between 8-bit and 16-bit data.
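For example, the variant approach might look like this (a sketch; normalizing 8-bit data up to a 16-bit range is my own assumed design choice here):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <variant>
#include <vector>

// Sample data is either 8-bit or 16-bit; std::variant keeps the two
// cases distinct without raw pointers or unsafe casting.
using SampleData = std::variant<std::vector<int8_t>, std::vector<int16_t>>;

// Read one sampling point, widened to a common 16-bit range.
int16_t ReadPoint(const SampleData& data, std::size_t pos) {
    if (auto* s8 = std::get_if<std::vector<int8_t>>(&data))
        return static_cast<int16_t>((*s8)[pos] * 256);  // widen 8-bit to 16-bit
    return std::get<std::vector<int16_t>>(data)[pos];
}
```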

wow25

Hi again! I have some more questions, unfortunately. My player is actually working very well, but implementing some effects has been a nightmare, in particular effects like Slide to Note (3xy) and Vibrato (4xy). It feels like I have to rework the whole player just to accommodate them. What do you think is a good way to handle these?

Saga Musix

Tone Portamento (3xx): your note change routine needs a new parameter that tells it whether a 3xx effect is present. In the routine, calculate the period of the new note (as you would without a 3xx effect), but store it in a new variable in your channel state in addition to the channel period. Then, on every row with a 3xx effect, you simply do a regular portamento up or down, depending on whether the current channel period is higher or lower than the target period you stored in that new variable.
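As a sketch, the channel state and slide routine described above might look like this (struct and function names are hypothetical):

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical per-channel state for Tone Portamento (3xx).
struct Channel {
    int period = 0;        // currently playing period
    int targetPeriod = 0;  // period of the most recent note, stored by 3xx
};

// Called once per tick on rows with a 3xx effect: slide the current
// period towards the target, clamping so it never overshoots.
void TonePortamento(Channel& ch, int speed) {
    if (ch.period < ch.targetPeriod)
        ch.period = std::min(ch.period + speed, ch.targetPeriod);
    else if (ch.period > ch.targetPeriod)
        ch.period = std::max(ch.period - speed, ch.targetPeriod);
}
```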

Vibrato: this is typically an offset on top of the channel period, so you can add it near the end of your channel processing code (but don't store the result back into the channel period, of course, as it's just a temporary offset).
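A minimal illustration of that temporary offset: the real ProTracker sine table has 32 entries, but this stand-in table and scaling are just assumed values to show the shape of the computation.

```cpp
#include <cassert>

// Tiny stand-in for the ProTracker sine table (first half wave only).
constexpr int kSine[8] = {0, 97, 180, 235, 255, 235, 180, 97};

// Vibrato (4xy): returns a temporary period offset for this tick.
// The channel period itself is never modified.
int VibratoOffset(int phase, int depth) {
    int value = kSine[phase & 7];          // table lookup
    int offset = (value * depth) / 128;    // scale by effect depth
    return (phase & 8) ? -offset : offset; // second half of the wave is negative
}
```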

wow25

Thank you! I managed to get Slide to Note to work, though it's still surprising how much goes into such a simple-looking effect. And Vibrato was way easier than whatever I was imagining.

This question is a bit far out, progress-wise, but I was wondering how I should go about mixing the final audio? At the moment I'm just piling the channels on top of each other. I vaguely know how compression works, but I don't know what other options I might have.

Saga Musix

Quote from: wow25 on June 05, 2021, 23:31:08
This question is a bit far out, progress-wise, but I was wondering how I should go about mixing the final audio? At the moment I'm just piling the channels on top of each other. I vaguely know how compression works, but I don't know what other options I might have.
Please just don't. The audio channels are simply supposed to be summed together, and that's it. Anything else will alter the sound compared to how it was meant to be played.
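So the mixer can be as simple as this sketch: sum into a wider accumulator and clamp to the output range (the clamping is my own assumed safety net, not part of the advice above):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Mix channels by plain summation, with no compression or other processing.
std::vector<int16_t> Mix(const std::vector<std::vector<int16_t>>& channels,
                         std::size_t numSamples) {
    std::vector<int16_t> out(numSamples);
    for (std::size_t i = 0; i < numSamples; ++i) {
        int32_t sum = 0;  // wider accumulator so the sum cannot overflow
        for (const auto& ch : channels)
            sum += ch[i];
        // Clamp to the 16-bit output range.
        if (sum > 32767) sum = 32767;
        if (sum < -32768) sum = -32768;
        out[i] = static_cast<int16_t>(sum);
    }
    return out;
}
```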

wow25

Haha works for me.
Thanks again for the help. Hopefully I won't have any other questions until it's done!

wow25

And it's done!
It's not quite perfect, some effects are a little off, but it sounded pretty good on the 10 or so mods I tested it on.

In case somebody else wants it for reference, it's https://github.com/Wow25/ModPlayer

This has been a really fun thing to work on. I have so much respect for all the audio programming that goes into OSes, games, players, etc. Thank you so much, Saga, for basically providing everything I needed; I definitely would have given up without your help.

Saga Musix

Nice work! I didn't look at the code in detail but if you want to test some demanding modules, I can recommend trying to play Black Queen and Ode To ProTracker.