440hz > 432hz

outbreak525

This may seem like a strange question.
I was wondering, in Cubase, EXACTLY how far down I need to "Finetune" from 440 Hz to get to 432 Hz.
I've conducted a simple experiment: I recorded a riff tuned to 432 Hz and then recorded the same riff at 440 Hz. I'm now playing both simultaneously while adjusting Cubase's "Finetune" setting to get the 440 Hz recording to line up 100% with the 432 Hz one.
I've found that the range that sounds closest to my ears is between -20 and -30.
But that's a spread of 11 possible settings, which is NOT the exact value I'm looking for.

Just wondering what the exact value would be to get 432 Hz from something originally recorded at 440 Hz.
 
There is no exact whole-number setting, because frequencies measured in hertz and pitch changes measured in cents don't correspond linearly (i.e. a one-cent change in pitch isn't always the same change in hertz). The number you're looking for is roughly, but not exactly, 32 cents.
 
-31.76665363342928 cents

This is the correct number to 14 decimal places. Converting a change in reference frequency to cents is an exact and relatively simple calculation.
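
For anyone who wants to verify that figure, it's just the standard cents formula: cents = 1200 × log2(f_new / f_old). Here's a minimal Python sketch (the function name hz_shift_in_cents is mine, not anything from Cubase):

import math

def hz_shift_in_cents(f_old, f_new):
    # Pitch difference between two frequencies, in cents.
    # 1200 cents = one octave = a doubling of frequency.
    return 1200 * math.log2(f_new / f_old)

print(hz_shift_in_cents(440.0, 432.0))  # about -31.7667 cents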

Edit: the confusion happens when people start thinking about shifting the entire spectrum by a fixed number of Hz, which indeed does not give you a constant shift in cents. That isn't what happens when you change the reference frequency. You're really just shifting one note (A above middle C) down by 8 Hz and then tuning all the others to it by pitch, not by frequency. A below middle C becomes 216 Hz, which is only 4 Hz below its original value of 220 Hz.
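
To put numbers on that edit: retuning the reference scales every note by the same ratio (432/440), so the shift in cents is identical for every note even though the shift in Hz is not. Another small Python sketch, with the two A's from above hard-coded just for illustration:

import math

RATIO = 432.0 / 440.0  # every note gets scaled by this same ratio

for name, f440 in [("A above middle C", 440.0), ("A below middle C", 220.0)]:
    f432 = f440 * RATIO
    cents = 1200 * math.log2(f432 / f440)
    print(f"{name}: {f440:.0f} Hz -> {f432:.0f} Hz, "
          f"{f432 - f440:+.0f} Hz but {cents:+.2f} cents")

# A above middle C drops by 8 Hz, A below middle C by only 4 Hz,
# yet both drop by the same ~31.77 cents.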
 
I discovered a while ago that I disagree with tunings that are "standard."
I use this program called TabIt to write my songs before I record them, and I have always pitch-shifted everything down 25%. Makes it sound better.

The Beatles did it.
 
Didn't they also tune higher? And standard for the most part?
 
Just play everything faster and drop your rate down to .95 :D

-Paul