[Question] Input calibration and input gain knob(s) #542
I'll start by answering a few of your questions line-by-line, but I'll try to explain why after that, so read to the end :)

Answers
No. All you need to do is tell the plugin how many volts your input clips at. The plugin does the rest (if the model has its metadata filled out).
Nope. The point of this is that you just say how your interface is set up and the plugin does the rest.
When you do that, how many volts does it take to clip the input? Figure that out, in dBu, and put it in the plugin. That's all you need to do :)
Just btw, you don't need to know that if the model maker put it in the model metadata. The point is that the plugin takes care of the model; you take care of the interface. (If your model doesn't have it in the metadata, then you need to do it the old-fashioned way.)

Why?

Simply put, your interface converts voltage to a digital signal between -1 and 1. The point is to tell the plugin what the "conversion factor" is. If you had your interface's input gain set to zero, then it'd be 13 dBu like you said. Since you turned it up to 4.5, it's something else, and you need to figure it out for yourself. At 4.5, it'll probably be somewhere close to -15 dBu, but it's hard to say exactly because it's hard to say exactly how you set the knob by hand.

So your options are to (A) measure it (break out the multimeter!), (B) set your gain to zero and read it off the manual (13 dBu), or (C) don't worry about calibration and dial in by ear what you want to hear.

Hope that helps. Please either close the Issue if this answers your question, or let me know if you've still got questions.
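P.S. To put numbers on the "conversion factor" idea, here's a minimal sketch of the standard dBu-to-volts math (illustrative only, not the plugin's internal code; it assumes the usual convention that 0 dBu = 0.7746 V RMS, i.e. 1 mW into 600 ohms):

```python
DBU_REF_VOLTS = 0.7746  # 0 dBu is defined as 0.7746 V RMS (1 mW into 600 ohms)

def dbu_to_volts_rms(dbu: float) -> float:
    """Convert a level in dBu to an RMS voltage."""
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

# With the interface's gain at zero, full scale (a digital sample of 1.0)
# corresponds to the Scarlett's +13 dBu maximum input level:
print(f"{dbu_to_volts_rms(13.0):.2f} V RMS")  # ~3.46 V
```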
Thanks for the in-depth response. I still have questions, though. So, in a sense, when I set the input calibration to 13 dBu, it adds 13 dBu to the input signal? I assumed that it subtracts it.
But setting the input gain to zero is not recommended, because then the plugin basically raises the quantization noise floor, as the Ghost Note Audio video explains. Again, my assumption was that input calibration exists exactly to combat that, allowing me to record at a just-below-clipping level, which is 13 dBu, as my audio interface suggests.
And what's the proper technique to measure it? All I can think of is to record a sample with the input gain set to 0, then record a second one at right below clipping, and normalize the tracks in something like Reaper to see what amount of dB increase that gives the signal. That's basically what I did before to calculate the input level compensation needed. Perhaps this is the value I need to put into the plugin?

Again, sorry if these questions are stupid, but I'm lost :)
Here's the important thing: "dBu" is specifically a measure of voltage. There's no such thing as "dBu" in a digital signal in a WAV file.
So I'll correct this question:
No. It just indicates that if the digital signal is at clipping, then the analog signal was 3.5 volts. Nothing happens until you have a model (and when there is one, the plugin does it for you, which is why I'm purposefully being vague about what happens under the hood--the point of having this feature is that you don't need to worry about it 🙂).
I didn't say it was recommended. I just said what the calibration would be.
This is essentially what I wrote in the documentation. What you said might be right. But hopefully this makes sense: as you increase the input gain, the "dBu" goes down, because a lower-voltage input signal will now clip the interface. If the interface clips at 13 dBu when the gain is at zero and you turn the input gain up by, say, 10 dB, then the interface now clips at 13 - 10 = 3 dBu.

Hope that helps.
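A quick illustration of that arithmetic (a throwaway helper for the sake of the example, not anything in the plugin itself):

```python
def effective_clip_dbu(max_input_dbu: float, input_gain_db: float) -> float:
    # Raising the input gain by G dB means a signal G dB lower in voltage
    # now clips the interface, so the clipping point drops by G dB.
    return max_input_dbu - input_gain_db

print(effective_clip_dbu(13.0, 10.0))  # 3.0 dBu, matching the example above
```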
It's worth saying how to do it without using the manual because I think it'll make it make more sense.
1. Generate a sine signal however you'd like. (One way is to make it in your DAW and send it out the interface's output.)
2. Plug it into your interface's input and adjust the signal's level (not your input gain--presumably that's set where you want it, so don't move it!) until it's at the clipping threshold.
3. Pull out the cable and measure its RMS voltage with a multimeter.
4. Convert from voltage to dBu (see the sketch below). That's the number you put in the plugin.
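For step 4, the conversion is the standard formula dBu = 20 * log10(V_RMS / 0.7746). A small sketch (the 3.46 V reading is just a made-up example value):

```python
import math

def volts_rms_to_dbu(v_rms: float) -> float:
    """Convert an RMS voltage reading to dBu (0 dBu = 0.7746 V RMS)."""
    return 20 * math.log10(v_rms / 0.7746)

# If the multimeter reads 3.46 V RMS at the clipping threshold:
print(f"{volts_rms_to_dbu(3.46):.1f} dBu")  # ~13.0 dBu
```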
This is very helpful, thanks!
I haven't seen the manual yet; I didn't know it was a thing. I don't think it was mentioned in the blog.
I'll do the measurements soon and close the ticket later if I don't have other issues/questions.
Thanks again! I remember when I learned about DPI 15 years ago, I was confused in the same way as here. Some things never change, I guess :)
--
Andrey Listopadov
Hello!
First of all, thank you for an amazing plugin. It's a great way to achieve quality sound in my projects, and I've been using it since April.
I have a question; sorry for posting it as an issue, but I don't know where general discussion of this plugin goes. (And I don't use social networks at all, so GitHub is kinda my only option.)
I've updated the plugin to the 0.7.12 release and downloaded some calibrated models from ToneHunt.
After reading the blog post about calibration, I don't think I have a clear understanding of how I need to set my input levels.
My audio interface is a Scarlett 2i4 2nd gen, which has a Maximum Input Level of +13 dBu for the instrument input. I set it in the plugin as the blog post suggests. Then I loaded a model, which has input and output calibration metadata, turned on the input calibration, and chose the normalized output option, as I frequently experiment with different models. So far so good.
Now the question is - do I need to adjust the input level in the plugin to compensate for input gain on my audio interface?
The aforementioned video by Ghost Note Audio suggests that I need to compensate for the input gain in the plugin, but it was recorded some time ago and explained with plugins that don't feature input calibration, as it wasn't a thing before. I may be misunderstanding this, but if I'm correct, I still need to do this even with input calibration enabled, right?
Speaking in numbers, here's how I set up the plugin:
This is more or less the same as how the diagram looks in the blog post:
The only things I'm unsure of are the first two points: the input gain on the interface and the input level in the plugin. The diagram shows the audio interface's gain brought up (and clipping), but I assume it's just a stock image of a card.
Hope I was clear enough with this and my question can be understood; English is not my first language.
Thanks again!