[Question] Input calibration and input gain knob(s) #542

andreyorst opened this issue Dec 8, 2024 · 4 comments

@andreyorst

Hello!

First of all, thank you for an amazing plugin. It's a great way to get quality sound in my projects, and I've been using it since April.

I have a question; sorry for posting it as an issue, but I don't know where general discussion of this plugin happens. (And I don't use social networks at all, so GitHub is pretty much my only option.)

I've updated the plugin to the 0.7.12 release and downloaded some calibrated models from ToneHunt.
After reading the blog post about calibration, I still don't have a clear understanding of how I need to set my input levels.

My audio interface is a Scarlett 2i4 2nd gen, which has a Maximum Input Level of +13 dBu for the instrument input. I set that in the plugin as the blog post suggests. Then I loaded a model that has input and output calibration metadata, turned on input calibration, and chose the normalized output option, since I frequently experiment with different models. So far so good.

Now the question is - do I need to adjust the input level in the plugin to compensate for input gain on my audio interface?

A video by Ghost Note Audio suggests that I need to compensate for the input gain in the plugin, but it was recorded some time ago and demonstrated with plugins that don't feature input calibration, as it wasn't a thing before. I may be misunderstanding this, but if I'm correct, I still need to do this even with input calibration enabled, right?

In numbers, here's how I set up the plugin:

  1. ~4.5 input gain is set on the audio interface
  2. -4.5 is set on the input knob in the NAM
  3. 13 dBu is set in the input calibration section
  4. The model is calibrated to +9.0 dBu
  5. Output is set to normalized.

This is more or less the same setup as the diagram in the blog post:

[image: calibration diagram from the blog post]

The only things I'm unsure about are the first two points: input gain on the interface, and input level in the plugin. The diagram shows the audio interface gain turned up (and clipping), but I assume that's just a stock image of an interface.

I hope I was clear enough and my question can be understood; English is not my first language.

Thanks again!

@sdatkinson (Owner)

I'll start by answering a few of your questions line by line, but I'll try to explain why after that, so read to the end :)

Answers

Now the question is - do I need to adjust the input level in the plugin to compensate for input gain on my audio interface?

No. All you need to do is tell the plugin how many volts your input clips at. The plugin does the rest (if the model has its metadata filled out).

A video by Ghost Note Audio suggests that I need to compensate for the input gain in the plugin, but it was recorded some time ago and demonstrated with plugins that don't feature input calibration, as it wasn't a thing before. I may be misunderstanding this, but if I'm correct, I still need to do this even with input calibration enabled, right?

Nope. The point of this is that you just say how your interface is set up and the plugin does the rest.

  1. ~4.5 input gain is set on the audio interface

When you do that, how many volts does it take to clip the input? Figure that out, in dBu, and put it in the plugin. That's all you need to do :)

  4. The model is calibrated to +9.0 dBu.

Just btw, you don't need to know that if the model maker put it in the model metadata. The point is that the plugin takes care of the model; you take care of the interface. (If your model doesn't have it in the metadata, then you need to do it the old-fashioned way.)

Why?

Simply put, your interface converts voltage into a digital signal between -1 and 1. The point is to tell the plugin what the "conversion factor" is.

If you had your interface's input gain set to zero, then it'd be 13 dBu like you said. Since you turned it up to 4.5, it's something else. You need to figure it out for yourself. At 4.5, it'll probably be somewhere around -15 dBu, but it's hard to say exactly, because it's hard to know exactly where you set the knob by hand. So your options are to (A) measure it (break out the multimeter!), (B) set your gain to zero and read it off the manual (13 dBu), or (C) not worry about calibration and just dial in what you want to hear by ear.
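
If it helps to see the arithmetic, here's a purely illustrative sketch of that "conversion factor" idea (digital_to_volts is a made-up helper, not anything from the plugin):

```python
def digital_to_volts(sample: float, clip_dbu: float) -> float:
    """Map a digital sample in [-1, 1] back to the analog voltage it came from.

    clip_dbu is the level (in dBu) at which the interface clips, i.e. the
    voltage that corresponds to digital full scale.
    """
    clip_volts = 0.7746 * 10 ** (clip_dbu / 20)  # 13 dBu -> ~3.46 V
    return sample * clip_volts

# A full-scale sample on an interface that clips at 13 dBu:
print(digital_to_volts(1.0, 13.0))  # ~3.46 volts
```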


Hope that helps. Please either close the Issue if this answers your question, or let me know if you've still got questions.

@andreyorst (Author)

Thanks for the in-depth response. I still have questions though.

So, in a sense, when I set the input calibration to 13 dBu, does it add 13 dBu to the input signal?

I assumed that it subtracts it.

If you had your interface's input gain set to zero, then it'd be 13 dBu like you said.

But setting the input gain to zero is not recommended, because then the plugin basically raises the quantization noise floor, as the Ghost Note Audio video explains. Again, my assumption was that input calibration exists exactly to combat that, allowing you to record at a just-below-clipping level, which is 13 dBu according to my audio interface's specs.

When you do that, how many volts does it take to clip the input? Figure that out, in dBu, and put it in the plugin. That's all you need to do :)

and

You need to figure it out for yourself. At 4.5, it'll probably be somewhere around -15 dBu, but it's hard to say exactly, because it's hard to know exactly where you set the knob by hand.

What's the proper technique to measure it?

All I can think of is to record a sample with the input gain set to 0, then record a second one right below clipping, and normalize the tracks in something like Reaper to see how many dB of increase the gain knob gives the signal. That's basically what I did before to calculate the needed input level compensation. Perhaps this is the value I need to put into the plugin?
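
Something like this rough sketch is what I mean (file names are hypothetical; it assumes two mono 16-bit PCM WAV recordings of the same source signal):

```python
import wave

import numpy as np

def wav_rms(path: str) -> float:
    """RMS level of a mono 16-bit PCM WAV file, in full-scale units."""
    with wave.open(path, "rb") as w:
        raw = w.readframes(w.getnframes())
    samples = np.frombuffer(raw, dtype=np.int16) / 32768.0
    return float(np.sqrt(np.mean(samples ** 2)))

# How many dB the interface's gain knob adds at my working setting:
gain_db = 20 * float(np.log10(wav_rms("take_gain_at_4_5.wav")
                              / wav_rms("take_gain_at_zero.wav")))
print(f"the knob adds about {gain_db:.1f} dB")
```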

Again, sorry if these questions are stupid, but I'm lost :)

@sdatkinson (Owner)

Here's the important thing: "dBu" is specifically a measure of voltage. There's no such thing as "dBu" in a digital signal in a WAV file.

  • Decibels (dB) measure the relative magnitude of two things.
  • Decibels full scale (dBFS) is relative to full scale, i.e. clipping for a typical fixed-point digital audio signal. If you're at clipping, that's 0 dBFS.
  • Decibels unloaded (dBu) is relative to 0.7746 volts RMS, meaning it's only meaningful for electrical signals. You can't "add 13 dBu in your DAW" because the signals in your DAW are digital. So, 13 dBu means (approximately) 3.5 volts; there's a quick conversion sketch below.
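
As a worked version of that last point (plain arithmetic, not plugin code):

```python
import math

DBU_REF_VOLTS = 0.7746  # 0 dBu is defined as 0.7746 volts RMS

def dbu_to_volts(dbu: float) -> float:
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

def volts_to_dbu(v_rms: float) -> float:
    return 20 * math.log10(v_rms / DBU_REF_VOLTS)

print(dbu_to_volts(13.0))  # ~3.46, i.e. 13 dBu is approximately 3.5 volts
```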

So I'll correct this question:

So, in a sense, when I set the input calibration to 13 dBu, does it add 13 dBu to the input signal?

No. It just indicates that if the digital signal is at clipping, then the analog signal was 3.5 volts. Nothing happens until you load a model (and when you do, the plugin handles it for you, which is why I'm purposely being vague about what happens under the hood; the point of having this feature is that you don't need to worry about it 🙂).

But setting the input gain to zero is not recommended

I didn't say it was recommended. I just said what the calibration would be.

[...] allowing you to record at a just-below-clipping level, which is 13 dBu

It's only 13 dBu (3.5 V) if the gain is at zero. If you turn up the gain, then a smaller voltage will be enough to clip.

What's the proper technique to measure it?

It's worth explaining how to do it without using the manual, because I think it'll make more sense that way.

  1. Generate a sine signal however you'd like. (One way is to make it in your DAW and send it out the interface's output).
  2. Plug it into your interface's input and adjust the signal's level (not your input gain; presumably that's set where you want it, so don't move it!) until it's at the clipping threshold.
  3. Pull out the cable and measure its RMS voltage with a multimeter.
  4. Convert from voltage to dBu. That's the number you put in the plugin (see the conversion sketch right after this list).
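
For step 4, the conversion is a one-liner. A minimal sketch, assuming a hypothetical multimeter reading of 2.6 volts RMS:

```python
import math

# Hypothetical multimeter reading at the clipping threshold, in volts RMS:
v_clip = 2.6

# dBu is defined relative to 0.7746 V RMS:
dbu = 20 * math.log10(v_clip / 0.7746)
print(f"enter about {dbu:.1f} dBu in the plugin")  # ~10.5 dBu for 2.6 V
```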

This is essentially what I wrote in the documentation.

What you said might be right. But hopefully this makes sense: As you increase the input gain, the "dBu" goes down because a lower-voltage input signal will now clip the interface. If the interface clips at 13 dBu when the gain is at zero, and you turned up the input gain by, say, 10 dB, then the interface now clips at 13 - 10 = 3 dBu.
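
In code form, the correction is just a subtraction (illustrative only):

```python
def effective_clip_dbu(max_input_dbu: float, knob_gain_db: float) -> float:
    # The interface clips at max_input_dbu with the gain knob at zero; every
    # dB of added gain lowers the voltage needed to clip by the same amount.
    return max_input_dbu - knob_gain_db

print(effective_clip_dbu(13.0, 10.0))  # 3.0, matching 13 - 10 = 3 dBu
```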

Hope that helps.

@andreyorst (Author)

andreyorst commented Dec 20, 2024 via email
