Visual Analyser - Calibration

Introduction

We continue our set of tutorials for the audio test software suite, Visual Analyser.

In this article, we look at calibrating the system.  If you haven't gone through the previous tutorials, I'd suggest doing that first. 


Why Calibrate?

Without calibration, VA can only measure levels in the one language it understands: proportion of full scale.  After calibration, VA can translate that into real-world quantities like volts.  Much more useful!  Indeed, given the computing power available in even the most modest computer, really useful!


Input Gain

Before calibrating, I did some tests.  The test bench computer soundcard is a plain vanilla SB5.1, so certainly nothing special.  I wanted to know at what input levels it runs into clipping, to make sure I remained well below that level.  I fed both channels from my test bench (hardware) oscillator, and monitored the level with a Rigol DS1053 Digital Storage Oscilloscope and on speakers.  Input clipping of the soundcard occurred at 5.92V peak-to-peak.  Interestingly, it was a pretty hard form of clipping, visible on VA, but also on the external DSO and audible in the speakers - so the soundcard input was actually dragging down the oscillator's output.  I imagine it's some form of input clamping to protect the soundcard.

I'd done that test with the soundcard's input gain set low - I wanted to detect hardware clipping, not running out of bit range.  So the next question was: when do I run out of range?  I increased the input gain to max, and found that the maximum input signal was now 2.9V peak-to-peak, or 1.02V RMS.  With VA's Zoom channels set to 1 (minimum zoom), the wave takes up most of the screen.  Any increase in level shows as clipping on the screen.  The two headroom indicators (bargraphs) in the Scope input channels showed levels to be 0.6dB down from full scale.  I'd definitely used up my 16 bits!
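
As a quick sanity check on that conversion, here's the standard sine-wave relationship between peak-to-peak and RMS, worked through in Python with the figures above:

  import math

  # For a sine wave, RMS = peak-to-peak / (2 * sqrt(2)).
  v_pp = 2.9                         # maximum input, volts peak-to-peak
  v_rms = v_pp / (2 * math.sqrt(2))  # sine-wave RMS from peak-to-peak
  print(round(v_rms, 3))             # -> 1.025, matching the ~1.02V RMS reading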

But reducing the soundcard input level doesn't really help that situation.  VA clips on Zoom 1 at the top and bottom of the screen, no matter what level that really represents at the input.

And setting the input gain lower will probably have the undesirable effect of bringing up the noise floor - no real issue for the Scope function, but probably significant for the FFT (spectrum analyser) and Noise & Distortion measurements.
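
To put a rough number on that (a simplified model, treating the soundcard as an ideal converter): every unused factor of two in input level costs about one bit of resolution, or roughly 6dB of signal-to-quantisation-noise ratio.

  import math

  # Approximate SNR lost by using only a fraction of the converter's range.
  def snr_loss_db(level_fraction):
      return -20 * math.log10(level_fraction)

  print(round(snr_loss_db(0.5), 1))   # half of full scale    -> ~6 dB lost
  print(round(snr_loss_db(0.25), 1))  # quarter of full scale -> ~12 dB lost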

So, what to conclude?  Clearly 1V RMS is the maximum I can input if I leave the input gain at Max (100%).  There's a lot to be said for doing that - you are in no doubt when you are at max, while any other setting is harder to reproduce reliably.  So I'm going to calibrate for 100% input gain and see what trouble I get into.


What levels can we expect anyway?

Standards, as someone once quipped, are so useful; that's why we have so many of them!  And that definitely applies to audio.  For example, professional audio levels are usually +4dBu or +6dBu, depending on where you are, and domestic ones -10dBV.  Since our soundcard is of the domestic variety, that implies levels peaking to around 0.316V RMS, leaving us 10dB headroom but still a theoretical 86dB above the noise (assuming 16 bit operation).  Should be safe.
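
If you want to check those numbers, here are the conversions, using the standard definitions 0dBV = 1V RMS and 0dBu = 0.7746V RMS:

  # Convert the quoted reference levels to volts.
  def dbv_to_volts(dbv):
      return 10 ** (dbv / 20)

  def dbu_to_volts(dbu):
      return 0.7746 * 10 ** (dbu / 20)

  print(round(dbv_to_volts(-10), 3))  # domestic -10 dBV    -> 0.316 V RMS
  print(round(dbu_to_volts(4), 3))    # professional +4 dBu -> 1.228 V RMS

  # 16 bits give roughly 16 * 6 = 96 dB of dynamic range, so a signal peaking
  # 10 dB below full scale still sits a theoretical ~86 dB above the noise.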


Preparing to calibrate

We're going to need a sine wave of convenient and known level, somewhere well within the level and frequency range of the soundcard.  I used a combination of my hardware oscillator and my Rigol DSO, as it has True RMS, Peak and Peak-to-peak measurements which my multimeter doesn't.  I do have a home-made Noise and Distortion analyser/millivoltmeter, but it has an analogue meter, so I can't rely on that.  I think everyone is going to be a bit different on this one - it will be a case of looking at your options and picking the one you have most confidence in.  Some calibration source options:

  • Use VA's signal generator, monitored by your best multimeter.  Make sure to pick a frequency within the multimeter's range!  Your local mains frequency seems like a good start.*

  • Ditto, but using a mains transformer feeding a resistive divider.

  • Build the calibrator from the Nuova Elettronica magazine article.  It appears intended to generate a reliable 1V AC.

  • The probe calibrator on your hardware oscilloscope?

  • You might be able to think up other ideas - let me know!

(*Interesting aside.  You'll often see cautions about the poor frequency response of your multimeter.  I think this might be old-fashioned advice - most multimeters I have checked have a pretty wide and flat response.  But you probably can't go wrong at power frequencies!)


Calibrating

  1. Have VA running in A and B mode

  2. Input the known signal mentioned above either to both channels, or to the left channel at first

  3. Tick the Values box on each channel so you can enjoy watching the calibration kick in.  At the moment, all the amplitudes will be shown as %fs.

  4. Open VA's Settings, and choose the Calibrate tab

  5. Select your input source from the drop-down list

  6. Use the vertical fader(s) to set the level at which the soundcard is to be used (I set 100%)

  7. Choose the Unit you want to calibrate (I chose Volts and RMS)

  8. Enter the level of the calibration signal being applied to the inputs (I chose 1.00V as read on the DSO.)

  9. Press Start Measure Signal (L).  A few seconds elapse and a value for Detected Levels (% full scale) appears.  (The arithmetic VA does with this figure is sketched just after this list.)

  10. Tick Apply calibration left channel

  11. Now check the Values columns - you'll find the amplitude values are now reading in volts!

  12. Repeat for the right channel

  13. SAVE!  If you don't save, you will lose this calibration when you close down the program.  Press the Save button near bottom left of the Calibrate window.  It will prompt you for a name.  I chose Volts RMS 100%, but I don't yet have a feel for what information is most important to capture in the name.

  14. Now I pressed Save Config (top left of window) as well.  My guess is that this tells VA that you want it left with that particular calibration chosen (assuming you have a range of saved calibrations).  Note that I could also have chosen to save as a specific config.  I guess time and experience will suggest why you might do that.
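
For anyone curious about what's going on behind steps 8 to 11, here's a minimal sketch of the proportion involved - my own reconstruction, assuming VA simply stores a volts-per-full-scale factor per channel; the detected level figure below is hypothetical:

  cal_volts = 1.00     # step 8: known level of the calibration signal
  detected_fs = 0.650  # step 9: hypothetical Detected Level, as a fraction of full scale

  volts_per_full_scale = cal_volts / detected_fs  # the stored calibration factor

  # Step 11: any later reading converts from fraction-of-full-scale to volts.
  def to_volts(fraction_of_fs):
      return fraction_of_fs * volts_per_full_scale

  print(round(to_volts(0.325), 3))  # a later half-level reading -> 0.5 V RMS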


Windows 7 - VA 2012 Bug?

The process above worked fine on my test-bench machine, using an SB5.1 card running under XP.  But so far I cannot get it to work on my office/lab machine, using its built-in Realtek high-definition soundcard and running under Windows 7.  The machine appears to calibrate and save, but once relaunched, the calibrations are clearly not reapplied properly.  However, VA version 2011 seems to calibrate fine on this machine, which might suggest a bug in the 2012 Beta version.  (Alfredo?)

I can get around this in the meantime by recalibrating any time I need to depend on absolute values.  Alternatively, calibrating at a precise 1V level appears to work.


Test your calibration

In theory, you should now be able to close down VA, relaunch it, restart the generator (if you were using it) and the analyser, and the calibrations should still be in place.  Go ahead...

Let me know if that doesn't work for you, especially if you can find a solution!


Enough for now

Because there appear to be some issues with the calibration function at least with some machines, I might leave it there until we get resolution of those.  There are a few remaining buttons and options, but they don't appear essential to the operation.  If you delve a little deeper and want to tell us what you found, do get in touch!

It will now be tempting to look into the Voltmeter option.  Next episode!



Created 29 May 2012