Customers commonly ask me these questions:
- Why do my measurements change as I adjust the volts/division knob (also called scale)?
- Why does noise increase at larger volts/division?
- Why might it be a mistake to compare 2 oscilloscopes from different manufacturers at the same volts per division setting? (This is a big issue, as you will see below.)
To understand these effects, it is necessary to understand how an oscilloscope works internally. For this post, we will focus on the vertical volts/division knob and the effect it has on your measurement. The architecture I will describe is not a precise description of any particular oscilloscope, but a generic model for thinking about the volts/division setting.
Almost all real-time oscilloscopes are based on 8-bit digitizers, and the digitizer itself has a fixed input swing. Ignoring offset, let’s suppose the A/D (digitizer) can see a signal that swings from DC to +1V. Since it is an 8-bit digitizer, it has 256 levels (2^8), with 255 steps between them. Spread across 1V, each step (one bit) is 1V / 255, or 3.922mV. If the signal were 1V in amplitude, it would be digitized with the full 8 bits of resolution.
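If it helps to see the arithmetic, here is a minimal Python sketch of that step-size calculation. The 1V full scale is just the hypothetical value from this example, not any particular scope's specification:

```python
# LSB (step size) of an ideal 8-bit digitizer with a 1V input swing
full_scale = 1.0          # volts; hypothetical full-scale swing from the example
bits = 8
steps = 2**bits - 1       # 255 steps between the 256 codes
lsb = full_scale / steps  # volts per step
print(f"LSB = {lsb * 1e3:.3f} mV")  # -> LSB = 3.922 mV
```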
So what if my signal is only 400mV, not 1V in amplitude? It means that I would only use 40% of the digitizer's range, and fewer of its bits. The steps in my digitized waveform would be bigger and I’d have less vertical resolution on my signal. It would be like viewing a profile photo from Twitter blown up to 8x10. Less resolution means less information, and my measurements will be less accurate.
In a real-time oscilloscope, a variable gain amplifier is placed before the A/D. By turning the volts/div knob, the user is adjusting the gain of this amplifier, making the signal fill the digitizer. Suppose I wanted my digitizer to use all of its bits across just 400mV instead of 1V. I would set the variable gain amplifier to amplify my signal by a factor of 2.5. So even though the user’s input signal swings 400mV, the scope digitizes a full 1V signal. Internally, the oscilloscope knows that each bit represents a signal that was amplified by 2.5.
So instead of a bit being 3.922mV, each bit is 3.922mV / 2.5, or 1.569mV. The A/D is the same, the ADC codes are the same, but the software in the scope interprets each bit as being a smaller voltage. This partially explains why noise changes at various volts/division settings. We’ll discuss noise in a later post, but suppose that the A/D in the oscilloscope has 3 bits of noise. Those 3 bits of noise would be interpreted as 11.765mV (3 x 3.922mV) if the oscilloscope was seeing 1V directly, but with 2.5x of gain, the noise is interpreted as 4.706mV (3 x 1.569mV).
What if my signal is only 200mV? The variable gain amplifier would multiply my signal by 5x, each bit would be 784uV, and the 3 bits of noise would look like 2.353mV. Nothing has changed in the noise, but the 3 bits of wiggle on the digitizer are interpreted differently.
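Here is a short sketch that reproduces the arithmetic of the last two paragraphs, referring the A/D's step size and its 3 bits of wiggle back to the scope input for each gain setting. The 1V swing and 3-code noise figure are the assumed values from this example:

```python
# Input-referred LSB and noise for a given front-end gain (example values)
ADC_FULL_SCALE = 1.0               # volts at the A/D input (assumed)
LSB_AT_ADC = ADC_FULL_SCALE / 255  # 3.922 mV per code at the digitizer

def input_referred(gain, noise_codes=3):
    """Divide by the amplifier gain to refer A/D quantities back to the probe tip."""
    lsb = LSB_AT_ADC / gain
    return lsb, noise_codes * lsb

for gain in (1.0, 2.5, 5.0):
    lsb, noise = input_referred(gain)
    print(f"gain {gain}x: LSB = {lsb * 1e3:.3f} mV, noise = {noise * 1e3:.3f} mV")
# gain 1.0x: LSB = 3.922 mV, noise = 11.765 mV
# gain 2.5x: LSB = 1.569 mV, noise = 4.706 mV
# gain 5.0x: LSB = 0.784 mV, noise = 2.353 mV
```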
What happens to larger signals, ones beyond 1V? Suppose my signal is 2V. The attenuator will “click” in and divide the signal by 10, and then the variable gain amplifier boosts the signal back up to fill the A/D input. So for a 2V signal, the attenuator drops the signal to 200mV, and then the amplifier gains it up by 5x. The internal variable gain amplifier is actually set identically to when you have a 200mV signal on the input. Different vendors will size and place this attenuator differently (and could even use multiple attenuators on a switch), but the concept is the same.
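Below is a sketch of that generic front end: an optional divide-by-10 attenuator followed by the variable gain amplifier. The 1V A/D swing and the single /10 attenuator are the simplified assumptions of this post, not any vendor's actual design:

```python
# Generic front-end model: optional /10 attenuator, then a variable gain
# amplifier that scales the signal to fill the A/D's 1V input swing.
ADC_FULL_SCALE = 1.0  # volts the digitizer accepts (assumed)
ATTEN_RATIO = 10.0    # divide-by-10 attenuator (assumed single attenuator)

def front_end(signal_full_scale):
    """Return (attenuation, vga_gain) so the signal fills the A/D."""
    atten = ATTEN_RATIO if signal_full_scale > ADC_FULL_SCALE else 1.0
    gain = ADC_FULL_SCALE * atten / signal_full_scale
    return atten, gain

for volts in (0.2, 0.4, 2.0):
    atten, gain = front_end(volts)
    print(f"{volts:g}V signal: /{atten:g} attenuator, {gain:g}x gain")
# 0.2V signal: /1 attenuator, 5x gain
# 0.4V signal: /1 attenuator, 2.5x gain
# 2V signal: /10 attenuator, 5x gain  (same VGA setting as the 200mV case)
```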
Back to the unattenuated case, how does this 2.5x or 5x of gain equate to the volts/division knob? We all know the user does not set gain, but rather volts per division. This is a complex subject which has been further confused by misleading advertisements and side-by-side comparisons in the oscilloscope industry.
Some oscilloscope manufacturers use 10 divisions to divide their screen. So suppose the amplifier was set to 2.5x to see a 400mV signal. If there were 10 divisions on the screen, the volts/division knob would read 400mV/10 or 40mV/div. Other vendors use 8 divisions on the screen. The amplifier would be set to 2.5x to see a 400mV signal, but with 8 divisions, the volts/division knob would read 400mV/8 or 50mV/div.
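The following couple of lines make the point numerically: the same amplifier state produces two different knob readings depending only on how the vendor divides the screen:

```python
# Same front-end state, different knob readings
full_scale = 0.400  # 400 mV across the screen in both cases

for divisions in (10, 8):
    print(f"{divisions} divisions: {full_scale / divisions * 1e3:.0f} mV/div")
# 10 divisions: 40 mV/div
# 8 divisions: 50 mV/div
```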
Volts Per Division is also known as "Scale"
Ignoring other differences, both oscilloscopes are now “set the same”. Rohde and Tektronix both use 10 divisions on their screen. Agilent and LeCroy both use 8 divisions on their screen. So that means if you want to compare noise, flatness, ENOB, or anything else between 2 oscilloscopes, they must be set to the same full-scale voltage, NOT the same volts/division.
The two oscilloscope images below show an example. Both are set identically, showing 400mV on the screen, but the Tektronix scope reads 40mV/div and the Agilent reads 50mV/div. Both show the exact same signal filling almost the entire screen.
To compare a Rohde oscilloscope to a LeCroy, I would set the Rohde 20% lower in volts/division than the equivalent LeCroy setting. To compare a Tektronix to an Agilent, I would likewise set the Tektronix 20% lower in volts/division. To compare Rohde and Tektronix, I could use the same V/div setting, and the same goes for comparing LeCroy and Agilent.
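A small helper to do that conversion, sketched under the division counts given above:

```python
# Convert a V/div setting between scopes with different division counts so
# that both show the same full-scale voltage (the only fair comparison).
def equivalent_vdiv(vdiv, divisions_from, divisions_to):
    full_scale = vdiv * divisions_from
    return full_scale / divisions_to

# A LeCroy (8 divisions) at 50 mV/div matches a Rohde (10 divisions) at:
print(equivalent_vdiv(0.050, 8, 10))  # 0.04 -> 40 mV/div, i.e. 20% lower
```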
So which is better, 8 divisions or 10 divisions? It's like asking what's better, 1 kilogram or 2.2 pounds? They are the same, just different units. Below I use Tektronix and Agilent, but the same comparison could be done with Rohde and Agilent or Tektronix and LeCroy.
Understanding this distinction is important because a major T&M manufacturer has actually built advertising campaigns around comparing the noise of their scopes to their competitors'. As I looked at the graphs supplied in their datasheet, I couldn't help but notice that all measurements were done at equal volts/division settings with no indication of the true full-scale voltage. As you now understand, this is not a valid way to compare performance.
Tektronix MSO4104B showing a 350mV pk-pk sine wave at 400mV full scale or 40mV/div. Note the 10 vertical divisions.
Agilent DSO6054A showing a 350mV pk-pk sine wave at 400mV full scale or 50mV/div. Note the 8 vertical divisions.
Also note that the volts/division knob goes in steps of 1, 2, 5: 10mV, 20mV, 50mV, 100mV, 200mV, 500mV, 1V. There is also a “fine gain” setting on scopes that allows you to reach intermediate steps. If my signal is 600mV pk-pk, then 50mV/div on a 10-division scope would only show 500mV, but 100mV/div would only use 60% of the digitizer. So I’d be best off using 70mV/division so that my 600mV fits well into 700mV.
Why did I not recommend 60mV/div? Because the top of the screen is often the end of the digitizer's range, and I want some headroom so that noise on my signal doesn't clip the digitizer.
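As a sketch of that reasoning, here is how I would pick the setting for the 600mV example. The 1-2-5 coarse steps come from this post; the 85% usage target is my own rule of thumb for headroom, not a vendor recommendation:

```python
# Pick a V/div for a 600 mV pk-pk signal on a 10-division scope
signal = 0.600      # volts pk-pk
divisions = 10
coarse = [0.010, 0.020, 0.050, 0.100, 0.200, 0.500, 1.000]  # 1-2-5 V/div steps

# Smallest coarse step that fits the whole signal on screen:
fit = next(v for v in coarse if v * divisions >= signal)
print(f"coarse: {fit * 1e3:.0f} mV/div uses "
      f"{signal / (fit * divisions):.0%} of the digitizer")
# coarse: 100 mV/div uses 60% of the digitizer

# With fine gain, target ~85% of the digitizer to leave headroom for noise:
fine = signal / (0.85 * divisions)
print(f"fine: ~{fine * 1e3:.0f} mV/div")  # ~71 mV/div, close to the 70 mV/div above
```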
One other point on volts per division. When you go below 10mV/div, some oscilloscopes will read out 5mV/div or even 1mV/div, but the datasheet clearly calls out that it is a software zoom. In other words, even though the screen says 1mV/div, the scope is still at 10mV/div, so the signal is not fully digitized. All that happens is that the signal is graphically expanded; the noise and ENOB stay the same as if the signal were at 10mV/div. If you have a small signal, you need to make sure you understand how your oscilloscope is acquiring it.
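To see why a software zoom cannot add resolution, here is a small simulation, with the 10mV/div hardware floor and the 1V A/D swing taken as the assumed values from this post:

```python
# Software zoom below the hardware minimum: the display may say 1 mV/div, but
# the quantization step is still set by the 10 mV/div hardware range.
import numpy as np

divisions = 10
hw_vdiv = 0.010                   # hardware stops at 10 mV/div (assumed floor)
lsb = hw_vdiv * divisions / 255   # ~392 uV per code, regardless of zoom

t = np.linspace(0, 1e-3, 1000)
signal = 0.002 * np.sin(2 * np.pi * 5e3 * t)  # 4 mV pk-pk sine
codes = np.round(signal / lsb)                # quantize at the hardware LSB
digitized = codes * lsb                       # the staircase the scope displays

# Zooming to "1 mV/div" only rescales the plot; the steps stay ~392 uV tall.
print(f"distinct codes used: {len(np.unique(codes))}")  # ~11 of the 256 codes
```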
Hope that brings some clarity to vertical settings. There is a lot more to be said, but that will wait for next Thursday!
Hi, this is a very informative article. However, wouldn't 3 bits of noise be 8 levels, rather than the 3 levels you used when calculating how the scope sees the noise?
Thanks. Great (simple and easy) explanation of the V/Div setting.
Hi, in the example of the 8-bit digitizer you have chosen a full-scale voltage of 1V. I suppose different scopes could have different full-scale voltages? If so, how do I find out the full-scale voltage for any given scope? Is there a particular specification that I could look for? Please advise.
Anonymous (Jan 11): good catch. I should have said "3 discrete quantized levels of noise" instead of "3 bits of noise". It is a good distinction.
Anonymous (March 14): You are correct that the full-scale voltage will vary based on the oscilloscope model, and these full-scale voltages are rarely advertised by the vendor. You can figure it out by adjusting the V/div knob from low V/div to high until it "clicks" internally. That is the sound of the relay switching from an unattenuated to an attenuated gain range. Then use fine scale to go back down and see where the switch point is located.
Imagine an oscilloscope where the knob goes from 10mV/div to 20mV/div to 50mV/div and then clicks at 100mV/div. Now use fine scale to drop back to 99mV/div and you will hear the click again. 99mV/div to 100mV/div is the crossover point, so full scale on the A/D is likely close to 1V (99mV x 10 divisions).