Digital Multimeter Accuracy Explained


The digital multimeter is the most widely used test instrument in the electronics industry.  It is the standard tool for electronics technicians, and it's usually the first test and diagnosis tool that a newcomer to electronics will buy.
Despite this, multimeter capabilities, and especially the concept of multimeter accuracy, are often misunderstood or ignored.  I have worked in the electronics trade for 14 years, and in my experience surprisingly few people actually understand (or care about) their multimeter specifications.  In particular, I have found that a large number of technicians, and even engineers, are blissfully ignorant of their instrument's capabilities and the implications for the measurements they make.

If you don’t know and understand your instrument specifications, how can you choose the right tool for the job? And, more importantly, how will you know when you’re using the wrong tool for the job?!

Digital Multimeter Specifications Explained

Modern digital multimeter accuracy specifications are actually quite easy to understand once you become familiar with the jargon.  It is important that you fully understand what is meant by counts and digits, and the effects they have on instrument resolution and accuracy.  Resolution and accuracy are distinct concepts, and in my experience many people confuse the two.
In this tutorial we’ll tackle counts and digits first, and this will allow us to very easily interpret the accuracy specifications afterwards.

Digits, Counts and Resolution


When we talk about resolution we're talking about the smallest possible change that the instrument can detect.  This means we're looking at the least significant digit.  The resolution at any given time is the amount that a single count of the least significant digit is worth.  So, for example, if the display is showing us '4.0005' volts, then one count of the least significant digit is worth 100µV (0.0001V).  This means that the instrument's resolution for that particular measurement is 100µV.  The resolution will change depending on which range you select, but for the most accurate results you should always use the lowest possible range, which gives maximum resolution.  I'll show you later why this is important for accuracy (which is a different concept).


My Fluke 28II multimeter is a twenty-thousand count, 4½ digit instrument.  This refers to my instrument’s resolution, but what does it mean? Well, the counts and digits are effectively two ways of saying the same thing, but both terminologies are in common use so it’s good to have a handle on both.  I’ll tell you my personal preference and offer justification for it later.  In this section let’s deal with the counts first.
To start with, it should be noted that the practical count figure is almost always one count less than the naming convention we use to refer to it.  For example, in my case (for a Fluke 28II), the practical resolution of my instrument is 19,999 counts.  That is what the instrument is actually capable of.  However, when we refer to the counts by name we call this “twenty-thousand count”, and this is purely because a round number is easier to say! What we mean in practice is one less than that.  The instrument specifications will usually quote you the practical counts as an actual figure, so with a well written specification there should be no ambiguity:

Fluke 28II Resolution Specifications
Displaying 1.9999V with 100uV Resolution

The implications in terms of multimeter resolution are that the Fluke 28II is capable of displaying a maximum of 19999 on its screen.  A point to note here is that the most significant digit can ONLY be a 0 or a 1.  It can of course move a decimal point to indicate different orders of magnitude.  So if we’re measuring <2V, the instrument can display up to 1.9999V.  What happens when we try to measure voltages higher than this? Well, the instrument has to abandon the most significant digit because it can’t display a ‘2’.  This has the following consequences:
In the case of a 1.9999V measurement the least significant digit being displayed is worth 100µV per count (0.0001V), and therefore the instrument has 100µV resolution up to 1.9999V.  Once we enter the 2V realm the instrument has to sacrifice some resolution because the most significant digit cannot display a ‘2’.  Therefore in order to display 2V it has to shift the displayed measurement to the right, and the current least significant digit gets bumped off the end of the display in the process (i.e. we lose it).
The displayed voltage would be 2.000V, and the least significant digit is now worth 1mV per count.  It'll then maintain this 1mV resolution all the way up to 19.999V, after which it'll be forced to drop a least significant digit again and the resolution will become 10mV per count.

Displaying 2.000V with 1mV Resolution
You can see, then, that once you know your instrument’s maximum number of counts you can use this information to determine what the maximum resolution will be for any measured voltage.  The resolution will decrease in discrete steps as the measured voltage increases. The point that the steps occur and their effect on the resolution are determined by the maximum number of counts.
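This relationship between maximum counts and resolution can be sketched in a few lines of Python (a simplified model of an ideal autoranging meter; the function name is my own):

```python
def resolution(reading, max_counts=19999):
    """Best displayable step size (in volts) for a given reading on a
    meter limited to max_counts counts, assuming ideal autoranging."""
    n = 0
    # Keep gaining a decimal place while the reading still fits the display.
    while round(reading * 10 ** (n + 1)) <= max_counts:
        n += 1
    return 10.0 ** -n

# A 19,999-count ("twenty-thousand-count") meter:
print(resolution(1.9999))  # 0.0001 V -> 100 uV per count
print(resolution(2.000))   # 0.001 V  -> 1 mV per count
print(resolution(19.999))  # 0.001 V  -> still 1 mV per count
print(resolution(20.00))   # 0.01 V   -> 10 mV per count
```

You can see the resolution drop by a decade exactly where the reading outgrows the maximum count.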


So how does all this relate in terms of digits? Very simple.  The multimeter is a 4½ digit instrument because it is capable of displaying four full digits (0-9) plus one half digit.  The most significant digit is called a half digit in this case because it is only capable of displaying 0 or 1.

4.5 Digit Multimeter Display

Some instruments are capable of displaying higher numbers in their most significant digit.  Commonly you will see a ¾ digit quoted, and this usually refers to a digit that can display up to and including a numeric value of 3.  So, for example, a 4¾ digit multimeter could display up to 39999 on its display.  This would be called a “forty-thousand-count” instrument, and it is an improvement over the 19999 count display because it can go further into its range before it has to compromise its resolution by dropping a least significant digit.

There is a caveat here though – although a ¾ digit typically refers to a digit capable of displaying values between 0 and 3, this is not a safe assumption and in fact it can mean any digit up to 6.  This means that there is some ambiguity surrounding the use of fractional digits to define resolution.

Counts And Digits Are Equivalent And Interchangeable

Counts and digits effectively mean the same thing.  A twenty-thousand-count instrument is capable of displaying practical values of up to 19999, which is four full digits plus one half digit: a 4½ digit display.
Due to the uncertainty of meaning surrounding fractional (in particular ¾) digits, it is my opinion that the use of counts to define resolution is preferable because it accurately defines the instrument’s capabilities and leaves no room for ambiguity.
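To make the count-to-digits mapping concrete, here is a minimal Python sketch (the function name and the ½/¾ naming rule are my own simplification of the common convention; as noted above, real-world ¾ digit usage is ambiguous):

```python
def digits_label(max_counts):
    """Name a meter's display in 'digits' terms from its maximum count.
    Convention used here: leading 1 -> 1/2 digit, leading 3 -> 3/4 digit.
    (In practice a 3/4 digit can mean a leading digit anywhere up to 6.)"""
    s = str(max_counts)
    if set(s) == {"9"}:                  # e.g. 9999: all full digits
        return f"{len(s)} digit"
    fraction = {"1": "1/2", "3": "3/4"}.get(s[0], "fractional")
    return f"{len(s) - 1} {fraction} digit"

print(digits_label(19999))  # 4 1/2 digit
print(digits_label(9999))   # 4 digit
print(digits_label(39999))  # 4 3/4 digit
```

Going the other way, from a digits label to a count, is exactly the ambiguity described above: "4¾ digit" alone doesn't tell you whether the maximum count is 39999 or something higher.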

The Display is not the limiting factor!

Before I leave my explanation of multimeter counts, digits and resolution, I want to clear up a common misconception.  Some might reasonably question why the instrument manufacturer would choose to hamper themselves with a most significant digit that can only display a 0 or a 1.  Would it not be easier to have a full digit there as well, thereby avoiding the complications and maintaining better resolution for more of the range?
Well, the answer is that the display is not the limiting factor here.  The display itself is almost certainly quite capable of indicating numerals from 0-9.  The limiting factor is the measurement circuitry inside the instrument.  All instruments have a finite resolution, and it is this limit that ties the manufacturer to a smaller most significant digit.

The Meterman 37XR, for example, has a ten-thousand-count display (actual counts 9999).  The ten thousand counts refer to the resolution capability of the instrument itself (the lower the number, the lower the resolution), and in this case the consequence for the display is that it can indicate up to 9999 plus a movable decimal point.  So here the most significant digit really can display 0-9, and there is no fractional digit to complicate matters.  But we only have 4 digits of displayable resolution across the range.  We never get access to an extra ½ or ¾ digit, so we never get to exploit the extra resolution that such a part-digit would provide.  A part-digit that offers an order of magnitude better resolution over part of the measurement range is better than no digit at all.

Multimeter Accuracy Specifications

Now that we fully understand the meaning behind counts, digits and resolution, we can quite easily interpret a digital multimeter’s accuracy specs.

What does ‘accuracy’ mean?

The accuracy of a measurement refers to how closely it reflects the true value of the property being measured.  Whenever you measure something in real life, the measurement you take is always an approximation of the actual property itself, and therefore there’ll be some uncertainty involved.  Today’s digital multimeters are very accurate instruments – the uncertainty in their measurements is extremely low – but there will always be some uncertainty in the measurement.
What will the error be? Well, it’s impossible to quantify the error exactly.  If you think about it, if we could determine the exact magnitude of the measurement error then we’d just correct for it in software and then we’d have no error at all!  That’s why we refer to it as “uncertainty” instead of “error”.
In practice all we can really do is provide a figure of uncertainty about the measurement which gives us a range for which the measurement can potentially be in error.   The multimeter specifications give us these limits, and they’re called the accuracy specifications.

So we have dispelled the jargon, and this makes our life easy.  Let’s now look at some practical accuracy specifications and determine what they mean.  Staying with the Fluke 28II, let’s have a look at its accuracy specifications for the VDC range:

Fluke 28II DC Specifications

As you can see, the Fluke 28II’s DC voltage range is quoted as being accurate to “±0.05% of the reading +1”.  The ‘+1’ refers to an additional uncertainty in terms of ‘numbers of counts’.  Some manufacturers refer to this uncertainty as ‘numbers of digits’, but they both mean exactly the same thing – it’s basically the number of counts in the least significant digit.  In this case we’re only talking about one count of uncertainty but some instruments suffer more than that.  I prefer the former terminology (counts) because it sounds less confusing! Notice that the +1 count is contained within the ± bracket so the actual uncertainty in terms of counts is plus or minus 1 count.  The easiest way to understand what this means in terms of measurement uncertainty is to take an example.

Example: Measurement uncertainty for a known 1.8000V source with the Fluke 28II.

Let’s imagine we decide to measure a voltage reference whose true voltage is known to be 1.8000V.  If we measure this with the Fluke 28II using the most appropriate range (more on this later!) we can expect that the instrument’s measurement uncertainty will be:

±(0.05% of 1.8000V) = ±(1.8000 × 0.05 / 100)V = ±0.0009V

This means we should expect a measurement of somewhere between 1.7991V and 1.8009V.  However, this isn’t all of the uncertainty we can expect to see on the display because we also have an additional uncertainty (which is due to ADC errors, offsets, noise etc) of ±1 count, and this gets added on to the least significant digit being displayed.  So, adding that to the measurement uncertainty we get 1.7990V to 1.8010V.  We should expect to see a measurement on the display that is somewhere between these two values.  Easy! Let’s have a look at what this means for an instrument with slightly lower resolution and accuracy specifications:
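The arithmetic above can be wrapped in a small helper, shown here as a Python sketch (the function name and signature are my own):

```python
def total_uncertainty(reading, percent, counts, resolution):
    """Total uncertainty in volts: +/-(percent of reading + counts * resolution)."""
    return reading * percent / 100 + counts * resolution

# Fluke 28II on the 2 V range: +/-(0.05% of reading + 1 count), 100 uV resolution.
u = total_uncertainty(1.8000, 0.05, 1, 0.0001)
print(round(u, 4))                            # 0.001 -> +/-1.0 mV
print(round(1.8 - u, 4), round(1.8 + u, 4))   # 1.799 1.801
```

Note how the single count of uncertainty lands on the 100µV digit, widening the ±0.9mV percentage term to ±1.0mV overall.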

Example: Measurement uncertainty for a known 1.8000V source with the Meterman 37XR

Let’s try the same task with the Meterman 37XR.  The specifications for the VDC range are:

ACCURACY: ±(0.1% Reading + 5 digits)

RESOLUTION: It’s a 4 digit (9999 count) instrument with no partial digits, so with the most appropriate range selected for this particular measurement the maximum resolution will be 0.001V = 1mV.

Using all this information, the uncertainty in our measurement will be:

±(0.1% of 1.800V) = ±(1.800 × 0.1 / 100)V = ±0.0018V ≈ ±0.002V at 1mV resolution


This means we should expect a measurement somewhere between 1.798V and 1.802V.  But then we have the additional uncertainty of 5 counts on top.  Not only is there a greater count uncertainty to add in this case, but the counts are more meaningful too, because the least significant digit is more significant than it was for the same measurement with the Fluke 28II – the 37XR has less resolution.  The 5 counts get added to the 1mV column, whereas the Fluke's ±1 count uncertainty only got added to the 100µV column!
This gives us an overall expectation of a displayed reading on the 37XR of somewhere between 1.793V and 1.807V.  You can see how an instrument with lower accuracy and lower resolution starts to make a difference.
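The same arithmetic for the 37XR, as a self-contained Python sketch (figures from the spec above; rounding the 0.1% term up to a whole display count is my own way of reproducing the 1.798V–1.802V bracket):

```python
import math

reading, res = 1.800, 0.001          # 4-digit meter: 1 mV steps on this range
pct_term = reading * 0.1 / 100       # 0.0018 V from the +/-0.1% basic accuracy
# The display moves in whole 1 mV counts, so round the percentage term up:
pct_counts = math.ceil(round(pct_term / res, 9))          # 2 counts
u = (pct_counts + 5) * res           # plus the +/-5 count term -> 0.007 V total
print(round(reading - u, 3), round(reading + u, 3))       # 1.793 1.807
```

Seven counts of total uncertainty on a 1mV digit is 7mV, against the Fluke's 1mV total on the very same source.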

Always use the most appropriate range!

There’s a consequence to all this that we haven’t talked about yet, and it relates mainly to the count (or digit) uncertainties quoted in the specifications.  You must always use the most appropriate (highest resolution) range for the property being measured.  If you don’t, the resulting measurement errors can end up being quite large, because the count uncertainties carry more weight.

Let’s say we do the same experiment with the 37XR, but this time we use the 1000V range to take the measurement.  The displayed measurement will then be somewhere around 1.8V, and we’ll be wasting the digits reserved for tens and hundreds because there are no tens or hundreds to measure!  The specified uncertainty is unchanged (it’s still ±0.1% of reading + 5 counts), but the 0.1% term is now too small to register on such a low resolution display.  The counts, however, do register, because they always affect the least significant digit being displayed – which in this case is the 100mV digit (the 8).  So the actual reading displayed could be anywhere between 1.3V and 2.3V!  That’s a total error of roughly ±28% which, as I’m sure you’ll agree, is completely unacceptable.  So watch out for that, and always make sure you use the best possible measurement range!
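Here is the wrong-range case worked through in the same style (a sketch using the spec figures quoted above):

```python
# Meterman 37XR on its 1000 V range: 9999 counts -> 999.9 max, 0.1 V per count.
reading, res = 1.8, 0.1
pct_term = reading * 0.1 / 100   # 0.0018 V: far below one display count, invisible
counts_term = 5 * res            # 0.5 V: the +/-5 counts now land on the 100 mV digit
print(round(reading - counts_term, 1), round(reading + counts_term, 1))  # 1.3 2.3
print(round(counts_term / reading * 100))  # ~28 (% error from the counts alone)
```

The percentage term hasn't changed at all; it's the fixed count uncertainty, multiplied by a resolution 100 times coarser, that destroys the measurement.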

That’s all folks!

So there you have it: digital multimeter specifications explained.  It’s really quite simple once you get down to it.  The topic is a little more work for analogue instruments – I’ll tackle that minefield in a separate tutorial.

Good luck, and happy measurements!
