Measurement is one of the most prominent ways to introduce some sense of order into our chaotic universe. But how accurately can we measure anything? What are our limitations, and how can we improve our accuracy? You've probably wondered about questions like these yourself. That is why today we'll try to answer the question, "Can You Measure Anything 100% Accurately?" We'll also share some tips on how you can increase the accuracy of your measurement process. So without further ado, let's jump right in.
What is Measurement?
In a nutshell, measurement is the process of associating numbers with physical quantities and phenomena. So checking your weight on the bathroom scale is a measurement, just like calculating the mass of the sun. It is a fundamental aspect of almost all technical fields, scientific disciplines, and everyday activities. The measurement process begins with a definition of the quantity to be measured, and it always involves a comparison with some known quantity of the same kind.
For example, if you want to buy five pounds of meat from your local butcher, you first have to define what a pound is, right? That defined reference quantity is called a unit.

Then you just make sure that the amount of meat you are buying is at least five times that unit. Both the defining and the comparing are covered by the term "measurement."
Accuracy vs. Precision
Both accuracy and precision are fundamental concepts in every branch of science. We often use them interchangeably, as if they meant the same thing. But that's not the case. For those from a different background, or anyone who needs a little reminder, let's have a quick refresher on what these two terms mean and how they differ. Shall we?
No matter what type of measurement we make, there is always an actual value we are trying to obtain, right? Accuracy refers to how close measurements are to this "true" or accepted value; it reflects the difference between a measurement and the quantity's actual value. It is possible to be accurate without being precise.

Precision shows how close repeated measurements of the same item are to each other. It is independent of accuracy, so it is also possible to be very precise but not very accurate.
Here is an example to explain the concept further. Let’s imagine a game of darts where the bulls-eye (center) of the dartboard is the true value. Now, the closer you land the darts to the bulls-eye, the more accurate you become.
Now, there are four possible scenarios:
- There is both accuracy and precision if you manage to land the darts close to the bulls-eye and close to each other.
- If all darts land very close together but far from the bulls-eye, you’ll get precision, but not accuracy.
- If the darts are spread evenly around the bulls-eye, your average position is accurate (the errors cancel out), but there is no precision. I mean, if you can't actually hit the bulls-eye, you lose the game, right?
- If you land the darts far from the bulls-eye and they are randomly placed, you’ll get neither accuracy nor precision.
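The four scenarios above can be put into numbers. Here is a minimal Python sketch using two illustrative (not formal metrology) definitions: accuracy as the distance from the cluster's average landing point to the bulls-eye, and precision as the average spread of throws around their own center. The function name and sample coordinates are our own.

```python
import statistics

def accuracy_and_precision(throws, target=(0.0, 0.0)):
    """Summarize a set of dart throws as (accuracy, precision).

    Accuracy here is the distance from the average landing point to the
    target (smaller = more accurate); precision is the average distance
    of throws from their own center (smaller = more precise).
    """
    mean_x = statistics.mean(x for x, _ in throws)
    mean_y = statistics.mean(y for _, y in throws)
    # Bias: how far the cluster's center sits from the bulls-eye.
    accuracy = ((mean_x - target[0]) ** 2 + (mean_y - target[1]) ** 2) ** 0.5
    # Spread: how tightly the throws cluster around their own center.
    distances = [((x - mean_x) ** 2 + (y - mean_y) ** 2) ** 0.5 for x, y in throws]
    precision = statistics.mean(distances)
    return accuracy, precision

# Precise but not accurate: a tight cluster, far from the bulls-eye at (0, 0).
tight_but_off = [(5.0, 5.1), (5.1, 5.0), (4.9, 5.0)]
# Accurate on average but not precise: spread evenly around the bulls-eye.
spread_around_center = [(3.0, 0.0), (-3.0, 0.0), (0.0, 3.0), (0.0, -3.0)]
```

Running `accuracy_and_precision` on each sample set shows the first cluster scoring well on precision but poorly on accuracy, and the second the other way around.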
So now you have a decent idea about what accuracy and precision mean. Though that hardly matters for you and me in our everyday life, many engineers, technicians, and researchers greatly depend on the accuracy and precision of the measurement process.
Can We Measure Anything 100% Accurately?
The short answer is "no." We can't measure anything with 100% accuracy. There will always be some kind of error due to the practicalities of measurement. We can only get a reasonable estimate and try to increase the accuracy level of the process. Now, you may ask why. Here is the long answer.
Let's say you want to measure the temperature of the room you're in right now. I can say that the temperature is between 15 and 60 degrees Celsius. Is this information accurate? Yes. But is it 100% accurate? Well, that's a bit complicated.
Accuracy simply indicates whether a statement is true or false. To measure the accuracy of a statement, you need to know not only the statement but also the true state of the world; otherwise, you'll have nothing to compare the statement to. There will always be some non-zero variability from factors outside our control, such as the instruments, environmental conditions, and lab personnel. And if you want to go deeper, Heisenberg's uncertainty principle rules out 100% accuracy even at the quantum level.
So what to do now? Now that we know we can’t achieve the ultimate accuracy, should we stop prioritizing it? No, absolutely not. Though we can’t get 100% accuracy while measuring something, we can get pretty close by taking a number of steps and minimizing random errors. Let’s have a look at how we can improve the accuracy rates in our measurement process.
How To Improve Accuracy
Calibrate Your Tools
Calibration, in a nutshell, is the comparison between a known measurement (the standard) and the measurement taken with your tools. This way, you'll be able to find out the accuracy of your instrument. You'll also be able to determine the traceability of your measurement. It is one of the most crucial steps you can take to improve your accuracy, as the accuracy of all measuring tools degrades over time.
Even if all of your tools are calibrated, they may still show faulty readings due to wear and tear. So if you want to maximize the accuracy and precision of your measurement process, we suggest conducting routine maintenance on all your devices. Go through the user manual of each tool and learn the best practices for keeping it functioning accurately for a long time.
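As a rough sketch of the calibration idea, a single-point offset calibration measures a known standard, compares the readings against the accepted value, and applies the resulting correction to future readings. The function names and numbers below are illustrative; real calibration procedures typically use multiple standards and may correct slope as well as offset.

```python
def calibration_offset(standard_readings, standard_value):
    """Estimate the instrument's offset by measuring a known standard.

    A positive result means the instrument reads low, so future
    readings should be corrected upward.
    """
    average_reading = sum(standard_readings) / len(standard_readings)
    return standard_value - average_reading

def apply_correction(reading, offset):
    """Correct a raw reading using the previously estimated offset."""
    return reading + offset

# A scale reads a certified 100.0 g standard as roughly 99.6 g:
offset = calibration_offset([99.5, 99.7, 99.6], 100.0)
corrected = apply_correction(250.0, offset)  # raw 250.0 g corrected upward
```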
Take Multiple Readings

The more reading samples you collect, the more precise the representation of your measurement will be. So it is wise to take multiple readings and compare them afterward. This reduces the chance of accidental errors and improves the overall quality of the measurement process. If you can't repeat the whole measurement, we suggest increasing the number of replicates within each run.
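The "more samples" advice can be made concrete: averaging repeated readings gives a best estimate, and the standard error of that mean shrinks with the square root of the number of readings. A minimal sketch (the function name and sample values are ours):

```python
import statistics

def summarize_readings(readings):
    """Return (best_estimate, standard_error) for repeated readings.

    The standard error of the mean falls as 1/sqrt(n), which is why
    taking more readings tightens the estimate.
    """
    best_estimate = statistics.mean(readings)
    standard_error = statistics.stdev(readings) / len(readings) ** 0.5
    return best_estimate, standard_error

# Four readings of the same nominally 10-gram sample:
estimate, error = summarize_readings([9.9, 10.1, 10.0, 10.0])
```

Doubling the number of readings with the same spread would shrink the reported error by a factor of about 1.4, not 2, which is why piling on samples eventually yields diminishing returns.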
Detecting Shifts Over Time
Some systems become prone to drift over time, so it is best practice to take multiple measurements spread over a long period. Though you'll notice nothing in most cases, this is an important step for maintaining your accuracy level. If you ever notice drift in a single direction over weeks or months, you need to either recalibrate your tools or take preventive measures.
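One simple way to check for drift is to repeatedly measure the same stable reference over time and fit a least-squares slope to the readings; a slope that stays consistently away from zero suggests the instrument is drifting. A sketch under that assumption (function name and sample data are illustrative):

```python
def drift_rate(times, readings):
    """Least-squares slope of readings versus time (units per time unit).

    A persistent non-zero slope on a stable reference suggests the
    instrument is drifting and may need recalibration.
    """
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(readings) / n
    numerator = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, readings))
    denominator = sum((t - mean_t) ** 2 for t in times)
    return numerator / denominator

# Weekly checks of a reference weight, times in days and readings in grams:
slope = drift_rate([0, 7, 14, 21], [10.00, 10.10, 10.20, 10.30])
# A steadily positive slope like this one points to upward drift.
```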
Considering the “Human Factor”
The accuracy of any measurement process varies with its user. The best way to reduce human errors is to have only one person responsible for a given measurement. But this may not always be possible. So instead, try to ensure that procedures are kept up to date and are as descriptive as possible.
Over To You
That's all for today. Though we can't measure anything with perfect accuracy, the relentless drive for ever-greater accuracy is one of the driving forces of mankind. We'll discuss more ways to increase the accuracy of your measurement process in our upcoming articles. Till then, take care. Don't forget to share your opinions and experiences in the comment section below. Thanks for reading this far, and happy measuring.