Carrying on a conversation with a native of another land who doesn’t speak English is not a likely scenario for me. Since I don’t speak a lick of the local language, I can try asking him or her where to find a taxi, how to get to the nearest hotel, or for a recommendation on a decent meal, but I won’t get very far.
The need to understand vocabulary and terminology also applies to technology. It can be difficult, or next to impossible, to interpret the information a device provides if its specifications aren’t understood. For example, who cares if a measured value is 35 °C if the meaning of that measurement is unknown? Is it the ambient temperature at the application location? Or is it the peak temperature of a product, or the minimum, or even the average?
Day to day, I’ve seen this confusion extend to the realm of laser distance measurement sensors. Laser distance measurement sensors (also called laser displacement sensors), like Pepperl+Fuchs’ VDM28 and VDM100 Series, measure the distance to an object. (You can see one potential use in my previous blog post about laser sensors.) Someone may ask, “How well can this sensor measure distance?” without knowing the metrics used to answer that question, or “What is its resolution?” when they really want another specification. To clear this up, a primer on the terminology of laser distance measurement sensors is in order.
The first specification commonly attributed to these devices is resolution. Resolution is the smallest amount of movement needed to see a change in the sensor’s output. So if you move an object in front of a laser distance measurement sensor by 0.01 mm and the analog output remains constant at 4.51 mA, then its resolution is something larger than 0.01 mm. Resolution does not tell you how well a sensor measures; it tells you at what incremental distance its output will change.
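As a rough illustration, you can model resolution as a step size: the output only updates once the target has moved by at least one resolution increment. The numbers below (0.05 mm resolution, a 200–2000 mm range mapped to 4–20 mA) are hypothetical, chosen only for the sketch; real values come from the sensor’s datasheet.

```python
def analog_output_ma(distance_mm, resolution_mm=0.05,
                     range_mm=(200.0, 2000.0), out_ma=(4.0, 20.0)):
    """Model a 4-20 mA output that only changes in steps of one resolution.

    All parameter values are hypothetical, for illustration only.
    """
    # Quantize the distance to the sensor's resolution.
    quantized = round(distance_mm / resolution_mm) * resolution_mm
    # Map the quantized distance linearly onto the current-output range.
    fraction = (quantized - range_mm[0]) / (range_mm[1] - range_mm[0])
    return out_ma[0] + fraction * (out_ma[1] - out_ma[0])

# Moving the target by less than the resolution leaves the output unchanged:
print(analog_output_ma(500.00))
print(analog_output_ma(500.01))  # 0.01 mm move: same output as before
print(analog_output_ma(500.05))  # 0.05 mm move: output finally steps
```

In this model, a 0.01 mm move is invisible in the output, which is exactly the behavior described above.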
The next term that is frequently seen is repeat accuracy (sometimes called repeatability). Repeat accuracy indicates the sensor’s ability to give you the same output at the same distance. Say that a sensor’s output indicates that an object is 500 mm away. Using the same sensor and object, if you bring the object back to that position again and again, the sensor will measure to within X mm of 500 mm, where X is the repeat accuracy. If X = 5 mm, the second time you measure the distance to the object, you may see 498 mm; the third time, 501 mm. But all subsequent measurements will always be within 5 mm of the first measurement.
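Interpreted this way, repeat accuracy can be checked from a series of readings: return the object to the same position repeatedly and find the largest deviation from the first reading. A minimal sketch, using the illustrative readings from the 500 mm example (the helper name and the data are hypothetical):

```python
def repeat_spread_mm(readings_mm):
    """Largest deviation of repeated readings from the first reading.

    Hypothetical helper: readings_mm holds sensor outputs from repeated
    approaches to the same physical position.
    """
    reference = readings_mm[0]
    return max(abs(r - reference) for r in readings_mm[1:])

# Illustrative readings of the same 500 mm target position:
readings = [500.0, 498.0, 501.0, 499.5, 502.3]
print(repeat_spread_mm(readings))  # worst-case deviation from the first reading
```

If the spread computed this way stays at or below the specified repeat accuracy (5 mm in the example), the sensor is performing within spec for that setup.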
Absolute accuracy is the maximum difference between the actual physical distance and the sensor’s measured distance. This is more of a “worst case scenario” for laser distance measurement devices, because it takes into account various application and product factors. It even accounts for tolerances from sensor to sensor, as seen when replacing one sensor with another of the same model. If the absolute accuracy is 5 mm and the sensor measures a distance of 2300 mm to an object, you can take a tape measure and physically measure the same distance, and the two values will agree to within 5 mm.
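That tape-measure comparison can be written as a one-line check. The function and the 5 mm figure below are just a sketch of the spec described in the text, not part of any real sensor API:

```python
def within_absolute_accuracy(measured_mm, reference_mm, abs_accuracy_mm=5.0):
    """True if the sensor reading agrees with an independent reference
    measurement (e.g. a tape measure) to within the absolute accuracy.

    abs_accuracy_mm = 5.0 is the hypothetical spec from the example.
    """
    return abs(measured_mm - reference_mm) <= abs_accuracy_mm

# Sensor reads 2300 mm; the tape measure says 2303 mm.
print(within_absolute_accuracy(2300.0, 2303.0))  # True: within 5 mm
```

Note the distinction from repeat accuracy: this check compares the sensor against an outside reference, not against its own earlier readings.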
In most applications, repeat accuracy is the most helpful specification. For a given sensor looking at the same object (like a crane moving back and forth on a track or the position of a press or door edge), repeat accuracy indicates the repeatability with which the position of that object will be known.
So in short, resolution indicates how much something has to move to change the output, repeat accuracy is how consistent the output is for the same sensor and setup, and absolute accuracy is the maximum possible difference between the physical distance and the measured distance.