Measurement accuracy is often a matter of properly calibrating an instrument against known standards. The instrument may be as simple as a ruler or as complicated as a million-dollar analyzer, but the principles are the same. Calibration generally involves the following steps:
1. Acquire one or more known standards from a reliable source.
Known standards are generally prepared and certified by an organization or a company that you have reason to believe has much more accurate instruments than you do, such as the National Institute of Standards and Technology (NIST) or a well-respected company like Hewlett-Packard or Fisher Scientific.
If you're calibrating a blood glucose analyzer, for example, you need to acquire a set of glucose solutions whose concentrations are known with great accuracy, and can be taken as "true" concentration values (perhaps five vials, with glucose values of 50, 100, 200, 400, and 800 mg/dL).
2. Run your measuring process or assay, using your instrument, on those standards; record the instrument's results, along with the "true" values.
Continuing with the glucose example, you might split each vial into four aliquots (portions), and run these 20 specimens through the analyzer.
3. Plot your instrument's readings against the true values, and fit the best line possible to those data.
You'd plot the results of the analysis of the standards as 20 points on a scattergram, with the true value from the standards provider (GlucTrue) on the X axis, and the instrument's results (GlucInstr) on the Y axis. The best line may not be a straight line, so you may have to do some nonlinear curve-fitting.
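This curve-fitting step can be sketched in Python with NumPy's `polyfit`. The calibration data below are simulated, since no actual instrument readings are given: hypothetical readings are generated from the true standard values with a small built-in bias plus random noise, and then a straight line is fitted to recover that relationship.

```python
import numpy as np

# Hypothetical standards: five glucose vials, each split into four aliquots,
# giving 20 specimens with known "true" values (GlucTrue)
gluc_true = np.repeat([50.0, 100.0, 200.0, 400.0, 800.0], 4)

# Simulated instrument readings (GlucInstr): a small systematic bias
# (assumed intercept 1.54, slope 0.9573) plus random measurement noise
rng = np.random.default_rng(0)
gluc_instr = 1.54 + 0.9573 * gluc_true + rng.normal(0.0, 2.0, size=20)

# Fit the straight line GlucInstr = intercept + slope * GlucTrue
slope, intercept = np.polyfit(gluc_true, gluc_instr, deg=1)
print(f"GlucInstr = {intercept:.2f} + {slope:.4f} * GlucTrue")
```

If the points curved instead of following a straight line, you'd swap `polyfit` for a nonlinear fitting routine such as `scipy.optimize.curve_fit`.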
4. Use that fitted line to convert your instrument's readings into the values you report. (You have to do some algebra to rearrange the formula to calculate the X value from the Y value.)
Suppose the fitted equation from Step 3 was GlucInstr = 1.54 + 0.9573 × GlucTrue. With a little algebra, this equation can be rearranged to GlucTrue = (GlucInstr – 1.54)/0.9573. If you were to run a patient's specimen through that instrument and get a value of 200.0, you'd use the calibration equation to get the corrected value: (200 – 1.54)/0.9573, which works out to 207.3, the value you'd report for this specimen.
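That inversion is a one-liner once the fit is in hand. The sketch below hard-codes the intercept (1.54) and slope (0.9573) from the worked example; in practice you'd plug in whatever coefficients your own calibration run produced.

```python
def corrected_value(gluc_instr, intercept=1.54, slope=0.9573):
    """Invert the calibration line GlucInstr = intercept + slope * GlucTrue
    to recover the corrected (true-scale) value from an instrument reading."""
    return (gluc_instr - intercept) / slope

# A raw instrument reading of 200.0 corrects to 207.3
print(round(corrected_value(200.0), 1))  # prints 207.3
```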
If done properly, this process removes almost all systematic error from your measurements, making them very accurate.
You can find out more about calibration curves from GraphPad.