Thanks for posting that link. It's not really a matter of being right or wrong... it's about understanding how the refractometer works. By calibrating with only one specific gravity of liquid, you are only verifying the refractometer at that one point on the scale. Yes, a lot of them will in fact read perfectly from 1.000 to 1.035 in a straight line, nailing every point dead on. However, it's important to understand that when a refractometer goes out of calibration, it's because that line has shifted up or down, or has even developed a curve. When that happens, the refractometer can still be accurate at 1.000 but not necessarily at other points along the scale.

To calibrate a refractometer with 100% certainty, you would need to verify several points along the scale, such as 1.000, 1.010, 1.020, 1.030, and 1.035. Because that isn't very practical, the best method IMO (and according to the article posted) is to calibrate and verify the refractometer at the point on the scale where it will actually be used. In our case that is usually 1.026. All we really care about is that the refractometer reads correctly at that point on the scale.
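To see why a single-point calibration can pass while other readings are off, here's a minimal sketch using a hypothetical linear error model (the slope and offset values are made up for illustration, not measured from any real instrument):

```python
# Hypothetical error model: the instrument's response line has a slope error.
# reading = slope * (true_sg - 1.000) + 1.000 + offset
def miscalibrated_reading(true_sg, slope=1.02, offset=0.0):
    return slope * (true_sg - 1.000) + 1.000 + offset

# Calibrated (verified) at 1.000: it reads dead on there...
assert abs(miscalibrated_reading(1.000) - 1.000) < 1e-9

# ...but at 1.026 the slope error shows up as a real offset:
error = miscalibrated_reading(1.026) - 1.026
print(f"error at 1.026: {error:+.5f}")  # roughly +0.0005
```

The point of the sketch: zeroing the instrument with RO/DI water (1.000) only pins down the offset, not the slope or any curvature, which is exactly why verifying at (or near) 1.026 matters for our use.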