Newton
Definition and History
Isaac Newton described a practical thermometer scale in 1701, in an anonymous paper ("Scala graduum caloris") in the Philosophical Transactions. The modern reconstruction places 0°N at the melting point of ice and 33°N at the boiling point of water. Degrees are evenly spaced, so each Newton degree corresponds to 100/33 Celsius degrees (≈ 3.03°C).
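As a minimal sketch of that arithmetic, assuming the 0 °N and 33 °N fixed points given above (the function names are illustrative, not taken from any library), the conversion reduces to a single scale factor:

    def newton_to_celsius(t_newton: float) -> float:
        """Degrees Newton to degrees Celsius.

        Under the 0 °N = 0 °C, 33 °N = 100 °C reconstruction the two
        scales share an origin, so conversion is a pure scale factor.
        """
        return t_newton * 100.0 / 33.0

    def celsius_to_newton(t_celsius: float) -> float:
        """Degrees Celsius to degrees Newton (inverse scaling)."""
        return t_celsius * 33.0 / 100.0

    # Check both fixed points of the reconstruction.
    print(newton_to_celsius(0.0))    # 0.0 (melting ice)
    print(newton_to_celsius(33.0))   # 100.0 (boiling water)
    print(celsius_to_newton(37.0))   # ≈ 12.21 (body heat)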
Newton’s note predates the widespread adoption of fixed‑point definitions and international standards. It set out readily observable reference points (snow, ambient air, body heat, etc.) as anchors for repeatable thermometry, influencing later linear scales.
Usage and Applications
The Newton scale appears in early British and continental discussions of heat and specific heat. Natural philosophers and instrument‑makers cited it alongside Fahrenheit and Réaumur during the transition to modern thermometry. By the 19th century it was largely of documentary value, though it remains useful when editing or reconciling early laboratory notebooks.
Because a Newton degree is coarser than a Celsius degree, some historians note its convenience for describing broad thermal phenomena and for pedagogical examples, while precise laboratory work moved to Celsius and Kelvin.
Scientific and Engineering Applications
The Newton scale is linearly related to Celsius with a constant degree size; it is therefore affine‑equivalent to the SI framework, differing only by a scale factor and an origin shift. Because each Newton degree spans roughly 3 °C, historical values are routinely re‑expressed in Celsius or kelvin when precise reporting is required.
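Written out under the reconstruction above, the affine relations are

    T_C = (100/33) · T_N
    T_K = (100/33) · T_N + 273.15

The Celsius form has no offset because the two scales share their origin at the ice point; the Kelvin form adds the standard 273.15 shift.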
International Standards
The Newton scale is a historical, non‑SI scale with no standing in modern metrology. The customary symbol is °N. When precise reporting is required, values should be converted through the Celsius/Kelvin definition.
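As a sketch of that conversion route, again assuming the 0 °N and 33 °N reconstruction above (the helper name is hypothetical):

    def newton_to_kelvin(t_newton: float) -> float:
        # Route through the Celsius definition: scale by 100/33,
        # then apply the 273.15 offset between Celsius and kelvin.
        return t_newton * 100.0 / 33.0 + 273.15

    # Example: a historical reading of 12 °N reported in SI units.
    print(f"{newton_to_kelvin(12.0):.2f} K")  # 309.51 K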