
Temperature Measurement

Temperature is one of the most widely used variables across the many segments of the process control industry, besides serving as a basic reference for flow and density measurement and control, among other uses. This article covers the history of temperature measurement, the main features of the technologies in use, and details on the market and its trends.

The advancement of physics and electronics has been remarkable in recent years. Today it is impossible to live without the convenience and benefits they bring to our daily routine. The same applies to industrial processes and control, where we see the technological advancement of electronic components and microprocessors, Fieldbus technology, the Internet, and so on. Moreover, with the search for new developments in renewable energy, new fuels and nanotechnology, there are countless applications for temperature measurement and control.


A little bit of History

Science has long been concerned with temperature measurement. The human body, on the other hand, is a poor thermometer, as it can only tell whether something is cold or hot relative to its own temperature. As time went by, therefore, humans began devising devices to help with the task. Let us look at the details.

One of the first attempts to build a temperature scale occurred around 170 AD. Claudius Galenus of Pergamum (130-201 AD), a Greek physician, suggested that "hot" and "cold" sensations be measured on a scale with four divisions numbered above and below a neutral point. On this thermal scale he attributed "four degrees of heat" to boiling water, "four degrees of cold" to ice, and "neutral" to an equal mixture of the two. Galenus may not have been an excellent doctor, but he was an excellent physiologist. He wrote several medical treatises based on his care of Roman gladiators and his dissection of live animals. He was the first doctor to make a diagnosis by measuring a person's pulse.

The first thermometer was conceived by Galileo Galilei (1564-1642). It consisted of a long glass tube with a bulb, filled with wine, and was first called a thermoscope, a device that indicated temperature through a change of volume. In some versions the air was removed from the bulb before pouring in the liquid, which could be colored water instead of wine, drawing the liquid upwards. As the rest of the tube was heated or cooled, the liquid level oscillated, reflecting the change in air temperature. Later, his colleague Sanctorius Sanctorius engraved a scale on the tube to make the change easier to measure.

As the wine was strongly affected by atmospheric pressure, in 1641 Ferdinand II, Grand Duke of Tuscany (1610-1670), developed the first sealed thermometer. He used alcohol inside it and marked 50 degrees along its shaft, although the device used no fixed point to calibrate the scale. Devices using organic substances then began to be called thermometers.

In 1664 Robert Hooke (1635-1703), of the Royal Society, added red dye to the alcohol. His scale, on which each degree represented a volume increment equal to 1/500 of the thermometer's liquid volume, needed only one fixed point. He selected the freezing point of water, and his scale became the Gresham College standard, used by the Royal Society until 1709. The first intelligible meteorological readings were taken on this scale.

In 1701, Ole Christensen Rømer (1644-1710) invented the first thermometer with two reference points, using red wine as the temperature indicator. Rømer created the scale for his thermometer with the number 60 representing the boiling point of water. He did not know that the boiling point depends on atmospheric pressure, a fact later discovered by Fahrenheit. The lower point is a matter of debate, as part of his notes was destroyed by fire. Some say that 0 represented a mixture of water, ice and ammonium chloride; others, that he used the melting point of ice, which he marked as 7.5 °Rø. Later, for practical reasons, Rømer adopted other reference points, such as frozen water and human blood temperature, which he marked as 22.5 °Rø. Although the creator of the thermometer, Rømer is better known for his work on measuring the speed of light.

Daniel Gabriel Fahrenheit (1686-1736) devoted most of his life to making meteorological instruments. In 1708 Fahrenheit visited Rømer in Copenhagen and learned of his two-calibration-point thermometer. Impressed by the device, he began using it back in Germany. Later, disliking the Rømer degrees for the inconvenience of dividing fractions to measure small temperature intervals, he multiplied the Rømer scale by 4. This put the ice melting point at 30 degrees and human body temperature at 90 degrees. He later changed these values to 32 and 96 degrees to simplify marking the scale with 64 divisions. Fahrenheit also added one more reference point, the equilibrium temperature of a mixture of ice and salt, which he defined as zero on his scale. Unfortunately, the use of three references caused more uncertainty than accuracy. After Fahrenheit's death the human body temperature was considered too inconstant to define a scale point, and the scale was modified to restore its two reference points. All this resulted in the awkward numerical standard in which the freezing point of water is 32 °F and the boiling point 212 °F at standard atmospheric pressure. Fahrenheit also perceived that alcohol lacked precision and repeatability for temperature measurement. In 1714 he adopted mercury, which proved an excellent alternative thanks to its highly linear thermal expansion coefficient and to being insoluble in air. On the other hand, mercury is less sensitive to temperature changes.

In 1731, René Antoine Ferchault de Réaumur (1683-1757) proposed a different scale, calibrated with only one point and with divisions based on the expansion of the thermometric fluid. Réaumur experimented with candidate fluids and selected brandy diluted in a certain amount of water. The dilution he chose expanded by 80 parts in 1000 as the water warmed from the freezing point to the boiling point, 80 being a number easy to divide into parts. Because of this choice, everybody believed that water boiled at 80 degrees on the Réaumur scale. In view of this, the Réaumur scale came to be graduated with two fixed points, the freezing point (0) and the boiling point (80), and was officially adopted across Europe, except in Great Britain and Scandinavia. With the adoption of the centigrade scale by the revolutionary government in France in 1794, however, it gradually lost popularity and finally fell into disuse in the 20th century.

A thermometer with a scale similar to Réaumur's was invented in 1732 by Joseph-Nicolas Delisle (1688-1768), a French astronomer invited to Russia by Czar Peter the Great. That year he made a thermometer using mercury as the fluid. Delisle chose the boiling point of water as the fixed reference of his scale and measured the contraction of the mercury at lower temperatures in hundred-thousandths. His early thermometers had 1400 graduations, appropriate for the St. Petersburg winter, where Delisle lived. In 1738 Josias Weitbrecht (1702-1747) recalibrated the Delisle thermometer with 0 degrees as the boiling point and 150 degrees as the freezing point of water. This thermometer remained in use in Russia for over a century.

Many attempts were made to turn the Delisle scale into an interval of 100 degrees before the Swedish astronomer Anders Celsius (1701-1744) proposed, in 1742, graduating the thermometer with 100 degrees between the boiling point of water and the melting point of snow. Apparently wishing to avoid negative numbers for everyday temperatures, Celsius assigned 100 to the freezing point of water and 0 to the boiling point, with the distance between them divided into 100 one-degree intervals.

In 1744, Celsius's friend Carl Linnaeus (1707-1778) inverted the centigrade scale to satisfy the psychological feeling that the higher number should correspond to hot, not cold. The adoption of the Celsius scale was accelerated in the 19th century by the decision of the revolutionary authorities in France to adopt the decimal system for all measurable quantities. The centigrade scale became popular first in Switzerland and France (where it coexisted with the Réaumur scale) and later in most parts of the world. The Weights and Measures committee created by the French National Assembly decided that the thermometric degree would be 1/100 of the distance between the freezing and boiling points of water, giving origin to the word centigrade. In October 1948, at the 9th General Conference on Weights and Measures, the unit's name was changed to Celsius.

In 1821, Thomas Seebeck (1770-1831) found that when two wires of different metals are joined at both ends and one junction is heated, an electric current circulates through the circuit. The thermocouple was thus discovered, today the most important temperature sensor for industrial applications.

Sir Humphry Davy (1778-1829) was a brilliant scientist, responsible for the use of laughing gas (nitrous oxide) as an anesthetic and for discoveries such as the elements sodium, potassium and boron, arc welding and the miners' safety lamp. In 1821 he also discovered that the resistivity of metals is strongly influenced by temperature.

Building on the idea of metal resistivity, Sir William Siemens (1823-1883) proposed in 1861 the use of platinum resistance thermometers, in which temperature is inferred from the variation of the electric resistance of a platinum wire. Platinum was chosen because it does not oxidize at high temperatures and its resistance varies uniformly over a wide temperature range.

In 1848, William Thomson (1824-1907), later Lord Kelvin, developed a thermodynamic scale based on the expansion coefficient of an ideal gas. The idea came from Jacques Charles's findings on the variation of gas volume with temperature, which had allowed Charles to conclude, from experiments and calculations, that the volume of any gas would reach zero at a temperature of about -273 °C. Kelvin proposed another interpretation: it is not the volume of matter that vanishes at that temperature, but the kinetic energy of its molecules. He then suggested that this temperature be considered the lowest possible and called it absolute zero. A new scale was created based on the centigrade degree. The absolute scale was later renamed after Kelvin and its unit designated degrees Kelvin (symbol ºK). Note that in the SI the unit is now called the kelvin (symbol K, not degrees Kelvin).

In 1859, William John Macquorn Rankine (1820-1872) proposed another temperature scale that placed 0 at absolute zero but used the Fahrenheit degree as its basis. The Rankine degree is the same size as the Fahrenheit degree, so the freezing point of water (32 °F) and the boiling point (212 °F) correspond to 491.67 °Ra and 671.67 °Ra, respectively. This scale was named after Rankine and its unit designated degrees Rankine (symbol ºR).

In 1887, Hugh Longbourne Callendar (1863-1930) improved the platinum resistance thermometer and obtained close agreement between the platinum and the gas thermometer. Today, temperature measurement with platinum resistance and gas thermometers is of great importance in the control of many industrial processes.


Temperature in the present day

The creation of the several scales prompted the need to define curves for the many sensors and their calibration points. This need was addressed in a series of meetings held from 1889 onwards, culminating in the ITS-90 (International Temperature Scale of 1990), but that is another story.

Currently the most used scales are Celsius and Fahrenheit. Kelvin and Rankine are used mostly by scientists and engineers. The others have been abandoned and forgotten.
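The four scales still in use are related by fixed linear transformations, which can be sketched in a few lines of Python:

```python
# Conversions between the four temperature scales still in use.
# Exact relations: T_K = T_C + 273.15, T_F = 1.8*T_C + 32, T_Ra = 1.8*T_K.

def celsius_to_fahrenheit(t_c):
    return 1.8 * t_c + 32.0

def celsius_to_kelvin(t_c):
    return t_c + 273.15

def celsius_to_rankine(t_c):
    return 1.8 * (t_c + 273.15)

# Water freezing and boiling points at standard atmospheric pressure:
# 0 degC   -> 32 degF,  273.15 K, 491.67 degRa
# 100 degC -> 212 degF, 373.15 K, 671.67 degRa
for t_c in (0.0, 100.0):
    print(t_c, celsius_to_fahrenheit(t_c),
          celsius_to_kelvin(t_c), celsius_to_rankine(t_c))
```

Note that the Rankine values for the freezing and boiling points of water match the 491.67 °Ra and 671.67 °Ra quoted above.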

Figure 1 – Comparison of temperature scales



Different standards are used for temperature measurement in different countries and regions: ANSI (US), DIN (Germany), JIS (Japan), BS (UK), etc.

This evolution shows how important temperature transmitters are in process automation and control. Together with a diversity of sensors, they contribute to the continuous improvement of processes and of final product quality. Next we examine this important device in more detail.


Intelligent temperature transmitters and the market


According to an ARC Advisory Group study, the temperature transmitter market amounted to US$ 281 million in 2007, with estimates of around US$ 300 million for 2010 and US$ 386 million for 2012.

A market analysis shows three lines of temperature transmitters, distinguished by their application-versus-cost ratio, as the intelligent transmitter combines its electronics with the sensor technology.


  •  Explosion-proof and weather-proof transmitters

Normally used in critical applications demanding high or average performance, they have a double-compartment housing separating the electronics from the sensor wiring, which gives them robustness, safety and reliability. They offer a local display, sensor matching (Callendar-Van Dusen equation), self-diagnostics, digital communication and local adjustment, and they work with the most varied sensors in single, dual, differential and sensor-backup measurement, among others.

Examples: SMAR TT301, TT302 and TT303.
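Sensor matching via the Callendar-Van Dusen equation means the transmitter stores the coefficients of the individual RTD rather than the generic curve. A minimal sketch, using the standard IEC 60751 coefficients for a generic Pt100 (a matched transmitter would substitute the coefficients measured for the individual sensor):

```python
# Callendar-Van Dusen resistance of a platinum RTD.
# Standard IEC 60751 coefficients for a generic Pt100 (R0 = 100 ohm).
R0 = 100.0          # resistance at 0 degC, ohms
A = 3.9083e-3       # 1/degC
B = -5.775e-7       # 1/degC^2
C = -4.183e-12      # 1/degC^4, used only below 0 degC

def pt100_resistance(t):
    """Resistance in ohms at temperature t in degC."""
    r = R0 * (1.0 + A * t + B * t * t)
    if t < 0.0:
        r += R0 * C * (t - 100.0) * t ** 3
    return r

print(round(pt100_resistance(100.0), 3))   # ~138.5 ohms
print(round(pt100_resistance(-40.0), 3))   # ~84.3 ohms
```

In practice the transmitter solves this equation in the opposite direction, computing temperature from the measured resistance.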


  • Panel transmitters, DIN rail mount

Their main application is monitoring. They allow easy installation, including in enclosed places, flexible connection with sensors and high installation and maintenance flexibility, offering safety and reliability. They also include self-diagnostics, sensor matching (Callendar-Van Dusen), digital communication, and work with the most varied sensors in single, dual, maximum, minimum, average and differential measurement. Example: SMAR TT411.


  •  Transmitters for head mounting (in connection heads)

Their main application is mounting in the sensor connection head, which allows easy installation and connection with sensors and high flexibility in maintenance, providing safety and reliability. They also offer self-diagnostics, sensor matching (Callendar-Van Dusen), digital communication, and work with the most varied sensors in single, dual, maximum, minimum, average and differential measurement. Example: SMAR TT421.

As for protocols, as with any other field equipment, the market favors open protocols such as HART, Foundation Fieldbus and Profibus PA.


Examples of HART (4-20mA) Transmitters

Figure 2 below shows the block diagram of the SMAR HART TT301 temperature transmitter.

Figure 2 – TT301 transmitter block diagram



This transmitter has the following features:

  • Universal input with a wide choice of sensors: standard RTDs, standard thermocouples, ohm, mV and special sensors
  • Single or differential measurement: 2, 3 or 4 wires and sensor backup
  • Insulated
  • Cold junction compensation
  • Line resistance compensation
  • Linearization
  • Basic accuracy of 0.02% 
  • 4-20mA + HART Protocol
  • Re-range
  • PID block and SetPoint Generator
  • Auto-diagnostics
  • Burn-out detection
  • Easy upgrade for Foundation Fieldbus and Profibus PA
  • Display (4 mounting positions)
  • Field mounting
  • Explosion and weather proof
  • Intrinsically safe
  • High EMI and RF immunity
  • Robustness
  • Simple and complete local adjustment
  • Output current compliant to NAMUR-NE43
  • Write protection
  • Double-compartment housing

  • High-performance mathematical co-processor



  • Low maintenance cost
  • Remote self-diagnostics
  • Only one spare part model in inventory: a single transmitter for any application and a wide range of sensor types
  • Low installation cost
  • Remote or easy configuration and easy calibration (re-range)
  • Flexibility: a single transmitter for any kind of application
  • Reduced production costs
  • Reduced process downtime
  • Better production uniformity
  • Reduced process variability: raw material savings and better final product quality due to high accuracy and stability
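One of the features listed above, cold junction compensation, can be illustrated with a simplified sketch. Real transmitters use the standardized thermocouple polynomials of IEC 60584; the constant Seebeck coefficient below is a simplifying assumption for illustration only:

```python
# Cold junction compensation, simplified.
# A thermocouple measures the temperature DIFFERENCE between its hot
# (measuring) junction and its cold (reference) junction. The transmitter
# measures the cold junction temperature with a local sensor and adds it
# back to recover the process temperature.
S_TYPE_K = 41e-6  # approximate Seebeck coefficient of a type K couple,
                  # V/degC (linear assumption; real devices use tables)

def emf(t_hot, t_cold):
    """Thermocouple output voltage for the given junction temperatures."""
    return S_TYPE_K * (t_hot - t_cold)

def compensated_temperature(v_measured, t_cold_junction):
    """Recover the hot junction temperature from the measured voltage."""
    return v_measured / S_TYPE_K + t_cold_junction

v = emf(500.0, 25.0)                     # terminals at 25 degC
print(compensated_temperature(v, 25.0))  # recovers 500.0 degC
```

Without the compensation term the same voltage would read roughly 25 °C low, which is why the feature matters for accuracy.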


Figure 3 shows the block diagram of the TT411 and TT421 HART temperature transmitters.



Figure 3 – TT411 and TT421 HART temperature transmitters block diagram



These transmitters have the same basic characteristics as the TT301. See details and benefits in figures 4 and 5.


Figure 4 – TT411 mounted on DIN rail


Figure 5 – TT421 mounted on head



Temperature measurement novelties

Optical Sensors


Optical sensors are not yet widespread, but the list below shows some milestones in the evolution of fiber optics:

  • 1952: Optical fiber was invented by the Indian physicist Narinder Singh Kapany.
  • 1970: Corning Glass produced a few meters of optical fiber with losses of 20 dB/km.
  • 1973: A fiber optic telephone link was installed in the USA.
  • 1976: Bell Laboratories installed a 1 km telephone link in Atlanta and proved the feasibility of fiber optics for telephone communications.
  • 1978: Several locations worldwide began producing optical fiber with losses below 1.5 dB/km.
  • 1988: The first submarine fiber optic cable was laid in the ocean, inaugurating the information superhighway.
  • 2004: The fiber optics market amounted to 40 billion dollars per year.
  • 2007: Brazilian fiber optics turned 30, and the American fiber optic sensor market amounted to 237 million dollars.
  • 2014: The American fiber optic sensor market was estimated to reach 1.6 billion dollars.


The sensitivity of a fiber optic sensor, i.e. the smallest disturbance it can measure, may depend on:

  • Infinitesimal variations in some parameter characterizing the fiber, when the fiber itself is the sensing element;
  • Changes in the properties of the light used, when the fiber is merely the channel through which the light travels to and from the test location.


Fiber optic sensors are compact and show sensitivity comparable to that of similar conventional sensors. There are many commercial fiber optic sensors for measuring temperature, pressure, rotation, acoustic signals, current, flow, etc.

One current type with many applications is the temperature sensor based on fiber Bragg gratings.

Figure 6 – Bragg gratings



Bragg gratings are simple elements confined to the optical fiber core, with high mass production potential. The possibility of writing Bragg gratings directly into the fiber core by photo-inscription, without harming the fiber's physical integrity or optical properties, made this one of the most fertile fields of scientific research in optoelectronics during the last decade.

The resonant spectral response of Bragg sensors is especially attractive for wavelength multiplexing applications. This characteristic can be conveniently exploited on a single optical fiber containing several sensing elements with different Bragg resonances.


It is then possible to associate each sensor with a given position along the fiber, making up a quasi-distributed strain or temperature sensor. Self-referencing and multiplexing have been the main advantages of Bragg sensors, and the basis for major technological development.

A Bragg grating consists of a periodic modulation of the refractive index of the fiber core.

Maximum reflection from this microstructure occurs when the propagation constant of the mode guided in the core resonates with the spatial modulation of the index, of period Λ, establishing the well-known Bragg condition λB = 2·neff·Λ, where neff is the effective refractive index of the guided mode (see figure 6).

Figure 6 illustrates the action of a Bragg grating on the light propagating in the fiber core.

Bragg gratings, being an intrinsic part of the optical fiber, are sensitive to applied physical quantities, just like their silica matrix itself. The spectral properties of a Bragg grating depend on quantities such as temperature and mechanical stress: applying any quantity that alters the effective index or the grating period induces a shift in the resonant wavelength. The basic operating principle of Bragg sensors is therefore the measurement of the wavelength shifts induced in the resonance condition by variations of temperature, strain, pressure or magnetic field. Given the practical importance of temperature and strain sensing, most demonstrations of Bragg sensors have focused on these applications.

The temperature sensitivity of Bragg sensors results from the thermal expansion of the silica matrix and from the dependence of the refractive index on temperature. The great attraction of using Bragg gratings as sensors is that the information is encoded in the spectrum, making the measurement absolute, highly accurate and easy to multiplex. These sensors are widely used in downhole temperature measurement.
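The two effects just described can be put into numbers. The thermal expansion and thermo-optic coefficients below are typical literature values for silica fiber, assumed here for illustration:

```python
# Temperature-induced Bragg wavelength shift of a fiber Bragg grating.
# Bragg condition: lambda_B = 2 * n_eff * period.
# Relative shift with temperature: d(lambda)/lambda = (alpha + xi) * dT,
# where alpha is the thermal expansion coefficient of silica and xi its
# thermo-optic coefficient (typical literature values, assumed here).
ALPHA = 0.55e-6   # 1/K, thermal expansion of silica
XI = 6.7e-6       # 1/K, thermo-optic coefficient

def bragg_wavelength(n_eff, period_nm):
    """Resonant wavelength in nm for effective index and grating period."""
    return 2.0 * n_eff * period_nm

def wavelength_shift_pm(lambda_nm, delta_t):
    """Wavelength shift in picometers for a temperature change in K."""
    return lambda_nm * (ALPHA + XI) * delta_t * 1e3  # nm -> pm

lam = bragg_wavelength(1.447, 535.6)             # ~1550 nm grating
print(round(lam, 1))
print(round(wavelength_shift_pm(lam, 1.0), 2))   # ~11 pm per kelvin
```

A shift of the order of 10 pm/K is easily resolved by an interrogator, which is why FBGs work well as quasi-distributed temperature sensors.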



This article showed the importance of temperature measurement in automation and process control, presented a little of the history of temperature measurement and of the technological development of temperature transmitters, and covered the three transmitter lines, their applications and benefits. We also saw the fiber Bragg grating temperature sensor, which is expected to bring novelties to measurement in the future.


References

  • SMAR TT301, TT302, TT411, TT421, TT423 Temperature Transmitter Operation Manuals
  • Web: http://www.smar.com.br/ and http://www.smarresearch.com/
  • Control & Instrumentation, Edition 82 – "HART digital protocol", César Cassiolato
  • Control & Instrumentation, Edition 93 – "Temperature Transmitters", César Cassiolato
  • Mecatronics, Edition 48 – "Temperature Transmitters", César Cassiolato
  • http://www.deetc.isel.ipl.pt/jetc05/JETC99/pdf/art_53.pdf
  • Internet research on various temperature measurement sites (all illustrations, brands and products used herein are the sole property of their respective owners, as are any other forms of intellectual property)