Metrology is the science of measurement, embracing all theoretical and experimental aspects, in particular the investigation of uncertainties in measurement results. According to Nobel Prize winner J. Hall, “metrology truly is the mother of science” [1].
Metrology is almost as old as humankind. When people began to exchange goods, they had to agree on commonly accepted standards as a basis for their trade. Indeed, many ancient cultures such as China, India, Egypt, Greece, and the Roman Empire had a highly developed measurement infrastructure. Examples are the Nippur cubit from the third millennium BCE, found in the ruins of a temple in Mesopotamia and now exhibited in the archeology museum in Istanbul, and the famous Egyptian royal cubit, the base length unit for the construction of the pyramids. However, the culture of metrology faded during the Middle Ages, when many different standards were in use. In Germany, for instance, at the end of the eighteenth century, 50 different standards for mass and more than 30 standards for length were used in different parts of the country. This, of course, was a barrier to trade and led to abuse and fraud. It was then, during the French Revolution, that the French Académie des Sciences took the initiative to define standards independent of measures taken from the limbs of royal representatives. Instead, the intent was to base the standards on stable quantities of nature, available to everyone at all times. Consequently, in 1799, the standard for length was defined as the ten-millionth part of the quadrant of the earth's meridian, and a platinum bar was fabricated to represent this standard (Mètre des Archives). Subsequently, the kilogram, the standard of mass, was defined as the mass of one cubic decimeter of pure water at the temperature of its highest density, 3.98 °C. This can be seen as the birth of the metric system, which, however, at that time was not generally accepted throughout Europe or even in France. It was only with the signing of the Metre Convention in 1875 by 17 signatory countries that the metric system based on the meter and the kilogram received wider acceptance [2].
At the time of this writing, the Metre Convention has been signed by 60 states, with another 42 states associated with the General Conference on Weights and Measures (Conférence Générale des Poids et Mesures, CGPM) (as of November 2018). At the General Conferences, following the first one in 1889, the system of units was continuously extended. Finally, at the 11th CGPM in 1960, the previous SI (Système International d'Unités) (see Section 2.2) with the kilogram, second, meter, ampere, kelvin, and candela as base units was defined. The mole, the unit of amount of substance, was added at the 14th CGPM in 1971. Within the SI, the definition of some units has been adapted to progress in science and technology; for example, the meter was defined in 1960 based on the wavelength of a specific emission line of the noble gas krypton. In 1983, this definition was replaced by the distance light travels in a given time, assigning a fixed value to the speed of light in vacuum. Similarly, the second, originally defined as the ephemeris second, was redefined by the 13th CGPM via a hyperfine transition in the Cs isotope 133. Thus, even in the previous SI, the meter and the second were defined by constants of nature. In the present revised SI, as accepted by the 26th CGPM in 2018, all units are based on constants of nature [3–7]. In fact, in this context, the physics of single quanta plays a decisive role, as will be outlined in this book.
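The revised SI fixes every unit by assigning exact numerical values to seven defining constants. The short sketch below lists these exact values (variable names are ours) and, as a simple illustration, derives the microwave wavelength of the caesium clock transition from two of them:

```python
# Exact values of the seven SI defining constants (26th CGPM, 2018).
DELTA_NU_CS = 9_192_631_770  # Hz, 133Cs hyperfine transition (defines the second)
C = 299_792_458              # m/s, speed of light in vacuum (defines the meter)
H = 6.626_070_15e-34         # J s, Planck constant (defines the kilogram)
E = 1.602_176_634e-19        # C, elementary charge (defines the ampere)
K_B = 1.380_649e-23          # J/K, Boltzmann constant (defines the kelvin)
N_A = 6.022_140_76e23        # 1/mol, Avogadro constant (defines the mole)
K_CD = 683                   # lm/W, luminous efficacy at 540 THz (defines the candela)

# One second is 9 192 631 770 periods of the Cs transition; one meter is the
# distance light travels in 1/299 792 458 s. The wavelength of the Cs clock
# transition then follows directly:
wavelength_cs = C / DELTA_NU_CS
print(f"Cs clock transition wavelength: {wavelength_cs * 100:.2f} cm")  # ~3.26 cm
```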
We begin by introducing some basic principles of metrology in Chapter 2. Section 2.1 recalls some basic facts related to measurement and discusses the limits of measurement uncertainty. The present SI is then presented in Section 2.2. The previous definitions of the respective units are also given for comparison.
Chapter 3 treats the realization of the present definition of the second by atomic clocks based on the hyperfine transition in the ground state of 133Cs, realized with thermal atomic beams and with laser-cooled atoms, respectively.
Chapter 4 is devoted to superconductivity and its utilization in metrology. Because of its prominent role in electrical metrology, we introduce superconductivity, the Josephson effect, magnetic flux quantization, and quantum interference. By means of the Josephson effect, the volt (the unit of electrical potential difference) is traced back to the Planck constant and the elementary charge, as realized in today's most precise voltage standards. We further discuss how magnetic flux quantization and quantum interference allow the realization of quantum magnetometers (superconducting quantum interference devices, SQUIDs) with unprecedented resolution and precision.
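The Josephson relation behind these voltage standards can be made concrete with a short numerical sketch. A junction irradiated at frequency f develops quantized voltage steps V = n·f/K_J with K_J = 2e/h; the 70 GHz drive frequency below is an assumed, merely representative value:

```python
H = 6.626_070_15e-34   # J s, Planck constant (exact in the revised SI)
E = 1.602_176_634e-19  # C, elementary charge (exact in the revised SI)

# Josephson constant K_J = 2e/h links voltage to frequency and the
# defining constants h and e.
K_J = 2 * E / H
print(f"K_J = {K_J / 1e9:.4f} GHz/V")  # ~483597.8484 GHz/V

# Assumed example: a single junction on the n = 1 step, driven at 70 GHz.
f = 70e9  # Hz
v_step = 1 * f / K_J
print(f"single-junction voltage step: {v_step * 1e6:.2f} uV")  # ~144.75 uV
```

Because a single junction yields only some hundred microvolts, practical standards series-connect tens of thousands of junctions to reach the 1 V or 10 V level.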
The underlying solid‐state physics and the metrological application of the quantum Hall effect are discussed in Chapter 5. In the present SI, the unit of electric resistance, the ohm, is traced back to the Planck constant and the elementary charge by the quantum Hall effect.
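A minimal sketch of this traceability: the Hall resistance of a two-dimensional electron gas shows plateaus at R(i) = R_K/i with the von Klitzing constant R_K = h/e², which is exact in the revised SI. Our choice of the i = 2 plateau below is illustrative, though it is the one commonly used in practice:

```python
H = 6.626_070_15e-34   # J s, Planck constant (exact in the revised SI)
E = 1.602_176_634e-19  # C, elementary charge (exact in the revised SI)

# von Klitzing constant: Hall resistance plateaus occur at R(i) = R_K / i
R_K = H / E**2
print(f"R_K = {R_K:.3f} ohm")               # ~25812.807 ohm
print(f"i = 2 plateau: {R_K / 2:.4f} ohm")  # ~12906.4035 ohm
```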
In Chapter 6, we describe the physics of single‐electron transport devices, which allow the realization of the unit of electric current, the ampere, according to its present definition based on the elementary charge and frequency. We further discuss the so‐called metrological triangle experiment, aimed at proving the consistency of the present realizations of the volt, the ampere, and the ohm.
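The quantization underlying this realization of the ampere fits in a few lines: a single-electron pump transfers exactly one electron per drive cycle, so a drive frequency f yields the current I = e·f. The 1 GHz frequency below is an assumed, representative value:

```python
E = 1.602_176_634e-19  # C, elementary charge (exact in the revised SI)

# One electron per cycle: I = e * f for a pump driven at frequency f.
f = 1e9  # Hz, assumed typical pump frequency
current = E * f
print(f"I = {current * 1e12:.1f} pA")  # ~160.2 pA
```

The smallness of such currents, on the order of 100 pA, is one reason why an independent consistency check of the volt, ohm, and ampere realizations, the metrological triangle, is of interest.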
Chapter 7 is then devoted to the present definition of the kilogram and the mole based on, respectively, the Planck constant and the Avogadro constant. We present the Kibble balance and the silicon single‐crystal experiment, which have been seminal for the precise determination of the Planck constant and are now primary realizations of the kilogram replacing the International Prototype of the Kilogram (IPK).
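The silicon route can be sketched numerically: a nearly perfect single-crystal sphere of enriched ²⁸Si contains N = 8V/a³ atoms (8 atoms per cubic unit cell), which links a macroscopic mass to the Avogadro and, via the molar Planck constant, the Planck constant. The rounded numbers below are illustrative only, not the measured values of the actual experiment, which determines them with relative uncertainties near 10⁻⁸:

```python
# Rounded, illustrative parameters of a ~1 kg enriched-28Si sphere (XRCD method).
a = 5.431e-10       # m, silicon lattice parameter (rounded)
m_sphere = 1.0      # kg, sphere mass (illustrative)
rho = 2329.0        # kg/m^3, silicon density (rounded)
M_molar = 0.027977  # kg/mol, molar mass of 28Si (rounded)

volume = m_sphere / rho       # in practice obtained from diameter interferometry
n_atoms = 8 * volume / a**3   # 8 atoms per cubic unit cell of the diamond lattice
N_A_est = n_atoms * M_molar / m_sphere
print(f"N_A ~ {N_A_est:.3e} 1/mol")  # close to 6.022e23 1/mol with these rounded inputs
```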
Various experiments that have contributed to the precise determination of the value of the Boltzmann constant and that are potential realizations of the unit of thermodynamic temperature, kelvin, are described in Chapter 8.
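One of these experiments, acoustic gas thermometry, exploits the fact that the speed of sound in a dilute monatomic gas is u = sqrt(γ·k_B·T/m) with γ = 5/3. A rough sketch with argon at the triple point of water; the gas parameters are rounded and for illustration only:

```python
import math

K_B = 1.380_649e-23       # J/K, Boltzmann constant (exact in the revised SI)
U_ATOMIC = 1.660_539e-27  # kg, atomic mass constant (rounded)

gamma = 5 / 3             # heat-capacity ratio of a monatomic ideal gas
m_ar = 39.948 * U_ATOMIC  # kg, mass of an argon atom (rounded molar mass)
T = 273.16                # K, triple point of water

# Speed of sound in an ideal monatomic gas: u = sqrt(gamma * k_B * T / m)
u = math.sqrt(gamma * K_B * T / m_ar)
print(f"speed of sound in argon at {T} K: {u:.1f} m/s")  # ~308 m/s
```

In the actual experiments the logic ran the other way: measuring u in a resonator of precisely known dimensions at a known temperature yielded k_B with the low uncertainty needed for the redefinition.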
In Chapter 9, we take an even further look into the future of the SI when we discuss optical clocks, which may in due time cause a change of the defining constant for the unit of time, the second, resulting in an improved realization. Further, we discuss the prospect of single‐photon emitters for a possible new definition of radiometric and photometric quantities, for example, for (spectral) irradiance and luminous intensity.
In an outlook in Chapter 10, we finally discuss a few examples of how the present definitions of the SI pave the way to bring quantum metrology and quantum technology to the “workbench,” thereby considerably improving the quality of measurements for industry, science, and society.