Second

The second (SI symbol: s), sometimes abbreviated sec., is a unit of time and the base unit of time in the International System of Units (SI). It may be measured using a clock.

SI prefixes are frequently combined with the word second to denote subdivisions of the second, e.g., the millisecond (one thousandth of a second), the microsecond (one millionth of a second), and the nanosecond (one billionth of a second). Though SI prefixes may also be used to form multiples of the second such as kilosecond (one thousand seconds), such units are rarely used in practice. More commonly encountered, non-SI units of time such as the minute (min) and hour (h) increase by multiples of 60 and 24 (rather than by powers of ten as in SI).

The second was also the base unit of time in the centimetre-gram-second, metre-kilogram-second, metre-tonne-second, and foot-pound-second systems of units.

International second
Under the International System of Units, the second is currently defined as "the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom."

This definition refers to a cesium atom at rest at a temperature of 0 K (absolute zero), and with appropriate corrections for gravitational time dilation. The ground state is defined at zero magnetic field. The second thus defined is consistent with the ephemeris second, which was based on astronomical measurements. (See History below.) The international standard symbol for a second is s (see ISO 31-1).

The realization of the standard second is described briefly in a special publication from the National Institute of Standards and Technology (NIST), and in detail by the National Research Council of Canada (NRC).

In practice, the transition is measured in the quantum vacuum where vacuum fluctuations can lead to shifts in atomic energy levels in vacuum relative to their values in free space, for example, to a Lamb shift. Consequently, the transition in quantum vacuum may not have the same frequency as in free space. Free space (which, like absolute zero, is a theoretical reference state that cannot be attained in practice, with exact values for its electromagnetic properties: c0, μ0, ε0, and Z0) is the reference state for the SI units for the metre and ampere.

Equivalence to other units of time
1 international second is equal to:
 * 1/60 minute (but see also leap second)
 * 1/3,600 hour
 * 1/86,400 day (IAU system of units)
 * 1/31,557,600 Julian year (IAU system of units)
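These equivalences can be checked with a short sketch (my own illustration, not from the source; the constant names are my own):

```python
# Express one SI second as fractions of the larger non-SI units listed above.
from fractions import Fraction

MINUTE = 60                 # seconds
HOUR = 60 * MINUTE          # 3,600 s
DAY = 24 * HOUR             # 86,400 s (IAU system of units)
JULIAN_YEAR = Fraction(36525, 100) * DAY  # 365.25 days = 31,557,600 s

assert HOUR == 3_600
assert DAY == 86_400
assert JULIAN_YEAR == 31_557_600

# One second as a fraction of a day (prints 1/86400). Note that a UTC
# minute may occasionally contain 61 seconds because of leap seconds.
print(Fraction(1, DAY))
```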

Before mechanical clocks
The Egyptians had subdivided daytime and nighttime into twelve hours each since at least 2000 BC, so their hours varied seasonally. The Hellenistic astronomers Hipparchus (c. 150 BC) and Ptolemy (c. AD 150) subdivided the day sexagesimally and also used a mean hour ($1/24$ day), but did not use distinctly named smaller units of time. Instead they used simple fractions of an hour.

The day was subdivided sexagesimally, that is by $1/60$, by $1/60$ of that, by $1/60$ of that, etc., to at least six places after the sexagesimal point (a precision of less than 2 microseconds) by the Babylonians after 300 BC, but they did not sexagesimally subdivide smaller units of time. For example, six fractional sexagesimal places of a day were used in their specification of the length of the year, although they were unable to measure such a small fraction of a day in real time. As another example, they specified that the mean synodic month was 29;31,50,8,20 days (four fractional sexagesimal positions), a value repeated sexagesimally by Hipparchus and Ptolemy, and it is currently the mean synodic month of the Hebrew calendar, though restated as 29 days 12 hours 793 halakim (where 1 hour = 1080 halakim). The Babylonians did not use the hour, but did use a double-hour lasting 120 modern minutes, a time-degree lasting four modern minutes, and a barleycorn lasting 3$1/3$ modern seconds (the helek of the modern Hebrew calendar).
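The equivalence between the Babylonian sexagesimal value and the Hebrew-calendar restatement can be verified exactly (a sketch of my own; the helper function is hypothetical):

```python
# Check that 29;31,50,8,20 days (base-60 notation) equals
# 29 days 12 hours 793 halakim, where 1 hour = 1080 halakim.
from fractions import Fraction

def sexagesimal(whole, *places):
    """Convert 'whole;p1,p2,...' base-60 notation to an exact fraction."""
    value = Fraction(whole)
    for i, p in enumerate(places, start=1):
        value += Fraction(p, 60 ** i)
    return value

babylonian = sexagesimal(29, 31, 50, 8, 20)                # days
hebrew = 29 + Fraction(12, 24) + Fraction(793, 1080 * 24)  # days

assert babylonian == hebrew   # the two expressions are exactly equal
print(float(babylonian))      # ~29.530594 days
```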

In 1000, the Persian scholar al-Biruni gave the times of the new moons of specific weeks as a number of days, hours, minutes, seconds, thirds, and fourths after noon Sunday. In 1267, the medieval scientist Roger Bacon stated the times of full moons as a number of hours, minutes, seconds, thirds, and fourths (horae, minuta, secunda, tertia, and quarta) after noon on specified calendar dates. Although a third for $1/60$ of a second remains in some languages, for example Polish (tercja), Turkish (salise) and Arabic (ثالثة), the modern second is subdivided decimally.

Seconds measured by mechanical clocks
The first clock that could show time in seconds was created by Taqi al-Din at his Istanbul observatory between 1577 and 1580. He called it the "observational clock" in his In the Nabik Tree of the Extremity of Thoughts, where he described it as "a mechanical clock with three dials which show the hours, the minutes, and the seconds." He used it as an astronomical clock, particularly for measuring the right ascension of the stars. The first mechanical clock displaying seconds in Western Europe was constructed in Switzerland at the beginning of the 17th century.

The second first became accurately measurable with the development of pendulum clocks keeping mean time (as opposed to the apparent time displayed by sundials), specifically in 1670 when William Clement added a seconds pendulum to the original pendulum clock of Christiaan Huygens. The seconds pendulum has a period of two seconds, one second for a swing forward and one second for a swing back, enabling the longcase clock incorporating it to tick seconds. From this time, a second hand that rotated once per minute in a small subdial began to be added to the clock faces of precision clocks.

Modern measurements
In 1956 the second was defined in terms of the period of revolution of the Earth around the Sun for a particular epoch, because by then it had become recognized that the Earth's rotation on its own axis was not sufficiently uniform as a standard of time. The Earth's motion was described in Newcomb's Tables of the Sun (1895), which provide a formula estimating the motion of the Sun relative to the epoch 1900 based on astronomical observations made between 1750 and 1892. The second thus defined is

"the fraction 1/31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time."

This definition was ratified by the Eleventh General Conference on Weights and Measures in 1960. The tropical year in the definition was not measured, but calculated from a formula describing a mean tropical year which decreased linearly over time, hence the curious reference to a specific instantaneous tropical year. This definition of the second was in conformity with the ephemeris time scale adopted by the IAU in 1952, defined as the measure of time that brings the observed positions of the celestial bodies into accord with the Newtonian dynamical theories of their motion (those accepted for use during most of the twentieth century being Newcomb's Tables of the Sun, used from 1900 through 1983, and Brown's Tables of the Moon, used from 1923 through 1983).
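The fraction in the 1960 definition can be checked against the implied length of the 1900.0 mean tropical year (a sketch of my own; the day length of 86,400 s and the year value below are assumptions for this check, not part of the quoted definition):

```python
# The 1900.0 mean tropical year of about 365.24219878 days, expressed in
# seconds of 86,400 s days, reproduces the definition's denominator.
tropical_year_1900_days = 365.24219878125   # assumed mean tropical year, 1900.0
seconds_per_year = tropical_year_1900_days * 86_400

# Inverting the definition's fraction 1/31,556,925.9747 gives the year
# length in ephemeris seconds.
assert abs(seconds_per_year - 31_556_925.9747) < 1e-4
print(seconds_per_year)
```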

With the development of the atomic clock, it was decided to use atomic clocks as the basis of the definition of the second, rather than the revolution of the Earth around the Sun.

Following several years of work, Louis Essen from the National Physical Laboratory (Teddington, England) and William Markowitz from the United States Naval Observatory (USNO) determined the relationship between the hyperfine transition frequency of the cesium atom and the ephemeris second. Using a common-view measurement method based on the received signals from radio station WWV, they determined the orbital motion of the Moon about the Earth, from which the apparent motion of the Sun could be inferred, in terms of time as measured by an atomic clock. They found that the second of ephemeris time (ET) had the duration of 9,192,631,770 ± 20 cycles of the chosen cesium frequency. As a result, in 1967 the Thirteenth General Conference on Weights and Measures defined the second of atomic time in the International System of Units as

"the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom."

This SI second, referred to atomic time, was later verified to be in agreement, within 1 part in 10¹⁰, with the second of ephemeris time as determined from lunar observations. (Nevertheless, this SI second was already, when adopted, a little shorter than the then-current value of the second of mean solar time.)
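The arithmetic behind the quoted numbers can be sketched as follows (my own illustration):

```python
# Basic arithmetic implied by the 1967 definition and by the measured
# value 9,192,631,770 +/- 20 cycles per ephemeris second.
CS_HZ = 9_192_631_770        # caesium hyperfine transition, cycles per second

period = 1 / CS_HZ           # duration of one cycle, in seconds
rel_unc = 20 / CS_HZ         # relative uncertainty of the Essen and
                             # Markowitz measurement

print(f"one cycle lasts about {period:.3e} s")   # prints 1.088e-10 s
print(f"relative uncertainty ~ {rel_unc:.1e}")   # prints 2.2e-09
```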

During the 1970s it was realized that gravitational time dilation caused the second produced by each atomic clock to differ depending on its altitude. A uniform second was produced by correcting the output of each atomic clock to mean sea level (the rotating geoid), lengthening the second by about 1 part in 10¹⁰. This correction was applied at the beginning of 1977 and formalized in 1980. In relativistic terms, the SI second is defined as the proper time on the rotating geoid.
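The size of the altitude effect can be estimated from the standard weak-field time dilation formula (my own illustration, not quoted from this text): a clock at height h above the geoid runs fast by roughly gh/c² in fractional terms.

```python
# Fractional rate shift of a clock at height h above the geoid,
# using the weak-field approximation g*h/c**2.
g = 9.80665        # standard gravity, m/s^2
c = 299_792_458    # speed of light, m/s

def fractional_rate_shift(height_m):
    return g * height_m / c**2

# A clock 1 km above sea level gains about 1e-13 in rate,
# roughly 9 ns per day.
shift = fractional_rate_shift(1_000)
print(f"{shift:.2e}")   # prints 1.09e-13
```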

The definition of the second was later refined at the 1997 meeting of the BIPM to include the statement

"This definition refers to a cesium atom at rest at a temperature of 0 K."

The revised definition would seem to imply that the ideal atomic clock would contain a single cesium atom at rest emitting a single frequency. In practice, however, the definition means that high-precision realizations of the second should compensate for the effects of the ambient temperature (black-body radiation) within which atomic clocks operate, and extrapolate accordingly to the value of the second at a temperature of absolute zero.
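As a rough sketch of this extrapolation (my own illustration: the T⁴ scaling of the black-body shift is standard physics, but the 300 K coefficient below is an approximate literature value assumed here, not given in the text):

```python
# Extrapolating a caesium clock's output to absolute zero by removing the
# black-body radiation shift, which scales approximately as T^4.
BBR_COEFF_300K = -1.7e-14    # assumed approximate fractional shift at 300 K

def bbr_shift(temp_k):
    return BBR_COEFF_300K * (temp_k / 300.0) ** 4

CS_HZ = 9_192_631_770
measured = CS_HZ * (1 + bbr_shift(300.0))       # frequency seen at 300 K
corrected = measured / (1 + bbr_shift(300.0))   # extrapolated to 0 K

assert abs(corrected - CS_HZ) < 1e-4   # the correction recovers the defined value
```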

Today, the atomic clock operating in the microwave region is challenged by atomic clocks operating in the optical region. To quote Ludlow et al. “In recent years, optical atomic clocks have become increasingly competitive in performance with their microwave counterparts. The overall accuracy of single trapped ion based optical standards closely approaches that of the state-of-the-art cesium fountain standards. Large ensembles of ultracold alkaline earth atoms have provided impressive clock stability for short averaging times, surpassing that of single-ion based systems. So far, interrogation of neutral atom based optical standards has been carried out primarily in free space, unavoidably including atomic motional effects that typically limit the overall system accuracy. An alternative approach is to explore the ultranarrow optical transitions of atoms held in an optical lattice. The atoms are tightly localized so that Doppler and photon-recoil related effects on the transition frequency are eliminated.”

The NRC attaches a "relative uncertainty" of 2.5 × 10⁻¹¹ (limited by day-to-day and device-to-device reproducibility) to their atomic clock based upon the ¹²⁷I₂ molecule, and is advocating use of an ⁸⁸Sr ion trap instead (relative uncertainty due to linewidth of 2.2 × 10⁻¹²). See magneto-optical trap. Such uncertainties rival that of the NIST-F1 cesium atomic clock in the microwave region, estimated as a few parts in 10¹⁶ averaged over a day.
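A relative uncertainty can be translated into an accumulated timing error over one day (a sketch of my own, not from the source):

```python
# Convert a clock's relative (fractional) uncertainty into the timing
# error it could accumulate over one 86,400 s day.
def drift_per_day(rel_uncertainty):
    return rel_uncertainty * 86_400   # seconds per day

# A few parts in 10^16, as quoted for the caesium fountain standard,
# corresponds to tens of picoseconds per day.
print(f"{drift_per_day(3e-16):.1e} s/day")   # prints 2.6e-11 s/day
```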

SI multiples
SI prefixes are commonly used to measure time less than a second, but rarely for multiples of a second. Instead, the non-SI units minutes, hours, days, Julian years, Julian centuries, and Julian millennia are used.