Coordinated Universal Time (UTC)
Coordinated Universal Time is what’s called a “time standard”: a specification of the rate at which time passes, of particular points in time, or both. Many time standards have been used throughout history; today, UTC is the main standard by which time is measured around the world.
In this case, we’re discussing a constructed time standard. This modern solution does not observe Daylight Saving Time, and it stays within about one second of mean solar time at zero degrees longitude. Join us as we examine this method of telling time around the world and how it came to be.
An Overview of Time Standards
When we consider a time standard, a lot of things come to mind: clocks, calendars, astronomical observations. All of these tools have been used to establish time standards in various parts of the world throughout history. Time standards were originally based on the rotation of the Earth. The problem with this method was the assumption, which held until the end of the 19th century, that Earth’s rate of rotation was constant.
After examining the astronomical evidence, astronomers discovered that this rate wasn’t constant. A major step forward was the invention of the caesium atomic clock in 1955, which replaced many of the astronomical time standards. This brings us to three major categories of time standards:
1. Based on Earth’s Rotation
Various types of time standards fall into this category, but perhaps the most recognized one is Greenwich Mean Time (GMT). This time standard was measured based on calculations from the Royal Greenwich Observatory. The International Meridian Conference in 1884 decided that the observatory’s location would be the Prime Meridian. This method was replaced by UTC.
2. Based on Planetary Motion
The most common standard utilizing this method is Ephemeris Time (ET), which is based on the time it takes for the Earth to orbit the Sun, with the second defined as a fraction of the tropical year. It served as the standard of the International Astronomical Union from 1952 to 1976 and is known as a dynamical time scale. Several other variations in this category include Terrestrial Time, referenced to Earth’s surface; Geocentric Coordinate Time, referenced to Earth’s center; and Barycentric Coordinate Time, referenced to the center of mass of the solar system, known as the barycenter.
3. Constructed Time Standards
This final category is where UTC falls. These are standards that have been calculated and constructed based on various agreed choices. This includes International Atomic Time, which is the standard by which all other time standards are calculated, including UTC. It is calculated from the input of various atomic clocks around the world.
The Uses and Functions of Coordinated Universal Time
Since the world is divided into time zones based on when the day and night cycle occurs, each zone represents a positive or negative offset from UTC. The farthest western time zone is UTC-12, for example, and the farthest eastern zone is theoretically UTC+12. In practice, though, things aren’t quite that tidy.
Kiribati, for example, decided to use UTC+14 to keep its date in alignment with Australia rather than with the Americas, so its day now starts a few hours before Australia’s rather than most of a day behind it. This universal time standard is used by the internet and the World Wide Web as well: the Network Time Protocol, the method by which computer clocks are synchronized around the world, is built on UTC.
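As a sketch of how offsets work, Python’s standard `datetime` module can render a single UTC instant in two fixed-offset zones (the date chosen here is arbitrary, and these are plain fixed offsets, not full IANA time zones):

```python
from datetime import datetime, timedelta, timezone

# One instant, expressed in UTC and in two extreme fixed offsets.
t = datetime(2014, 6, 30, 12, 0, tzinfo=timezone.utc)

line_islands = timezone(timedelta(hours=14))   # UTC+14 (Kiribati's Line Islands)
baker_island = timezone(timedelta(hours=-12))  # UTC-12 (farthest western zone)

print(t.astimezone(line_islands).isoformat())  # 2014-07-01T02:00:00+14:00
print(t.astimezone(baker_island).isoformat())  # 2014-06-30T00:00:00-12:00
```

Note that the two renderings differ by a full 26 hours of wall-clock time, and even fall on different calendar dates, yet they name the exact same instant.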
Computers, servers, and online companies prefer UTC over GMT because the former is more precise. UTC is used in aviation as well, both for air traffic control and for flight planning. Weather forecasts use UTC worldwide to avoid the confusion that arises from time zones and DST. Even the International Space Station uses it as its standard.
UTC divides time into days, hours, minutes, and seconds. Days are identified primarily using the Gregorian calendar, although Julian day numbering is sometimes used as well. Each day contains 24 hours and each hour 60 minutes. Leap seconds complicate things at the smallest scale: in UTC, the second and all smaller units have constant duration, while the minute and all larger units have variable duration.
Almost every UTC day contains exactly 86,400 seconds, with 60 seconds per minute. But the mean solar day is slightly longer than this, so the last minute of certain days is extended to 61 seconds; the extra second is known as a “leap second.” There is also a provision to shorten the last minute to 59 seconds should the Earth rotate faster, but this has never been used.
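The day-length rule above can be sketched in a few lines (the function name is ours, not any standard API):

```python
# Length of a UTC day in seconds, given its leap-second adjustment:
#   leap = +1 -> the final minute runs 23:59:00 .. 23:59:60 (61 seconds)
#   leap = -1 -> a 59-second final minute (provided for, never yet used)
def utc_day_seconds(leap: int = 0) -> int:
    if leap not in (-1, 0, 1):
        raise ValueError("a UTC day can only gain or lose one second")
    return 24 * 60 * 60 + leap

print(utc_day_seconds())    # 86400 -- almost every day
print(utc_day_seconds(+1))  # 86401 -- a day ending in a leap second
```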
Since 1972, UTC has been calculated by subtracting the accumulated leap seconds from International Atomic Time (TAI). UTC itself does not change with the seasons; countries that observe Daylight Saving Time simply shift their local offset from UTC.
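That relationship is simple arithmetic. As a sketch (the function is ours; the 10-second figure is the well-known offset TAI already had over UTC when the leap-second scheme began in 1972):

```python
# TAI - UTC = initial 1972 offset + leap seconds inserted since.
INITIAL_OFFSET_1972 = 10  # seconds by which TAI led UTC on 1 January 1972

def tai_minus_utc(leap_seconds_since_1972: int) -> int:
    return INITIAL_OFFSET_1972 + leap_seconds_since_1972

print(tai_minus_utc(25))  # 35 -- the offset after 25 leap seconds
```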
The History of UTC
It all began with the 1884 International Meridian Conference in Washington, D.C., where it was decided that the world’s reference for mean solar time would be the Royal Observatory in Greenwich, England. This aligned the new standard with Greenwich Mean Time (GMT), which had already been in use since 1847.
The next quantum leap came in 1955 when the first caesium atomic clock was created. To understand how profoundly this affected the way we tell time, let’s explore how such a device functions:
Interlude: What’s An Atomic Clock?
An atomic clock is a device that uses an electronic transition frequency (in the microwave, optical, or ultraviolet region of the electromagnetic spectrum) to measure time more accurately than any other device in existence. It works by measuring the signal that electrons give off when they change energy levels.
To measure such minuscule changes, the atoms are cooled until they are almost at absolute zero, a temperature at which they move much more slowly. Various atomic clocks around the world work in tandem to maintain the accuracy of UTC. As for the device’s invention, the first accurate version was built by Louis Essen in 1955, using a standard based on the caesium-133 atom; Ephemeris Time was used to calibrate it.
There are four factors that led to the advancement and ultimate standard that these clocks provide:
● Laser cooling and trapping of atoms
● High-finesse Fabry-Perot cavities that provide narrow laser line widths
● Precise laser spectroscopy
● A convenient method of counting optical frequencies using optical combs
Now back to our regularly scheduled article. Once the caesium atomic clock arrived, it was clear that astronomical methods of telling time were on their way out. In 1956, the National Bureau of Standards and the U.S. Naval Observatory started using atomic frequency time scales.
These scales were used in 1959 to generate the WWV shortwave time signals, named after the radio station that broadcast them. The Royal Greenwich Observatory and the UK National Physical Laboratory coordinated their radio broadcasts to account for time steps and frequency changes, and the time scale they produced came to be referred to as “Coordinated Universal Time.”
New information came to light in 1958, when data was presented showing the link between the caesium transition and the ephemeris second, a second measured from the laws of motion governing the planets and moons of our solar system. Ephemeris seconds are of constant length, as are atomic seconds, and this data helped establish the length of the atomic second.
UTC was first used in 1961, and in 1967 the second was redefined in terms of measurements from a caesium atomic clock. The first leap second was implemented on June 30, 1972, and since then leap seconds have come every 19 months on average. As of June 2014, a total of 25 leap seconds had been inserted, putting UTC 35 seconds behind TAI.
The Future of UTC as a Time Standard
Earth’s rotation is gradually slowing, which means positive leap seconds will be needed more and more often as time goes on. This is a gradual change, of course, usually no more than 1.7 additional milliseconds per century. By the end of the 21st century, the length of day (LOD) is expected to be about 86,400.004 seconds, which means a leap second will be needed roughly every 250 days.
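The 250-day figure follows directly from the projected excess; here is the arithmetic as a sketch, assuming the 86,400.004-second mean day quoted above:

```python
# The mean solar day exceeds 86,400 SI seconds by about 4 ms;
# that excess accumulates into one full leap second every 1/0.004 days.
excess_per_day = 86_400.004 - 86_400   # seconds of drift per day
days_per_leap_second = 1.0 / excess_per_day

print(round(days_per_leap_second))  # 250
```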
After several centuries, the number of leap seconds will become hard to manage. The 22nd century is expected to require two leap seconds per year, and the 25th century four. In about two thousand years there will be a leap second every month, and UTC would still fall behind. Looking even further ahead, UTC breaks down entirely in a few tens of thousands of years, when a leap second would be needed every single day.
There is a proposal to abolish the leap second, or at least to allow more freedom in when leap seconds are inserted into the year. Decisions are being weighed as to whether civil time could be regulated purely by atomic time. If leap seconds are abandoned, the calendar will no longer be tied to the Earth’s rotation.
This means that days won’t be defined by sunrise or sunset; instead they will be regulated entirely by the measurements of caesium atoms. Essentially, these are the options:
● If leap seconds remain: a calendar day is one approximate turn of the Earth on its axis
● If leap seconds are abolished: a calendar day is exactly 794,243,384,928,000 hyperfine oscillations of caesium-133
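That large number is just the SI definition of the second scaled up to a full day, as a quick check shows:

```python
# The SI second: 9,192,631,770 cycles of caesium-133's hyperfine transition.
CAESIUM_HZ = 9_192_631_770
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

print(CAESIUM_HZ * SECONDS_PER_DAY)  # 794243384928000
```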
Personally, I think the first option is less complicated. Well, there you have it: an examination of how we established a universal time standard. Things may get more complicated as time goes on, but for now everything is working just fine. For more interesting topics and articles, be sure to check out our blog.