Readers who are confident about these basics won't find anything new here except, perhaps, for the precise mathematical definition of redshift stated at the very end.
Fundamental to almost any science is the process of measurement. In astronomy, perhaps the most important quantity that can be measured through an optical telescope is brightness. But observed brightness by itself is not very useful, because what matters for an object such as a distant star or galaxy is not the observed brightness but the intrinsic brightness – the amount of light the object actually emits, rather than the fraction of it that happens to reach us.
Since observed brightness falls off as the square of the distance, we can compute the intrinsic brightness from the observed brightness if we know the distance. Unfortunately, for most astronomical objects outside the solar system, there's no simple way to determine the distance. We can't just do it with a yardstick. There are a few indirect techniques for measuring astronomical distance, but most of these fail for things that are really distant, like most galaxies.
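The inverse-square relationship just described can be sketched in a few lines of Python. The function name and the units are made up for illustration; the point is simply that flux spreads over a sphere whose area grows as the square of the distance.

```python
import math

def intrinsic_brightness(observed_flux, distance):
    """Estimate total emitted power (luminosity) from observed flux.

    observed_flux: power received per unit area (e.g. W/m^2)
    distance: distance to the object (same length unit as the flux)
    """
    # The emitted light spreads evenly over a sphere of area
    # 4*pi*d^2, so the observed flux falls off as 1/d^2.
    return observed_flux * 4.0 * math.pi * distance**2

# A source seen at twice the distance appears one quarter as bright,
# yet the inferred intrinsic brightness comes out the same:
L_near = intrinsic_brightness(1.0, 10.0)
L_far = intrinsic_brightness(0.25, 20.0)
```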
There is, however, one thing that's relatively easy to measure with a telescope, whether it's of the optical kind or one that works in some other part of the electromagnetic spectrum, like a radio telescope. And that is the relative strength of the electromagnetic signal at different wavelengths in the spectrum. This is what a spectrometer (literally, an instrument for measuring a spectrum) does in the optical part of the spectrum.
In a type of luminous object called (paradoxically) a "black body", the signal strength of electromagnetic radiation varies continuously across the spectrum in a known way, without sharp peaks or dips. But normally the signal strength from a star or a cloud of interstellar gas does not vary smoothly. Instead, there are usually particular wavelengths at which the signal is markedly stronger or weaker than at adjacent wavelengths. This is because a hot gas of atoms or molecules emits or absorbs radiation unusually strongly at certain particular wavelengths.
These special wavelengths are the emission or absorption lines in the spectrum. When we know the kind of atom or molecule involved, these wavelengths can be measured in a laboratory, and each type of atom or molecule has its own characteristic "signature" of lines. If a gas of these atoms or molecules is emitting light, we get emission lines as peaks in the spectrum. And if a continuous spectrum of light passes through the gas (when it is cool enough not to emit light), we find absorption lines at the same wavelengths.
The most abundant elements in the universe are hydrogen and helium. The spectral signatures of these two gases are quite well known. But when we measure spectra from (for example) distant stars, we find that the lines are slightly shifted from where they "ought" to be. This shift is known as the Doppler shift, and it tells us precisely how fast the object is moving towards or away from us. (For very distant objects, the same shift occurs, but not for the usual reason, as we will explain later.) In most cases, especially for distant objects like galaxies, the shift is towards longer wavelengths. For visible light, that shift is in the direction of the red end of the visible spectrum, so it's called a "redshift".
The remarkable thing, which has been known for less than 100 years, is that light from very distant objects like galaxies is almost always shifted in the red direction, meaning that most such objects are moving away from us. The amount of the shift is easily computed to be proportional to the speed of the object along the line of sight (at least for speeds well below that of light). Even more remarkable than the prevalence of shifts in the red direction is that the amount of the shift (and hence the speed of the movement) varies directly with the actual distance to the object, for most remote objects.
Because of the existence of this velocity-distance relationship, it becomes possible to infer the distance of an object from a measurement of its spectrum. This is why redshift is so important in astronomy. So let's have a look at the history of how this surprising, unexpected relationship was discovered.
Edwin Hubble, in the 1920s, was the astronomer most responsible for the discovery of the velocity-distance relationship, and hence the first to understand that the universe as a whole is expanding.
Several other astronomers around 1920 recognized that the shift of spectral lines from a galaxy might be interpreted as the result of relative motion between the Earth and the galaxy. A blueshift would mean the object was moving towards Earth, while a redshift would mean it was moving away. Other interpretations of the redshift are possible; indeed, some astronomers around 1920 (and even today) preferred them. But the interpretation of the redshift of spectra as a result of relative velocity has become accepted as the best way to make sense of vast amounts of observational data.
In 1920 galaxies were not yet known to be enormous collections of stars like the Milky Way, lying outside it. They were thought of simply as fuzzy patches of light – nebulae (from the Latin for "clouds"). But the interpretation of spectral redshift as due to relative velocity, followed by Hubble's discovery of a correlation between this redshift and actual distance, showed convincingly that galaxies had to be so remote that they could not be part of the Milky Way.
Naturally, the correlation between redshift (hence apparent velocity) and distance, which at the time could be stated as a simple proportion, became known as Hubble's Law. And the constant of proportionality became known as the "Hubble constant". (The relationship was actually a little more complicated, as we'll explain shortly.)
Hubble was able to derive an independent estimate of distance from Earth to relatively nearby galaxies by identifying stars in those galaxies whose intrinsic brightness could be accurately estimated. These stars are known as Cepheid variables. In this type of variable star, it was known that the regular period in which the brightness changes is directly related to the maximum brightness of the star. Thus a measurement of the period of such a star in any galaxy where the star could be identified indicates what its actual brightness is, and from its apparent brightness as seen from Earth, the actual distance can be determined.
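The Cepheid distance chain described above can be sketched as code. This is only an illustration: the function name is invented, and the period-luminosity coefficients `a` and `b` are placeholder values (real calibrations depend on the passband and the stellar sample). The second step uses the standard distance-modulus relation between apparent and absolute magnitude.

```python
import math

def cepheid_distance_pc(period_days, apparent_mag, a=-2.43, b=-4.05):
    """Illustrative Cepheid distance estimate, in parsecs.

    a, b: placeholder period-luminosity coefficients (assumed values).
    """
    # Step 1: the period tells us the star's absolute magnitude
    # (intrinsic brightness) via the period-luminosity relation.
    abs_mag = a * (math.log10(period_days) - 1.0) + b
    # Step 2: compare with the apparent magnitude using the
    # distance modulus m - M = 5 * log10(d / 10 pc).
    return 10.0 ** ((apparent_mag - abs_mag) / 5.0 + 1.0)
```

With these placeholder coefficients, a 10-day Cepheid seen at apparent magnitude ~21 would sit about a megaparsec away – far outside the Milky Way, which is the kind of conclusion Hubble drew.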
Because Hubble could estimate in this way how far away a few galaxies were, he was able to determine that they were much too far away to actually lie within the Milky Way – contrary to what had been generally assumed up to that time. Indeed, the general supposition then was that the Milky Way comprised the entire universe, so Hubble's discovery was a big deal.
Hubble's Law simply states that the amount of redshift of a galaxy is proportional to its distance. The constant of proportionality, usually denoted by H (guess why), is called the "Hubble constant".
As it turns out, Hubble underestimated the actual distances of the galaxies he studied by nearly a factor of 10, due to errors in the calibration of Cepheid brightnesses. Consequently, the initial value computed for the Hubble constant was off by the same factor.
This numerical problem was corrected soon enough. But it turns out that there are a couple of conceptual problems as well with the law. These became apparent before long when cosmologists tried to apply the equations of Einstein's general relativity theory to describing the expansion of the universe. Surprisingly enough, a fairly simple equation, called the Friedmann equation, first proposed by Alexander Friedmann in 1922, does a very good job.
The story of the Friedmann equation itself is quite interesting, but a little off topic right now. However, as cosmologists now understand the equation and use it to model the universe, a couple of things in the conceptual understanding of Hubble's law have changed from Hubble's original idea. In the first place, Hubble's constant isn't in fact a constant at all, so cosmologists now prefer to call it the "Hubble parameter". It varies in a known way over cosmic time, which matters for objects that are very far apart – billions of light-years. But for relatively nearby galaxies it is pretty close to constant (the value is about 71 kilometers per second per megaparsec, in case you're wondering).
The second conceptual point is that cosmological redshift is now understood to be due to the actual expansion of space itself, rather than the Doppler shift it was originally presumed to be. A classical Doppler shift results because the peak-to-peak distance of a periodic wave emitted by an object moving away from the observer is slightly longer than it would be if there were no relative motion, precisely because of the relative motion. The distance is increased by how far the source of the wave moves in the time between two peaks.
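The classical Doppler stretch described above amounts to a one-line formula: the received wavelength is the emitted one multiplied by (1 + v/c). This is the non-relativistic approximation, valid for recession speeds well below the speed of light; the function name is ours.

```python
def doppler_shifted_wavelength(lam_emitted, v_recession, c=299_792_458.0):
    """Classical (non-relativistic) Doppler shift for a receding source.

    In the time between two wave crests, the source moves away by
    v * (period), so each crest-to-crest distance grows by v/c
    relative to the emitted wavelength.
    """
    return lam_emitted * (1.0 + v_recession / c)

# Hydrogen's H-alpha line (656.3 nm) from a source receding at 1% of
# the speed of light is stretched by 1%:
shifted = doppler_shifted_wavelength(656.3, 0.01 * 299_792_458.0)
```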
Now that cosmologists conceive of space itself as actually expanding with time, the redshift that a photon undergoes in traveling a long distance between points A and B results from the expansion of space that occurs in the time it takes for the photon to travel from A to B. The wavelength itself is stretched along with space.
Nevertheless, there is still a relatively simple, monotonic, though nonlinear, relationship between the distance of a remote galaxy and its observed redshift. Converting from a redshift to distance involves a variety of assumptions about certain parameters, such as the Hubble parameter and the curvature (if any) of space on a large scale. But these parameters have been measured in a variety of independent ways so that we now have fairly reliable estimates of their values. (You can go here if you want to play with this relationship yourself.)
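The nonlinear redshift-distance relationship can be sketched numerically. The sketch below assumes a flat universe and illustrative parameter values (H0 = 71 km/s/Mpc, matter density 0.27, dark-energy density 0.73 – assumptions, not definitive measurements), and integrates the standard comoving-distance formula D = (c/H0) ∫ dz'/E(z') with a simple trapezoidal rule.

```python
import math

def comoving_distance_mpc(z, H0=71.0, omega_m=0.27, omega_l=0.73,
                          steps=10_000):
    """Comoving distance (in megaparsecs) for an assumed flat universe.

    Parameter values are illustrative placeholders; real analyses fit
    them to observations.
    """
    c_km_s = 299_792.458  # speed of light in km/s

    def E(zp):
        # Dimensionless expansion rate for a flat matter + dark-energy
        # universe: E(z) = H(z)/H0.
        return math.sqrt(omega_m * (1.0 + zp) ** 3 + omega_l)

    # Trapezoidal integration of 1/E(z') from 0 to z.
    dz = z / steps
    total = 0.0
    for i in range(steps):
        z_lo, z_hi = i * dz, (i + 1) * dz
        total += 0.5 * (1.0 / E(z_lo) + 1.0 / E(z_hi)) * dz
    return (c_km_s / H0) * total
```

For small z this reduces to Hubble's simple proportion, distance ≈ cz/H0, while at large z the curvature of the relation becomes significant.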
The actual distances of remote objects are rather difficult (if not impossible) to determine with any accuracy, while redshift is pretty easy to measure with spectrometers. Consequently, astronomers customarily think of distance, which isn't directly observable, in terms of spectral redshift, which is. In fact, standard operating procedure is to report the redshift rather than the inferred distance.
The formal definition of redshift, denoted by z, is
z = (λ0 − λe) / λe

Here λ0 is the measured wavelength of a photon, while λe is the original wavelength of the photon when it was emitted.
For example, if the wavelength is exactly doubled, λ0 = 2λe, so z=1. If you rearrange terms in the definition of z, you get λ0 = (z+1)λe. That is, z+1 is the actual factor by which the wavelength is increased for any given z. (If this seems confusing, just remember that z=0 means no shift at all, so the factor of expansion is simply 1.)
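The definition and the rearrangement just described translate directly into code (the function name is ours):

```python
def redshift(lam_observed, lam_emitted):
    """z = (lambda_observed - lambda_emitted) / lambda_emitted"""
    return (lam_observed - lam_emitted) / lam_emitted

# A wavelength exactly doubled gives z = 1:
z_doubled = redshift(2.0, 1.0)

# And z + 1 is the stretch factor: light emitted at 656.3 nm and
# observed stretched by a factor of 1.5 gives z = 0.5.
z_stretched = redshift(656.3 * 1.5, 656.3)
```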
In subsequent articles where I will discuss recent research results, there will be a lot of talk of redshift. The simple equation just shown can then be used to compare the change in photon wavelengths. A table or calculation such as noted above can be used to infer the distance of the object in question. And from this distance, one then knows how long ago the object emitted the light we see now, hence how long this time was after the big bang occurred (which is now estimated to be about 13.7 billion years ago).