Original by Ned Wright 2001.
Recently two different groups [1, 2] have measured the apparent brightness of supernovae with redshifts near z = 1. These data show that supernovae are fainter than they would be in a critical density universe. To be fainter, the supernovae must be farther away. That means that the time between now and z = 1 must be larger than it would have been in a critical density universe, and this requires that the expansion of the Universe was slower in the past. Hence the faintness of the supernovae at z = 1 implies an accelerating expansion. Based on these data, the old idea of a cosmological constant is making a comeback.
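To make the distance argument concrete, here is a minimal sketch (an illustration, not part of the original FAQ) that compares the luminosity distance to z = 1 in a critical density model (Omega = 1) with that in a flat model containing a cosmological constant (Omega = 0.3, lambda = 0.7, the values discussed below), assuming Ho = 65 km/s/Mpc. The larger distance in the lambda model is what makes the supernovae appear fainter.

    # Hedged sketch: compare luminosity distances to z = 1 in two flat models.
    # Assumes Ho = 65 km/s/Mpc; the model parameters follow the values quoted
    # later in this FAQ (Omega = 0.3, lambda = 0.7 versus Omega = 1, lambda = 0).
    import math

    C_KM_S = 2.998e5                   # speed of light in km/s
    H0 = 65.0                          # Hubble constant in km/s/Mpc
    HUBBLE_DISTANCE = C_KM_S / H0      # c/Ho in Mpc

    def luminosity_distance(z, omega_m, omega_lambda, steps=10000):
        """Luminosity distance in Mpc for a flat model (omega_m + omega_lambda = 1)."""
        dz = z / steps
        comoving = 0.0
        for i in range(steps):
            zi = (i + 0.5) * dz
            e = math.sqrt(omega_m * (1 + zi)**3 + omega_lambda)   # H(z)/Ho
            comoving += dz / e
        return (1 + z) * HUBBLE_DISTANCE * comoving

    d_critical = luminosity_distance(1.0, 1.0, 0.0)
    d_lambda = luminosity_distance(1.0, 0.3, 0.7)
    dimming = 5 * math.log10(d_lambda / d_critical)    # magnitudes fainter
    print(f"d_L(z=1), Omega=1:              {d_critical:6.0f} Mpc")
    print(f"d_L(z=1), Omega=0.3, lambda=0.7: {d_lambda:6.0f} Mpc")
    print(f"supernovae fainter by about {dimming:.2f} mag in the lambda model")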
Einstein introduced the cosmological constant in 1917 in order to make a static, homogeneous cosmological model. Without it, the gravitational attraction of the matter would make such a model collapse, which Einstein did not want, since at the time the Universe was not known to be expanding. Thus Einstein introduced the cosmological constant into his equations for General Relativity. This term acts to counteract the gravitational pull of matter, and so it has been described as an anti-gravity effect.
Why does the cosmological constant behave this way?
This term acts like a vacuum energy density, an idea which has become quite fashionable in high energy particle physics models since a vacuum energy density of a specific kind is used in the Higgs mechanism for spontaneous symmetry breaking. Indeed, the inflationary scenario for the first picosecond after the Big Bang proposes that a fairly large vacuum energy density existed during the inflationary epoch. The vacuum energy density must be associated with a negative pressure because:
(1) The vacuum must be Lorentz invariant, or one would have a preferred frame. The stress-energy tensor of the vacuum is diag(rho*c^2, P, P, P), and this tensor must be Lorentz invariant. The only Lorentz invariant rank two tensor is a multiple of the metric, diag(-1,1,1,1) in a local inertial frame, so if the vacuum energy density is non-zero, the pressure has to be P = -rho*c^2.
(2) If one considers a cylinder filled with vacuum and surrounded by "nothing", then pulling out the piston creates new volume filled with vacuum having a nonzero energy density, so one has to supply energy equal to rho*c^2*dV. This requires a negative pressure P = -rho*c^2.
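As an illustrative check of point (1) above (a sketch, not part of the original FAQ), one can apply a Lorentz boost numerically to the vacuum stress-energy tensor and confirm that it is left unchanged only when P = -rho*c^2:

    # Hedged sketch: which diagonal stress-energy tensors survive a Lorentz boost?
    # Units with c = 1 and rho = 1 chosen arbitrarily for illustration.
    import numpy as np

    def boost_x(beta):
        """Lorentz boost along x with velocity beta (coordinates t, x, y, z)."""
        gamma = 1.0 / np.sqrt(1.0 - beta**2)
        L = np.eye(4)
        L[0, 0] = L[1, 1] = gamma
        L[0, 1] = L[1, 0] = -gamma * beta
        return L

    rho = 1.0
    for P in (-rho, 0.0, +rho):            # vacuum, pressureless dust, positive pressure
        T = np.diag([rho, P, P, P])        # diag(rho*c^2, P, P, P) with c = 1
        L = boost_x(0.6)
        T_boosted = L @ T @ L.T            # transform the tensor to the boosted frame
        print(f"P = {P:+.1f}*rho*c^2 : unchanged by the boost? {np.allclose(T_boosted, T)}")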
But in General Relativity, pressure as well as energy density has a gravitational effect, which means that the gravitational acceleration at the edge of a uniform density sphere is not given by
    g = G*M/R^2 = (4*pi/3)*G*rho*R

but rather is given by

    g = (4*pi/3)*G*(rho + 3P/c^2)*R = (4*pi/3)*G*[rho(matter) - 2*rho(vacuum)]*R

Now Einstein wanted a static model, which means that g = 0, but he also wanted to have some matter, so rho > 0, and thus he needed P < 0. In fact, by setting

    rho(vacuum) = rho(matter)/2

he had a total density of 1.5*rho(matter) and a total pressure of -0.5*rho(matter)*c^2, since the pressure from ordinary matter is essentially zero (compared to rho*c^2). Thus rho + 3P/c^2 = 0, and from the acceleration equation above the gravitational acceleration is zero, allowing a static universe.
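A quick numeric illustration (not in the original text) of the balance above: plugging the vacuum equation of state P = -rho*c^2 into the acceleration formula shows that the source term rho + 3P/c^2 vanishes only when rho(vacuum) = rho(matter)/2, assuming pressureless matter.

    # Hedged sketch: effective gravitating density rho + 3P/c^2 for matter plus vacuum.
    # Arbitrary units with c = 1 and rho(matter) = 1; matter is taken to be pressureless.
    rho_matter = 1.0

    def effective_density(rho_vacuum):
        rho_total = rho_matter + rho_vacuum    # total density
        pressure = -rho_vacuum                 # vacuum pressure P = -rho(vacuum)*c^2, c = 1
        return rho_total + 3 * pressure        # source term of the acceleration formula

    for rho_vacuum in (0.0, 0.25, 0.5, 0.75):
        print(f"rho(vacuum) = {rho_vacuum:.2f}*rho(matter) -> rho + 3P/c^2 = "
              f"{effective_density(rho_vacuum):+.2f}")
    # Only rho(vacuum) = 0.5*rho(matter) makes the source term vanish (Einstein's static balance).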
However, there is a basic flaw in this Einstein static model: it is unstable -- like a pencil balanced on its point. For imagine that the Universe grew slightly: say by 1 part per million in size. Then the vacuum energy density stays the same, but the matter energy density goes down by 3 parts per million. This gives a net negative gravitational acceleration, which makes the Universe grow even more! If instead the Universe shrank slightly, one gets a net positive gravitational acceleration, which makes it shrink more! Any small deviation gets magnified, and the model is fundamentally flawed.
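A minimal numerical sketch (an illustration, not from the original FAQ) of this runaway: start a uniform sphere in Einstein's static balance, make it 1 part per million too large, and integrate the acceleration formula above, letting the matter density dilute as R^-3 while the vacuum density stays fixed. The deviation grows instead of damping out.

    # Hedged sketch: instability of the Einstein static model, in arbitrary units
    # (G = 1, balance radius R = 1, rho(matter) = 1, rho(vacuum) = 0.5 at balance).
    import math

    G = 1.0
    RHO_VACUUM = 0.5                          # vacuum density stays fixed as the sphere grows
    RHO_MATTER_AT_BALANCE = 1.0               # matter density when R = 1

    def radial_acceleration(R):
        rho_matter = RHO_MATTER_AT_BALANCE / R**3    # matter dilutes as R^-3
        return -(4 * math.pi / 3) * G * (rho_matter - 2 * RHO_VACUUM) * R

    R, V = 1.0 + 1e-6, 0.0                    # start 1 part per million too large, at rest
    dt = 1e-3
    for step in range(1, 8001):
        V += radial_acceleration(R) * dt      # simple semi-implicit Euler integration
        R += V * dt
        if step % 2000 == 0:
            print(f"t = {step * dt:4.1f}   R - 1 = {R - 1:+.3e}")
    # The millionth-part offset keeps growing instead of returning to R = 1.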
In addition to this instability, the premise of a static Universe was shown to be incorrect by Hubble's discovery of the expansion. This led Einstein to refer to the cosmological constant as his greatest blunder, and to drop it from his equations. But it still exists as a possibility -- a coefficient that should be determined from observations or fundamental theory.
The equations of quantum field theory describing interacting particles and antiparticles of mass M are very hard to solve exactly. With a large amount of mathematical work it is possible to prove that the ground state of this system has an energy that is less than infinity. But there is no obvious reason why the energy of this ground state should be zero. One expects roughly one particle in every volume equal to the Compton wavelength of the particle cubed, which gives a vacuum density of
    rho(vacuum) = M^4*c^3/h^3 = 10^13 [M/(proton mass)]^4 g/cm^3

For the highest reasonable elementary particle mass, the Planck mass of 20 micrograms, this density is more than 10^91 g/cm^3. So there must be a suppression mechanism at work now that reduces the vacuum energy density by at least 120 orders of magnitude.
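As a rough numerical check (a sketch, not part of the original FAQ), evaluating M^4*c^3/h^3 for the Planck mass and comparing the result with the critical density quoted below (about 8*10^-30 g/cm^3) reproduces the "at least 120 orders of magnitude" mismatch:

    # Hedged sketch: naive vacuum energy density for the Planck mass, in cgs units.
    import math

    h = 6.626e-27          # Planck's constant, erg*s
    c = 2.998e10           # speed of light, cm/s
    m_planck = 2.0e-5      # Planck mass, g ("20 micrograms")

    rho_vacuum = m_planck**4 * c**3 / h**3    # g/cm^3, one particle per Compton volume
    rho_critical = 8e-30                      # g/cm^3 for Ho = 65 km/s/Mpc (from the text)

    print(f"naive rho(vacuum) ~ {rho_vacuum:.1e} g/cm^3")
    print(f"rho(critical)     ~ {rho_critical:.1e} g/cm^3")
    print(f"mismatch          ~ {math.log10(rho_vacuum / rho_critical):.0f} orders of magnitude")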
We don't know what this mechanism is, but it seems reasonable that suppression by 122 orders of magnitude, which would make the effect of the vacuum energy density on the Universe negligible, is just as probable as suppression by 120 orders of magnitude. And 124, 126, 128 etc. orders of magnitude should all be just as probable as well, and all give a negligible effect on the Universe. On the other hand, suppressions by 118, 116, 114, etc. orders of magnitude are ruled out by the data. Unless there are data to rule out suppression factors of 122, 124, etc. orders of magnitude, the most probable value of the vacuum energy density is zero. This is a Bayesian argument against a nonzero cosmological constant.
If the supernova data and the CMB data are correct, then the vacuum density is about 70% of the total density now. But at redshift z = 2, which occurred 11 Gyr ago for this model if Ho = 65 km/s/Mpc, the vacuum energy density was only 10% of the total density. And 11 Gyr in the future the vacuum density will be 96% of the total density. Why are we alive coincidentally at the time when the vacuum density is in the middle of its fairly rapid transition from a negligible fraction to the dominant fraction of the total density? This is the Dicke coincidence argument against a nonzero cosmological constant. If, on the other hand, the vacuum energy density is zero, then it is always 0% of the total density and the current epoch is not special.
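The fractions above follow from the fact that the matter density scales as (1+z)^3 while the vacuum density stays constant. A short sketch (illustrative, not from the original FAQ) with Omega = 0.3, lambda = 0.7 and Ho = 65 km/s/Mpc reproduces the small vacuum fraction at z = 2 and the quoted lookback time:

    # Hedged sketch: vacuum fraction of the total density, and lookback time to z = 2,
    # for a flat model with Omega(matter) = 0.3, lambda = 0.7, Ho = 65 km/s/Mpc.
    import math

    OMEGA_M, OMEGA_L = 0.3, 0.7
    H0 = 65.0                                 # km/s/Mpc
    GYR_PER_HUBBLE = 977.8 / H0               # 1/Ho in Gyr (977.8 Gyr * km/s/Mpc)

    def vacuum_fraction(z):
        matter = OMEGA_M * (1 + z)**3         # matter density grows into the past
        return OMEGA_L / (matter + OMEGA_L)   # vacuum density stays constant

    def lookback_gyr(z, steps=100000):
        """Lookback time to redshift z, integrating dt = dz / [(1+z) H(z)]."""
        dz = z / steps
        t = 0.0
        for i in range(steps):
            zi = (i + 0.5) * dz
            e = math.sqrt(OMEGA_M * (1 + zi)**3 + OMEGA_L)   # H(z)/Ho
            t += dz / ((1 + zi) * e)
        return t * GYR_PER_HUBBLE

    print(f"vacuum fraction today  : {vacuum_fraction(0.0):.0%}")
    print(f"vacuum fraction at z=2 : {vacuum_fraction(2.0):.0%}")
    print(f"lookback time to z=2   : {lookback_gyr(2.0):.1f} Gyr")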
The best observational limit on the vacuum energy density comes from the largest possible system: the Universe as a whole. The vacuum energy density leads to an accelerating expansion of the Universe. If the vacuum energy density is greater than the critical density [3*Ho^2/(8*pi*G) = 8*10^-30 g/cm^3 for Ho = 65 km/s/Mpc], then the Universe will not have gone through a very hot dense phase when the scale factor was zero (the Big Bang). But we know that the Universe did go through a hot dense phase, because of the light element abundances and the properties of the cosmic microwave background. These require that the Universe was at least a billion times smaller in the past than it is now, and this limits the vacuum energy density to
    rho(vacuum) < rho(critical) = 8*10^-30 g/cm^3

The recent supernova results suggest that the vacuum energy density is close to this limit: rho(vacuum) = 0.7*rho(critical) = 6*10^-30 g/cm^3. The ratio of rho(vacuum) to rho(critical) is called lambda. This expresses the vacuum energy density on the same scale used by the density parameter Omega = rho(matter)/rho(critical). The position of the first acoustic peak in the angular power spectrum of the CMB anisotropy suggests that Omega + lambda = 1. The supernova data suggest that lambda = 0.7, so Omega = 0.3. The Universe is open if Omega + lambda is less than one, closed if it is greater than one, and flat if it is exactly one. If lambda is greater than zero, then the Universe will expand forever unless the matter density Omega is much larger than current observations suggest. For lambda greater than zero, even a closed universe can expand forever.
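A small check (a sketch, not part of the original FAQ) of the critical density quoted above, 3*Ho^2/(8*pi*G) for Ho = 65 km/s/Mpc, and of the corresponding vacuum density for lambda = 0.7:

    # Hedged sketch: critical density of the Universe in cgs units for Ho = 65 km/s/Mpc.
    import math

    G = 6.674e-8                  # gravitational constant, cm^3 g^-1 s^-2
    KM = 1.0e5                    # cm per km
    MPC = 3.086e24                # cm per megaparsec

    H0 = 65.0 * KM / MPC          # Hubble constant in 1/s
    rho_critical = 3 * H0**2 / (8 * math.pi * G)

    print(f"Ho                  = {H0:.3e} 1/s")
    print(f"rho(critical)       = {rho_critical:.1e} g/cm^3")        # about 8e-30 g/cm^3
    print(f"0.7 * rho(critical) = {0.7 * rho_critical:.1e} g/cm^3")  # about 6e-30 g/cm^3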
In the past, we have had only upper limits on the vacuum density, and philosophical arguments based on the Dicke coincidence problem and Bayesian statistics that suggested that the most likely value of the vacuum density was zero. Now we have the supernova data that suggest that the vacuum energy density is greater than zero. This result is very important if true. We need to confirm it using other techniques, such as the MAP satellite which will observe the anisotropy of the cosmic microwave background with angular resolution and sensitivity that are sufficient to measure the vacuum energy density.
Extended version of this FAQ at UCLA, with figures.
More on supernovae in cosmology.
"Einstein's Greatest Blunder?: The Cosmological Constant and Other Fudge Factors in the Physics of the Universe" by Donald Goldsmith.
"The Runaway Universe: The Race to Discover the Future of the Cosmos" by Donald Goldsmith covers the accelerating expansion seen in the recent supernova data.