Astronomical Games: May 2003

The Color Green

There actually are green stars in the sky, but you can't see them

It's not easy being green.

—Kermit the Frog

SOME TIME ago, my in-laws went to a Chinese restaurant and returned with some leftovers: shrimp chips. Shrimp chips are those thick, spongy chips that come with Peking Duck. By this time, the chips were a little cold, so I asked them how to warm them up. My brother-in-law suggested that we put them in the toaster oven on low.

So when we brought the chips back to our house, I did just that: I put the chips on a tray in the toaster oven, put it on high, then went to check up on my son. When I came back, I was a bit startled to see what was transpiring within the oven.

"Um," I called out to my wife, who was sitting in the family room, "the chips are on fire."

"Well, so turn the toaster oven off."

"No, I mean the chips are on fire."

They really were. Now, as it turns out, I knew that we had a fire extinguisher sitting in the corner of the kitchen, but when I turned my head to look over there, it was gone! Someone had moved it. Blast them! So I went to Plan B: I froze.

Fortunately, my wife came in, and she is not fazed by such little things. She went over to the kitchen corner, picked up the fire extinguisher (it was right there, all along), pulled the pin out, and proceeded to coat the inside and outside of the toaster oven, along with the kitchen counter, the washing machine, and half of the rest of the kitchen with a fine blue foam. (After all, as she said, you can only use it once, so live it up.)

The toaster was unusable now, to say nothing of the shrimp chips, but what upset me more than all that was that I had looked right at the fire extinguisher and didn't see it. All of us have had the experience where you don't see something because you don't expect to, but here I was, expecting to see something, and still failing utterly to do so.

But, you know, beginning stargazers do that all the time. They look up, expecting to see something, and they can't. And it's not because the something isn't there.

By and large, the night sky is not a tremendously colorful place. The brightest thing going is the Moon, if it happens to be up, and it is colored a dark grey. (It really is quite dark—about the brightness of asphalt. It looks so bright in its full phase because the Sun can illuminate things quite intensely, and because the night sky is so dark in contrast.)

The next brightest things are several of the planets, depending on when you see them. Venus is typically about eight magnitudes dimmer than the Full Moon—a factor of about 1,500—but it's still brighter than anything else in the night sky (aside from the occasional bright artificial satellite passing overhead). Venus too is essentially just white, as are most of the planets to the unaided eye. The lone exception is Mars, which does appear as a ruddy orange point of light, due to iron oxides in its dusty surface (another essay some time). Some of the other planets do have some color, but that color can only be seen through the telescope.
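That "factor of about 1,500" is easy to check: the magnitude scale is logarithmic, with five magnitudes corresponding to a brightness factor of exactly 100. A quick computation (in Python, purely for illustration):

```python
# Five magnitudes correspond to a brightness factor of exactly 100,
# so one magnitude is a factor of 100**(1/5), about 2.512.
factor = 100 ** (8 / 5)  # eight magnitudes
print(round(factor))     # 1585 -- "a factor of about 1,500"
```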

Going outside our own solar system, we come to the stars. Even here, just about every star looks white to our eyes—so much so that a star of discernible color is an item of note. The Greek astronomer Ptolemy (c. 85–165) listed five stars that he called hypokeros, which appears to have referred to a kind of orangey red: Aldebaran, Antares, Arcturus, Betelgeuse, and Pollux.

(To this was later added a sixth star, Sirius. Sirius is the brightest star in the night sky, but it is about as plain white as you can get. Since Sirius has a white dwarf companion, called Sirius B, and white dwarfs are the shrunken remnants of red giant stars, it was once thought that Sirius B was a red giant in the days of the ancient Greeks. If a white dwarf had formed that recently, however, it should have a substantial planetary nebula surrounding it, which it certainly does not. Nowadays, we suspect that whoever called it hypokeros was either speaking metaphorically, or made a translation error. Notably, the hypokeros designation for Sirius only appears in some of the translations of Ptolemy's Almagest. The earlier ones apparently do not contain it.)

A few stars have a noticeable blue tint to them, too. Rigel, the left foot of Orion the Hunter, is a bluish white star. The blue stars are never as intense as the red stars; Rigel appears quite pale in comparison with its opposite number, Betelgeuse, the Hunter's right shoulder star. [1]

The stars are not intensely or purely colored because they actually emit light of all different wavelengths along the visible spectrum. What we see is therefore a mixture of different colors. A more or less equal mixture of colors yields a white star, so those stars that do have some noticeable color have it only because they emit considerably more light of some colors than others.

Thus, Betelgeuse looks red because it emits quite a bit more red light than light of other colors. Rigel looks bluish white because it emits somewhat more blue light than other colors, but the difference is not as great in Rigel's case, so its color is less saturated. You might guess, since green is intermediate on the spectrum between red and blue, that green stars would look more saturated than Rigel, but less saturated than Betelgeuse.

That, surprisingly, turns out not to be so. It's a common question from stargazers: Why are there so few green stars? In fact, you can look all you want in the night sky, and you will not find a single star that consistently looks green to the unaided eye. Not one!

But why not? Why shouldn't there be any green stars? The answer is that there are green stars, in a manner of speaking, but we can't see the color. To see why that is, let's turn back to the ancient Greeks.

According to Aristotelian theory, heat is created by one of the four basic elements, fire. (The others are earth, air, and water.) If you mix together fire and air, you get dry heat, whereas if you mix fire and water, you get humid heat.

This makes a certain sort of sense. In and around the Greek islands, summers are often hot and humid, and in a way it really is the fire of the Sun and the water of the Mediterranean Sea that does it. If you go to the desert, on the other hand, the Sun seems to make the very air itself sing with a dry heat.

There are some phenomena that this theory doesn't explain, though. If you put two objects in contact, one hot and one cold, they don't remain that way. Instead, the hot one becomes cooler and the cold one warmer, until they reach a common intermediate temperature. At no time does any material seem to transfer between the two objects. If the hot object had more elemental fire to begin with, how does it leave the hot object for the cold one, and how does it know when to stop when there's an equal amount of fire, so to speak, in both objects?

The Italian scientist Galileo Galilei (1564–1642) had an idea. He thought that heat was not a material element, but was rather a fluid, called the caloric, from the Latin word for heat. Hot objects had a lot of caloric fluid. So long as all you had was a hot object in isolation, the caloric fluid couldn't flow away (except to the surrounding air, which it could do but slowly). But if you put it next to a cold object, the caloric fluid would flow to it, until the levels of caloric fluid were equalized in the two objects.

We can see an analogy between this and water levels in a container. If you just have a single full container of water, that water can't go anywhere (except evaporate to the surrounding air, which it can do but slowly). But if you connect it to a second empty container by a pipe attached to the bottom of both containers, the water flows from the full container to the empty one, until the water levels are equalized in the two containers.

What's more, the total amount of water in the two containers remains constant at all times (barring the insignificant amount lost to evaporation). This led the pioneering French chemist Antoine Lavoisier (1743–1794) to propose that the total amount of caloric in a closed system was conserved. In this, he may have been inspired by the law of conservation of mass, which he also formulated.

Actually, mass in the usual sense is not conserved, although this wasn't fully appreciated until the 20th century, most dramatically in Einstein's equation E = mc², along with the atomic bombs whose energy output can thus be computed. Similarly, it was found after Lavoisier's time that there is no valid law of conservation of caloric, either. This came about because of studies into the nature of heat by the American-born English physicist Benjamin Thompson (1753–1814).

Thompson was born in Massachusetts, and taught himself enough science and mathematics so that at the age of 15, he was able to calculate eclipses. Three years later, at age 18, he moved to the town now known as Concord and married Sarah Rolfe, a wealthy widow. He appeared set for life, but then the conflicts which set the stage for the Revolutionary War started. Thompson was a Loyalist, and there is today some thought that he might have been spying on the colonists for the British. In any event, he feared for his life, and in 1775, he abruptly abandoned his wife and young child for England.

There, he began in earnest his studies into the theory of heat. He performed experiments that showed that the caloric fluid, if it existed, had to be a strange substance indeed. For one thing, it was invisible, odorless, and could not be isolated for study; it only appeared in conjunction with warm objects. By carefully weighing an object at different temperatures, he showed that the caloric had no measurable mass. All of these properties made Thompson wonder if the caloric fluid were a figment of imagination after all.

The final straw came some years later, when Thompson found himself in Bavaria, where he was made a Count of the Holy Roman Empire. At this point, he did an unusual thing: He chose as his landed name the original name of the town in which he married his American wife, Rumford. It is that name by which he is best known today, Count Rumford.

While in Bavaria, conducting routine inspections of the army equipment, Rumford came upon workers boring out cannons. He noticed that as they were boring the cannon, the cannon grew quite hot, so that water would boil in it when they were done. If the caloric fluid were conserved, the heat required to boil the water had to come from somewhere—perhaps the chips that were bored out of the cannon, or the boring tool itself. But those too grew hot over the course of boring. Moreover, the process could be repeated; once you had boiled some water in the mouth of the cannon, you could empty it, continue boring the cannon, and it would again grow hot enough to boil water.

Obviously, the caloric fluid was not conserved. But if it wasn't conserved, then all evidence and all reason for its existence went away. If you couldn't see it or smell it, if you couldn't capture it for study and it could come and go as it liked, then it made no more sense to maintain that it was there at all.

In place of the caloric fluid, Rumford suggested that heat was the manifestation of small-scale vibration in matter. When the boring tool was rubbed back and forth in the cannon, that motion didn't go away entirely when the tool was withdrawn. Instead, friction with the cannon transferred some of that motion to the small bits of matter that constituted the cannon, and Rumford claimed that it was that vibration that we feel as heat. In this, Rumford was perfectly correct. (He didn't call the small bits of matter atoms, since the atomic theory was still a few years away.)

Incidentally, there is more than one way to transfer heat from one object to another. In the case of the water boiling in the cannon, heat is transferred by conduction: actual physical contact. Or, if we stand downwind of a big fire, we feel the heat carried to us by the surrounding air, a process called convection.

But a fire can heat us in another way. Even if we block the heated air from reaching us by means of a solid obstruction, like a plate glass window, the fire can still heat us. It must be radiating heat to us, and this is in fact the third way for heat to transfer from one object to another.

In 1800, the English astronomer William Herschel (1738–1822) conducted an experiment to demonstrate the existence of heat radiation. Following the experiments of Newton, he passed sunlight through a prism, breaking it into the rainbow of the spectrum. He then placed the bulb of a thermometer into the spectrum, but not in the path of any of the visible colors. Rather, he placed it in the dark area outside the red end of the spectrum. He found that the temperature indicated by the thermometer increased. After eliminating other potential sources of heat, he was able to conclude that there was another, invisible "color" of radiation that carried heat. He called this color infra-red, from Latin infra, "below."

All warm bodies radiate infra-red radiation, as it turns out. But since infra-red radiation is just another part of the spectrum, shouldn't there also be a temperature at which bodies radiate visible light? Sure enough, when a metalworker's tools get hot, they begin to glow a dull red. And, that is a property of all objects when they get sufficiently hot.

It's interesting then to ask what kind of relationship exists between temperature and color. This question is complicated by the fact that a hot object doesn't radiate light of just one color in the spectrum. Much as stars do, hot objects radiate light of all different wavelengths up and down the spectrum, and the reason that a tool appears, say, red is that it radiates comparatively more red light than light of other colors. In that case, what you'd like to be able to do is to determine the distribution of the wavelengths of light emitted by an object, called its emission spectrum, based only on the object's temperature and composition.

The problem is that although temperature is a single number, and it's therefore easy to determine a trend in temperature, an object's composition can vary in any of a large number of ways. To try to characterize all the different ways that an object's emission spectrum varies with composition was more than turn-of-the-20th-century physics could achieve.

However, suppose you considered the ideal emitter of radiation—an object that emitted as much radiation as possible for any object at its temperature. You could remove the dependency on composition and concentrate solely on temperature, and still have an upper bound on the emission spectrum for any object.

At around this time, the Swiss physicist Pierre Prevost (1751–1839) was putting together his theory of exchanges, which states that any object at a temperature above absolute zero both emits radiation to, and absorbs radiation from, its surroundings. Furthermore, an object at equilibrium, isolated from both conductive and convective heat transfer, emits and absorbs radiative heat at the same rate.

This meant that the ideal emitter was also the ideal absorber. And an object that absorbs as much radiation as possible will not let any radiation through it or reflect off of it. In other words, it is as black and opaque as possible when cold, so such an object is called a black body, and its emission is called black body radiation. There is no truly black body in reality, although certain substances do get close. Carbon in its graphite form, for example, absorbs all but about 3 percent of the visible and infra-red radiation that strikes it.

When a black body is heated sufficiently, it glows in visible light like any object and is decidedly not black anymore. An early attempt to characterize the black body radiation spectrum was made by the English physicist John William Strutt, Lord Rayleigh (1842–1919). He concluded, from first principles, that the intensity of the radiation emitted by a black body, at any wavelength of light, was inversely proportional to the fourth power of the wavelength. (It is therefore quite reminiscent of his scattering law.) This law was further refined by the English physicist James Jeans (1877–1946), so that it is called the Rayleigh-Jeans law.

The Rayleigh-Jeans law is fairly accurate for long wavelengths of light, where the intensity does in fact vary as the inverse fourth power of wavelength. However, it also predicts that as the wavelength of light gets shorter and shorter, from red through green to blue and then ultra-violet light (from Latin ultra, "beyond"), the intensity grows steeply, and in the limit of zero wavelength, the intensity becomes infinite, a phenomenon called the ultra-violet catastrophe.

That simply wasn't observed. Any real near-black body radiation did initially have increased intensity at shorter wavelengths, but in all cases, the emission spectrum curve reached a peak, and as wavelength continued to get shorter, the intensity decreased, until in the limit of zero wavelength, the intensity also fell to zero. So the Rayleigh-Jeans law was no good in the short wavelength domain. But it was the best that classical physics could do.
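The runaway behavior is easy to see numerically. Here is a small sketch of the Rayleigh-Jeans formula, B(λ) = 2ckT/λ⁴, using a roughly solar surface temperature purely for illustration; every halving of the wavelength multiplies the predicted intensity by sixteen, with no peak ever appearing:

```python
c = 2.998e8    # speed of light, m/s
k = 1.381e-23  # Boltzmann's constant, J/K
T = 5800.0     # roughly the Sun's surface temperature, K (for illustration)

def rayleigh_jeans(lam):
    """Rayleigh-Jeans spectral radiance, 2ckT / lambda**4."""
    return 2 * c * k * T / lam**4

# Each halving of the wavelength multiplies the intensity by 2**4 = 16:
for lam in (800e-9, 400e-9, 200e-9, 100e-9):  # wavelengths in meters
    print(f"{lam * 1e9:5.0f} nm -> {rayleigh_jeans(lam):.3e}")
```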

A second attempt was made by the German physicist Wilhelm Wien (1864–1928). He somehow came onto the idea that the emission spectrum of a black body was analogous to the distribution of molecular speeds in an ideal gas, which had been developed by the Scottish physicist James Clerk Maxwell (1831–1879). He had no fundamentally sound reason to do so, but for whatever reason, this idea attracted him.

Maxwell had found that molecules in an ideal gas do not all move at the same speed. There is a distribution of speeds that arises because molecules exchange energy when they collide. Even if you started out with a gas of molecules all travelling at the same speed, each time a pair of them collided, chances were that one would gain some energy, and the other would lose an equal amount. That would be expressed as a change in the molecular speeds.

The speeds would only spread out so far, though. If two molecules collided, one going very fast and the other going very slow, the chances that the fast one would strike the slow one so as to slow it down even further were very small. It was much more likely that their speeds would even out somewhat.

In between the tendency of the molecular speeds to spread out when they were very similar, and to even out when they were very far apart, lies a middle ground. Maxwell used statistics to derive the distribution of molecular speeds—how many molecules were travelling at what speed, for any given temperature. Wien may have reasoned that since energy is similarly exchanged among the particles within a black body before light is emitted at the surface, the emitted radiation should arrive at very much the same distribution. With that in mind, he proposed what is now called the Wien distribution law for the black body radiation.

Wien's law, as it happens, is good in the short wavelength domain, precisely where the Rayleigh-Jeans law is bad. Conversely, at long wavelengths, where the Rayleigh-Jeans law is accurate, Wien's law is bad. Clearly, there were two different behaviors here, each covered by the individual laws, but how they blended from one to the other as wavelength varied, nobody knew.

Enter the German physicist Max Planck (1858–1947). He worked at the problem in reverse fashion. Instead of trying to deduce the fundamental physics background behind a formula that would incorporate both laws, he tried first to stitch the two equations together mathematically, with no regard to what the hybrid form meant.

This is a common tactic of cranks, and in almost every case, utter nonsense results. Fortunately, Planck was a genius and no crank, and he succeeded. He arrived first at his mathematical Frankenstein's monster, and then a few months later, came up with his theoretical explanation of the monster, basing it on a new vision of physics, in which light was not emitted continuously, but instead came in packets called photons. The new branch of quantum physics had been invented, and with it, the black body problem was solved.
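Planck's stitched-together formula can be written B(λ) = (2hc²/λ⁵)/(e^(hc/λkT) − 1). A brief numerical sketch (again assuming a roughly solar temperature, just for illustration) shows how it contains both earlier laws: at long wavelengths it converges to the Rayleigh-Jeans form, and at short wavelengths to Wien's:

```python
import math

h = 6.626e-34  # Planck's constant, J*s
c = 2.998e8    # speed of light, m/s
k = 1.381e-23  # Boltzmann's constant, J/K
T = 5800.0     # roughly the Sun's surface temperature, K (assumed)

def planck(lam):
    """Planck's black body spectral radiance."""
    x = h * c / (lam * k * T)
    return (2 * h * c**2 / lam**5) / math.expm1(x)

def rayleigh_jeans(lam):
    return 2 * c * k * T / lam**4

def wien(lam):
    x = h * c / (lam * k * T)
    return (2 * h * c**2 / lam**5) * math.exp(-x)

# Long wavelengths (here 1 mm): Planck and Rayleigh-Jeans nearly agree.
print(planck(1e-3) / rayleigh_jeans(1e-3))  # very nearly 1
# Short wavelengths (here 100 nm): Planck and Wien nearly agree.
print(planck(100e-9) / wien(100e-9))        # very nearly 1
```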

As I explained in "How to Cook a Star," a star shines by the energy released when four hydrogen nuclei fuse, in a series of steps, into one helium nucleus.

However, the photons released in this series of reactions are not the gentle visible and infra-red photons we receive from our Sun. Instead, they are harsh, high-energy gamma rays. The reason these gamma rays don't fry us on a daily basis is that they never make it out to the Earth unimpeded. They start out at the Sun's center, and before they ever reach the surface, they have been absorbed by the Sun's opaque layers of hydrogen and helium.

These hydrogen and helium atoms then emit new photons, which are again absorbed, re-emitted, re-absorbed, and so on. There is no good way to track a single photon as it makes its way arduously from the center to the surface, but if we could, we would see that it takes millions of years for it to go those 700,000 km. Once at the surface, the photon headed for the Earth shoots in a more or less straight line, reaching the Earth, 150 million km away, in a mere 500 seconds.
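That last figure is just the Earth-Sun distance divided by the speed of light:

```python
distance_km = 150e6            # Earth-Sun distance, km
speed_of_light = 299_792.458   # speed of light, km/s
print(distance_km / speed_of_light)  # just over 500 seconds
```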

As the photon bounces its way to the Sun's surface, its energy drops. Since the Sun is basically in equilibrium, each layer of the Sun emits as much energy as it absorbs. However, the number of photons it emits is proportional to the area. This means that each successively larger layer of the Sun must radiate photons whose average energy drops, in inverse proportion to its area. That is why we get mostly visible and infra-red photons, along with some ultra-violet photons, and hardly any gamma-ray photons.

By the time photons arrive at the surface, they've been absorbed and re-emitted countless times, exchanging energy with the absorbing atoms all the way. The result is that the Sun, and most stars, are very good facsimiles of black bodies.

The black body radiation curve discovered by Planck is shaped a bit like a lopsided bell. In the Sun's case, the peak, depending on how it's defined, is somewhere near the green portion of the visible spectrum, but the curve doesn't fall away equally in both directions. The falloff is steeper toward the high-energy shorter wavelengths than it is toward the low-energy longer wavelengths. Thus, even though the peak of the Sun's emission is in the green portion, there is only a little less red and yellow light, and considerably less blue light.
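The location of that peak (defined per unit wavelength) is given by Wien's displacement law, λ = b/T, where b ≈ 2.898 × 10⁻³ m·K. A quick sketch, assuming the commonly quoted solar surface temperature of about 5,800 K:

```python
b = 2.898e-3    # Wien's displacement constant, m*K
T_sun = 5778.0  # Sun's effective surface temperature, K (assumed)

peak_nm = b / T_sun * 1e9  # peak wavelength, in nanometers
print(round(peak_nm))      # about 502 nm -- in the green
```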

This still makes it seem as though the Sun should look, perhaps, yellowish-green, along with many other stars that are the same color. One reason it doesn't do that is that the human eye is not equally sensitive to all colors. Its color detectors are three chemicals that react to light of a range of wavelengths, and these chemicals have their own peak wavelengths, to which they are most sensitive. These chemicals are typically considered to be sensitive to red, green, and blue light, but the fact is that the "red" and "green" chemicals have very similar sensitivity curves. Their peaks aren't separated very far, whereas the "blue" chemical has a peak well into the blue.

What this means is that any light that stimulates the green chemical is likely to stimulate the red chemical to a similar degree. There is enough of a difference between them to distinguish between colors of everyday saturation, but the Sun is too bright, and the black body spectrum too flat over most of the visible range, for the Sun to appear as anything other than a very faintly yellowish white, at best. This is true of any star whose peak emission lies in the green.

The reason that stars can look bluish white is that even though the falloff of the black body curve toward the red end of the spectrum is gentle, the peaks of the red and green chemicals are far enough away that they are stimulated sufficiently less than the blue chemical for the brain to identify a blue tint to the predominantly white color of the star. This is the case for Rigel, for example, whose peak is actually well into the ultra-violet.

Red stars appear even more intense, because the falloff from red to blue is steeper for relatively cool stars. (A typical red star's surface temperature is still around 4,000 kelvins.) That is why Betelgeuse looks redder than Rigel looks blue.
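This asymmetry can be illustrated with black body curves. Using Planck's formula and assumed surface temperatures of roughly 3,500 K for a Betelgeuse-like star and 11,000 K for a Rigel-like one (both figures are illustrative, not precise measurements), compare how strongly each star favors one end of the visible spectrum:

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Black body spectral radiance at wavelength lam (m), temperature T (K)."""
    x = h * c / (lam * k * T)
    return (2 * h * c**2 / lam**5) / math.expm1(x)

blue, red = 450e-9, 650e-9  # representative blue and red wavelengths

# A cool, Betelgeuse-like star (temperature assumed for illustration):
red_excess = planck(red, 3500) / planck(blue, 3500)
# A hot, Rigel-like star (likewise assumed):
blue_excess = planck(blue, 11000) / planck(red, 11000)

print(f"cool star, red/blue: {red_excess:.1f}")  # about 2.6
print(f"hot star, blue/red: {blue_excess:.1f}")  # about 2.4
```

The cool star's red excess comes out larger than the hot star's blue excess, which is the sense in which red stars look more saturated than blue ones.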

Stars are not perfect black body radiators. The Sun, for instance, has an atmosphere which absorbs some of the radiation leaving the surface. This absorption reveals itself primarily in the form of absorption lines. (See, again, "How to Cook a Star.")

Some stars are so far from black body radiators that their colors can appear unusually intense. For instance, planetary nebulae, the remains of dead stars, do emit light that is very strongly bluish green. However, in this case, the light doesn't come from the dead star itself, which is a white dwarf, but from the star's slowly spreading outer husk, glowing largely from excited oxygen atoms. Or, a star may appear green by contrast. For example, the faint companion of Antares, a red star, is often characterized as green, because white often looks a little greenish next to red. But if that faint star were seen in isolation, it would probably look white, like most stars.

In other cases, however, the strongly colored light does come from the star itself. One example is carbon stars, which show strong signs of carbon (naturally) in their composition. These stars look very red, even redder than ordinary cool stars. I'll go into them in further detail in a few months.

[1] The name does not refer to Orion's armpit, as is often supposed. The original Arabic form was yad al-jauza, "the hand of al-jauza [a feminine figure of unknown identity]." The first person to translate this name into Latin misidentified the initial character; with two dots, the character is ya and should be transliterated as a 'y', but with only one dot, it is ba and gets transliterated as a 'b'. So this star's name became, in the Roman alphabet, "Bedalgeuze." Later, the first syllable was misinterpreted as bat, an improper form of ibt, "armpit."

Copyright (c) 2003 Brian Tung