Thursday, September 28, 2006

Quantum Weirdness

The quantum is that embarrassing little piece of thread that always hangs from the sweater of space-time. Pull it and the whole thing unravels. ~Fred Alan Wolf

“Quantum weirdness” is almost a redundancy. Everything to do with quantum theory is as weird as anything in Lewis Carroll's Wonderland. Einstein hated it. Richard Feynman reveled in it. And we reap the benefits of its applications.

Of course, it's also the reason I'm not a physicist. First, the math is brutal; if I never see another eigenfunction, I'll die a happy man. Second, trying to wrap your brain around the incongruities of the quantum world is not easy. I think I could have managed the second if I could have survived the first.

Consider, if you will, wave-particle duality. Every physics student has done the double slit experiment in one form or another. Basically, you shoot electrons at a card with two thin slits in it. The electrons pass through the slits and impinge on a target (like a cathode ray tube, or television screen, as it's more colloquially known). The pattern that results is an interference pattern, alternating dark and light bars, as though waves passed through the slits. You get this pattern even if you shoot the electrons one at a time. So even though you're shooting little discrete particles at the slits, the pattern of their impact is that of waves.
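
The spacing of those light and dark bars follows from simple geometry. Here's a minimal sketch of the idea; the slit separation and electron wavelength are assumed numbers for illustration, not taken from any particular experiment:

```python
import math

d = 1.0e-6     # slit separation in metres (assumed for illustration)
lam = 5.0e-11  # electron de Broglie wavelength, ~50 pm (assumed)

def intensity(theta):
    """Two narrow-slit interference: I(theta) is proportional to
    cos^2(pi * d * sin(theta) / lam)."""
    return math.cos(math.pi * d * math.sin(theta) / lam) ** 2

# Light bars sit where the path difference d*sin(theta) is a whole
# number of wavelengths; dark bars where it is an odd half-wavelength.
theta_bright = math.asin(lam / d)      # first bright fringe (m = 1)
theta_dark = math.asin(0.5 * lam / d)  # first dark fringe

print(intensity(0.0))           # central maximum: 1.0
print(intensity(theta_dark))    # essentially 0: a dark bar
print(intensity(theta_bright))  # back near 1: the next light bar
```

Cover one slit and the cos² modulation disappears entirely; there is nothing left to interfere with, which is exactly why the pattern is so strange for electrons fired one at a time.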

But it doesn't stop there. Suppose you're a clever little scientist, and you decide you're going to find out exactly what's going on. So, you devious devil, you put detectors at each slit so you can actually detect each electron as it passes through the slit. The devices do not impede or alter the motion of the electron in any way. Thus prepared, you begin shooting your electrons at the double slit card. Sure enough you can detect each electron clearly passing through one hole or the other, acting like particles and not the least bit wave-like. But, when you look at the target, expecting the interference pattern, you get a most unpleasant surprise.

Instead of the wave interference pattern, you have two clumps of strikes, exactly what you'd expect if discrete particles passed through discrete slits.

Niels Bohr explained this effect in what has become known as the “Copenhagen Interpretation.” He introduced the concept of complementarity: quantum objects such as electrons can behave as waves or as particles, though they are really neither. Moreover, in attempting to measure in the quantum world, the observer interacts with the system to such an extent that the system isn't independent any more. In other words, by choosing to measure the particle nature of, say, an electron, we wipe out the wave nature of that electron. We are said to “collapse the wave function.”

This is the Uncertainty Principle in action. Erwin Schrodinger, whose wave function equation was doing the collapsing, didn't like what was implied. The Copenhagen Interpretation was saying that, until you observed the electron, it existed in a “superposition” of states, both wave and particle. Einstein was similarly distressed because this led to the concept of “spooky action at a distance.”

Imagine two electrons that are “linked” such that one has spin in one direction while the other must have the opposite spin. This is called “quantum entanglement.” If you separate the two by some means, both electrons still have the possibility of either spin state. But which state is not determinable without observation, so essentially both electrons have both spin states. When you look at one and check its spin, the wave function is collapsed, and the other particle assumes the opposite spin. This happens instantaneously, even if the particles have been separated by a light year, which seems to require that the state of the second particle be determined by information sent faster than the speed of light.
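
The same-axis part of this behavior can be mimicked by an ordinary toy model, sketched below purely for illustration. What such a model cannot reproduce, and what makes entanglement genuinely weird, are the correlations at intermediate measurement angles, which exceed the limits any local model of this kind must obey:

```python
import random

def measure_entangled_pair():
    """Toy model of a spin-singlet pair measured along the same axis:
    each individual result is a fair coin flip, but the two results
    are always opposite, however far apart the measurements happen."""
    spin_a = random.choice(("up", "down"))
    spin_b = "down" if spin_a == "up" else "up"
    return spin_a, spin_b

pairs = [measure_entangled_pair() for _ in range(10_000)]

# Perfect anti-correlation, pair by pair...
assert all(a != b for a, b in pairs)

# ...yet either side, viewed alone, is pure 50/50 noise.
fraction_up = sum(1 for a, _ in pairs if a == "up") / len(pairs)
print(fraction_up)  # roughly 0.5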

This is not some metaphysical construct. There is direct evidence of quantum entanglement and “spooky action”. See the Brian Greene reference below for an excellent discussion of the topic.

At any rate, all this stuff bothered Schrodinger enormously, so he posed his own conundrum. Suppose you had a box containing a radioactive source, a Geiger counter, a vial of poisonous gas, and a cat. You rig the apparatus such that, if the Geiger counter detects that the source has undergone radioactive decay, the vial is broken, which releases the poison and kills the cat.

Now you can determine a time period over which there is a fifty-fifty chance for radioactive decay to occur. So you turn on the counter for this period of time. If decay occurs, the cat dies. If it doesn't, the cat lives to annoy you another day. According to the Copenhagen Interpretation, since the outcome is dependent on a probabilistic quantum state change, until you actually open the box to see what happened, the cat exists in both states, or neither, depending on your point of view. It's in a superposition of states that is all states until you make the observation by opening the box and collapse the wave function. (For a much more complete explanation, check out the John Gribbin book listed below).
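
That "fifty-fifty" time period is just the half-life of the source. For a single nucleus with half-life T½, the chance it has decayed by time t is 1 − 2^(−t/T½), which equals exactly one half at t = T½. A quick sketch, with an assumed 60-second half-life purely for illustration:

```python
T_half = 60.0  # seconds; an assumed half-life, purely for illustration

def p_decay(t):
    """Probability that one nucleus with half-life T_half has decayed by time t."""
    return 1.0 - 2.0 ** (-t / T_half)

print(p_decay(T_half))      # 0.5  -- close the box for exactly one half-life
print(p_decay(2 * T_half))  # 0.75 -- wait longer and the odds turn against the cat
```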

You could well argue that all of this is playing with math and metaphysics, but a group of scientists is determined to see if superposition actually can be detected, without harming any cats in the process.

A University of Maryland team headed by Keith Schwab has created a “nanoscale resonator”, essentially an incredibly tiny pendulum on a chip. On the same chip, they have crammed a “single electron transistor”, which changes its current based on changes in the position of the resonator. With this device, they have demonstrated the Uncertainty Principle's measurement “back-action.” Essentially, when you measure the position of the resonator, you alter its momentum, which causes it to heat up and act “noisier” than it would if the measurement hadn't been done.
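
The size of that disturbance has a hard floor set by the uncertainty relation Δx·Δp ≥ ħ/2. The position resolutions below are assumed numbers for illustration, not parameters of Schwab's actual device:

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_momentum_kick(delta_x):
    """Smallest momentum disturbance allowed for a position measurement
    with resolution delta_x, from the Heisenberg bound dp >= hbar/(2*dx)."""
    return HBAR / (2.0 * delta_x)

# The more precisely you pin down the resonator's position, the harder
# the unavoidable "kick" -- which is the heating described above.
for dx in (1e-9, 1e-12, 1e-15):  # metres (assumed resolutions)
    print(f"dx = {dx:.0e} m  ->  dp >= {min_momentum_kick(dx):.2e} kg*m/s")
```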

Just another day at the lab, you say, because this effect has been observed often. But, using a technique known as electron tunneling, they can cause the transistor to absorb energy, which causes the resonator to cool down from 500 millikelvin to 300 millikelvin. To put this in perspective, zero kelvin is absolute zero, where nothing moves. We're talking chilled here.

It is just possible that at this ridiculously low temperature, the resonator could enter a state of superposition, which the scientists hope to actually observe. In other words, they would see the resonator both “dead” and “alive”.

What they're looking for is the point at which classical physics gives way to quantum physics, a boundary that has thus far eluded experimental discovery. Think about this for a moment. The idea of superposition essentially means that all possible states exist simultaneously; until now, the act of observation has always caused the wave function to collapse into one state. So Mr. Schwab is looking into a realm that has only been dreamt of by theoreticians and science fiction authors.

Not to mention being experienced by Schrodinger's cat.

References:
In Search of Schrodinger's Cat, John Gribbin, Bantam Books, 1984
The Fabric of the Cosmos, Brian Greene, Alfred A. Knopf, 2004

Tuesday, September 26, 2006

Neanderthals in the News

It is impossible to overlook the extent to which civilization is built upon a renunciation of instinct. ~Sigmund Freud

In recent years, Homo Sapiens has developed an increased interest in Homo neanderthalensis, based on what the Sapiens writing this blog sees. In the last three or so years, there have been a number of programs and articles delving into how the Neanderthals lived, how they may have dealt with the Cro-Magnons, and whether they are capable of buying auto insurance online.

Neanderthals were misunderstood for a long time. This was due to the basic error of making a lot of deductions from one set of bones. Based on the original Neanderthal skeleton that was found, it was determined that the species walked hunched over, almost dragging his knuckles in the dirt. From that it was assumed that he was brutish and stupid. Unfortunately for the learned gentlemen who made these determinations, the skeleton they had found was that of an old man severely afflicted with arthritis. He certainly would have walked in a stooped manner, but his peers, as it turns out, did not.

As more bones and artifacts turned up, the opinion of Neanderthal flipped 180 degrees. Now he was athletic, strongly built, and admirably adapted to the harsh climate of his time. He was pretty successful, too, lasting for around 200,000 years before dying out sometime around the time that Cro-Magnon made it into Europe. At least one enthusiastic scientist said, on a Science Channel program, that Neanderthals were built like Mr. Universe. If the image of a body builder with the head of a Neanderthal bothers you, join the club.

It now appears that the period of coexistence might have been longer than anyone expected. A site has been discovered in Gibraltar that was inhabited by Neanderthals as recently as 24,000 years ago. With Homo Sapiens coming to Europe around 35-40,000 years ago, that allows for a considerable period of overlap during which the two groups would have coexisted and perhaps interacted. The possibility has even been raised that the two species might have interbred. So far, although there have been a couple of tantalizing discoveries, genetic data seems to speak against the possibility.

So why all the interest in this character? Well, perhaps one reason is to make up for all the misinterpretations of years gone by. It's almost as though there's been a pro-Neanderthal movement to make up for the maligning of those folks for so many years. To be sure, the picture of these ancient men and women has become a good bit clearer in recent years.

As the picture has become better defined, it's also possible that scientists are more concerned about the role Cro-Magnon played in the demise of the older species. Given what we know of our own behavior, it's not out of the question to imagine that Homo Sapiens showed up and started wiping out Neanderthals willy-nilly.

We have recently been cleared of the great Mammoth Murder Caper (it's pretty well determined that climatic changes did the bulk of the damage to the woolly mammoth), so it would be a shame to find that we killed off another human species.

It's certainly not out of the question for modern man to have had a hand in the demise of the older species, but it's unlikely in my mind that Cro-Magnons did all of the damage, just as they did not do all the damage to the mammoths.

Neanderthal himself was ideally suited to a particular set of climatic conditions, so it was not good when the cold conditions began to change. Yet one would think that the species would adapt, if for no other reason than his big brain. Neanderthal brains were actually slightly bigger than ours, so they certainly should have had the smarts.

Maybe they didn't. Consider that over 200,000 years, Neanderthal culture changed little if at all. In just 40,000 years, Homo Sapiens has gone from hunter-gatherer to sending robots to visit the planets. It's as though Neanderthal brains were hard-wired with all the right instructions to be a successful caveman in the ice ages, but they carried no instructions for dealing with a changing environment or improving their methods of doing business.

There's a great deal of debate over whether Neanderthal could speak. The physical key to speech is a tiny bone called the hyoid that's somewhere in the vicinity of the larynx (sorry, my knowledge of anatomy stinks). Without it, you can't make most of the important noises needed to make complex words. Until the last few years, it was thought that Neanderthals didn't have hyoid bones, but at least one has been discovered. Therefore, they did have the physical capability of speech. Brain casts, though, paint another picture in that the areas associated with human speech don't seem to be very well developed.

Now it's no sure thing that speech was a great determiner in progress, but the ability to communicate ideas in abstract terms has to be an important factor. If Neanderthal couldn't progress beyond some grunts and gestures, it might explain his slow, almost stalled, development.

Interestingly, some new research is looking at the human family tree in a different light. Traditionally, the line from primates to Homo Sapiens is pretty direct with some offshoots, of which Neanderthals are one. In other words, we're the culmination of the evolutionary process, while species like Neanderthals were just dead ends. Now, the view is being put forth that the line actually runs directly to Neanderthals, while it's Homo Sapiens that is the offshoot.

Modern humans have many unusual anatomical differences from other hominid species. For example, with the exception of the occasional middle linebacker or nose tackle, modern humans completely lack the strong brow ridges that characterize species all the way back to chimpanzees. It turns out there are many other examples. In other words, we're the ones who diverged from the primate family tree, not the Neanderthals. We're the mutants.

It's an interesting supposition. No one knows why modern humans suddenly appeared, although climate changes in Africa were certainly a factor. But there must have been some other factors that gave us the adaptable brain, speech, planning skills, and all the other things that have led to us getting where we are. But we should also keep in mind that with a species lifetime of under 100,000 years, we're still evolutionary teenagers compared to Neanderthals and positively infants compared to Erectus, for example. Given our own upcoming climate change, ever increasing population, and changing social and economic conditions, I'd say we're going to find out just how adaptable we are.

It would be a shame to be just another evolutionary dead end.

Thursday, September 21, 2006

The Galileo Caper

Most institutions demand unqualified faith; but the institution of science makes skepticism a virtue. ~Robert K. Merton

I'm sure you were relieved to learn that Pope Benedict XVI is holding a "private seminar" at the Vatican to assess the Church's position concerning Darwinian evolution. Now, given that John Paul II stated in 1996 that evolution was more than a hypothesis, you'd think there would be a sufficient precedent to go on, but evidently Benedict wants to come up with something more definitive.

Rather brings to mind the Galileo caper.

It is a mistake to assume that the Church refused to accept scientific progress. What was important to the authorities was to control the release of new science with the Church's approval and, more importantly, the Church's interpretation relative to Catholic dogma. However, Rome was also wedded to Aristotelian science; any theories that went beyond that had to be treated with care. One such theory that definitely went beyond was the Copernican view of the solar system.

Copernicus himself was in no hurry to publish his work, knowing full well that it was running against the grain of centuries of belief in an Earth-centered universe. He knew that he wasn't answering some important questions, basically because there was still a lot to learn. For example, one significant objection to his theory was the fact that, if the planets circled the sun, why didn't they fall into it? However, an aged Copernicus finally agreed to publish his work, which didn't happen until after his death, in 1543. Even so, it contained an unsigned preface, added by Andreas Osiander (the clergyman who saw the book through the press), with the disclaimer that the book should not be taken as a theory of how the universe actually worked, but as a mathematical construct that conveniently fitted the natural world.

About seventy years later, Galileo Galilei started looking through his telescope and began confirming the Copernican system. He listed his discoveries in a book called The Starry Messenger, but he wisely avoided any direct statement that these observations supported Copernicus. Galileo was no dummy; he preferred to work with the Church to get official sanction for his work, an excellent alternative to being burned as a heretic. So he visited Rome in 1611 where he was positively received by Pope Paul V. He presented his work to a committee of Jesuits (a very tough crowd) which concluded that:
  1. the Milky Way consists of a huge number of stars,
  2. Saturn has an oval shape that looks as though there were objects on either side,
  3. the Moon's surface is scarred and pitted by craters,
  4. Venus has phases like the Moon,
  5. Jupiter has objects circling it.
The Church put its imprimatur on Galileo's observations but stopped short of making any comment approving the Copernican system.

So, at this point, Galileo is in good with the Vatican. What went wrong?

Feeling confident in his relationship with Paul V and Cardinal Bellarmine (the power behind the papacy), Galileo began to be more open about supporting the sun-centered universe. Eventually, he was back in Rome, this time in front of a group of Inquisitors. Bellarmine tried to broker a compromise (there is confusion over exactly what happened) that ended with the Inquisitors creating a set of "minutes" for the official record that held that Galileo could not "hold, defend, or teach" the Copernican theory. However, Bellarmine and Paul V made sure it was clear that Galileo had done nothing wrong; he had only been informed of what was a general edict from the Catholic Church.

By 1624, there was a new pope, Urban VIII (there was another in between, but he doesn't matter to the story). Once again, Galileo, who had published a book on comets, came visiting and hit it off with the new pope. As a result of the meetings, Urban VIII granted Galileo permission to write a book about both the Ptolemaic and the Copernican systems, so long as he simply described them but did not argue in favor of the Copernican view.

The book, Dialogue on the Two Chief World Systems, was set as a dialog between two characters and a moderator. A character called Salviati supported the Copernican idea, while Simplicio (remember that name) presented the Ptolemaic system. Sagredo was the supposedly impartial moderator, and therein lay the problem. As the book progresses, Sagredo begins more and more to support Salviati over Simplicio.

Despite this, things seemed to be going well for Galileo. It was passed by a Catholic censor with only a request for an introduction and conclusion explaining that the Copernican theory was presented only hypothetically. The censor sent along suggested text, with the comment that Galileo could revise it as long as the meaning was not changed. Galileo made one small addition, which clearly indicated that the preface and conclusion did not represent his own views.

Then it was noticed that Simplicio was using verbiage that was close to wording used by Urban VIII. Now, the name Simplicio can be taken to indicate a simpleton, and his presentation in the dialog, coupled with Sagredo's support of Salviati, could be taken to mean that Galileo was implying that the Pope was a simpleton. Not a smart move.

So back to Rome came Galileo, and this time the welcome was not warm. The Inquisitors dug out the old minutes and accused Galileo of breaking their instructions. Cardinal Barberini, Urban's nephew, was still a supporter of Galileo and worked to mitigate any punishment, which ended up being house arrest. Legend says that, as Galileo left the Inquisitors, he mumbled, "Eppur si muove" (and yet, it moves). No one knows for sure whether he did, and if he had been heard saying it, he probably would have suffered torture or burning at the stake. Galileo was a stubborn, impatient, and contentious man all his life, but it's unlikely he was that stupid.

It's possible that had Galileo played by the rules, he could have publicized the Copernican system without ending up before the Inquisition. The Church was ultimately left with its Aristotelian framework while the rest of the world moved on. Eventually, hundreds of years later, the Vatican would admit that it was wrong in condemning Galileo, as if the rest of the world, including all Catholics, hadn't figured that out already.

If Benedict's Star Chamber session on evolution doesn't keep history in view, we're going to need another admission of error somewhere down the line. Maybe in a few hundred years, give or take a few decades.

Reference: The Scientists by John Gribbin (Random House, 2002).

Tuesday, September 19, 2006

Deep Background on the Background

But in science the credit goes to the man who convinces the world, not to the man to whom the idea first occurs. ~Francis Darwin

Hardly a week goes by without some bright scientist announcing new observations or an analysis that promises to unhinge some long-established theory. Today's conjecture comes from Richard Lieu at the University of Alabama, who has announced that observations of nearby galaxies do not reveal the distortions in the cosmic microwave background (CMB) radiation that theory predicts should be there. Therefore, he says, the Big Bang theory is in doubt. Note that the CMB is not in doubt, just a predicted effect caused by the radiation interacting with clusters of galaxies.

Predictably, there is skepticism about the data. The data was taken using WMAP, a satellite that has been taking extensive data for some time on the CMB. The trouble is that the resolution of WMAP may not be sufficient to detect the distortions; scientists using ground-based radio telescopes and WMAP have, in fact, already confirmed the distortions in distant galaxy clusters. Mr. Lieu says, naturally enough, that WMAP's resolution is fine for nearby clusters, so he isn't buying that argument.

But, a Harvard astrophysicist points out that even if Mr. Lieu's team's observations are correct, that says less about the Big Bang theory than it does about our limited knowledge of how clusters of galaxies work. Mr. Lieu's snappy rejoinder to this? "That I do buy. I myself am not at this point prepared to accept that the CMB is noncosmological and that there was no Big Bang. That would be doomsday."

Well, he may not accept it, but he's saying it somewhere because David Spergel of Princeton is quoted as referring to the team's conclusions, which implies that at least one of those conclusions is questioning the Big Bang. I've written before about this business of making pronouncements on slight or controversial data, but the story of CMB adds another dimension to why scientists might be a little quick to publish a theory.

Most people have heard the story of how Robert Wilson and Arno Penzias discovered the CMB using an old horn-shaped radio telescope at Bell Labs. Wilson and Penzias used the device to bounce signals off the old Echo satellite (actually a huge balloon); as a reward for a job well done, they were given access to the scope to do further work on radio astronomy. To do the job, they decided to make sure that the old horn was in tip-top shape. After cleaning and tuning it, they found that there was a hiss that occurred wherever they pointed the antenna. Figuring that this was a malfunction, they went through all sorts of gyrations trying to get rid of it, including removing pigeons (and their droppings) over and over again. According to Penzias, one day they got a man with a shotgun to do something about the pigeons. Penzias claims that he never knew what the man did, but the pigeons went away.

But the hiss didn't.

What some folks don't know is that a group of people over at nearby Princeton University, led by Robert Dicke, had come to the conclusion that the Big Bang should have left an "echo", the initial outpouring of radiation at the moment of creation. The radiation would now be very cold (less than 5 kelvin) and very redshifted. It would be so redshifted that it would be observed in the microwave portion of the spectrum. Jim Peebles had prepared a paper in 1964, only to have it rejected with the suggestion that he actually research the literature on the subject. It turned out that he had overlooked the fact that, about 20 years earlier, George Gamow had proposed almost precisely the same thing.
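
A quick sanity check shows why a few-kelvin "echo" has to land in the microwave band. Wien's displacement law puts the peak of a blackbody's emission at a wavelength of b/T, with b about 2.898 × 10⁻³ m·K; the 2.7 K used below is roughly the CMB temperature as it was later measured:

```python
WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K

def peak_wavelength(temp_kelvin):
    """Wavelength at which a blackbody at the given temperature glows brightest."""
    return WIEN_B / temp_kelvin

lam = peak_wavelength(2.7)     # approximately the measured CMB temperature
print(f"{lam * 1000:.2f} mm")  # about 1 mm: squarely in the microwave band
```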

Peebles did the research and found that no one had ever followed up on whether the CMB could actually be detected, so he and Dicke decided it would be a hoot (and a possible Nobel Prize) if they could find it. So Dicke put a couple of associates to work on a detector.

Meanwhile, Penzias and Wilson had decided that the hiss wasn't going away. If it didn't go away, it had to be important, but they didn't know what it was. So they decided to do a paper on the calibration methods for the antenna and mention the hiss in an "oh, by the way" manner. While preparing the paper, a colleague tipped off Penzias to Peebles' paper. Penzias got a copy, read it, and called Dicke at Princeton to tell them that they might want to come over to Bell Labs and look at some interesting data.

So, now everyone was excited. Peebles and Dicke were happy because someone found evidence of the CMB; Penzias and Wilson were thrilled because they could stop having the pigeons killed now that they knew what the hiss was. Oh, and they were kinda thrilled to have made a significant observational discovery (that probably outranked the pigeons, but my way is funnier). So the Bell boys (ouch) wrote a paper on their observations, while the Princeton guys wrote a brief paper on the CMB and its cosmological implications. And everyone looked forward to the Nobel Prize presentation.

It might have happened that way except for one thing: The New York Times found out about the story and gave it a page one treatment. And George Gamow read the newspapers.

Gamow, by now (1965) retired, was incensed. He wrote an angry letter to Dicke, essentially accusing the Princeton scientists of stealing his work. Gamow's associates, Ralph Alpher and Robert Herman, who had published specific predictions about the nature of the CMB in 1949, actually quit physics in disgust over the perceived slight.

Penzias and Wilson, meanwhile, escaped unscathed because they had just done the observational work. They didn't claim to have determined its significance or its origin. As a result, when the Nobel Committee finally got around to giving a Prize in 1978, they gave it to Penzias and Wilson for discovering the CMB. They gave no one anything for its prediction.

Now Peebles and Dicke surely did not set out to steal Gamow's work (and that of Alpher and Herman). Besides, they actually went beyond the earlier work. But there was precedent, and the Princeton team had not acknowledged it, even after Peebles had discovered the existence of the earlier work. Had they referenced the earlier work, it's possible that Peebles, Dicke, and Gamow all would have received a prize for the prediction of the CMB (assuming Gamow was still alive when it was awarded; Nobels are not given posthumously).

Scientists know about such things. Serious students working in the areas where Nobel Prizes are awarded can recite the history of winners and near-winners from their fields of study. No one wants to be one of those who missed the opportunity to grab the gold ring. Perhaps that's one more reason why scientists rush to make bold claims. Better to be proved wrong and suffer a little embarrassment than to hesitate and wonder what might have been.

So maybe the rush to publish is understandable, but history has taught us to take claims with a grain of salt until more data is in. That's why the Nobel people wait so long to give out awards.

They don't like being embarrassed in Stockholm.

Reference: Lonely Hearts of the Cosmos, Dennis Overbye, HarperPerennial, 1991

Thursday, September 14, 2006

Plutonics

A new scientific truth does not triumph by convincing opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it. ~Max Planck

The Pluto planetary predicament has now reached the silly stage.

At New Mexico State University, a protest of about 50 people was organized against the IAU definition of a planet. What is supposed to make this protest somehow more meaningful is that Clyde Tombaugh, discoverer of Pluto, came to NMSU in 1955 and started what has become a highly-regarded astronomical research program. His wife, Patsy, and son, Al, were at the protest, where Ms. Tombaugh opined, “I'm disappointed all this has happened.”

So am I, madam, so am I.

I have no personal knowledge of the late Mr. Tombaugh beyond the usual information about how he came to identify the little dot on the photographic plates that revealed what was then a ninth planet. But, somehow, I don't think he would be getting so bent out of shape about all of this.

And to cap the silliness, the California legislature, evidently having solved all the other problems of the state, took a daring stand and passed a resolution condemning the “mean-spirited” IAU for kicking Pluto out of the planet club. Of course, the fact that Disneyland is in California has nothing to do with this.

Frankly, I've always thought California politicians came from Fantasyland and were lineal descendants of Goofy or Mickey Mouse. As to their governor ... no, no, this is about science, not politicians. Suffice it to say that the opinions of the California legislature count for diddly-squat in this discussion, other than to point out just how emotional this debate has become.

I'm not going to defend the IAU, because they had a proposal, which wasn't very good, modified it in some manner which is still unclear into something that was no better and possibly worse, then used a vote of a small minority of its members to set a standard definition.

This isn't science, as I have already written.

Most definitions in science are based on the properties of the thing being defined. The scientific definition of an atom, for instance, has changed over time as our knowledge of the atom has increased. No one got bent out of shape when it was discovered that the nucleus contained neutrons and protons instead of being some solid lump. No one considers that electrons orbit the nucleus like little [ahem] planets any longer, but there was no protest when this model came into being.

The concept of a planet, though, is different. It's different because there is no real definition of a planet and never has been.

“Planet” means “wanderer.” When the ancients looked up into the sky, they saw the Sun, the Moon, and the stars. The stars were “permanent”, that is, from season to season, they kept their relative positions in the sky. The pole star stayed the pole star; Orion's belt didn't shift up or down or spread out. But, they couldn't miss the fact that there were a few objects up there that looked like stars but didn't behave like them. Eventually, these things were called “planets” and, with a few modifications as more objects were found, everything that circled the Sun and was round became a “planet.” Everything else was a “satellite” or an “asteroid”.

So, beyond being round and orbiting the sun, what is a planet? The answer is that we don't know. And extrasolar discoveries aren't helping. We're finding objects several times larger than Jupiter in strange, fast orbits or highly eccentric orbits. Are these planets or brown dwarfs (essentially failed stars)?

Well, then, how could we define a planet? Let's look at what we've got in our solar system that directly orbits the Sun. There are four rocky planets with solid cores in the inner part of the system. Then, we have the asteroids, which are probably left over rubble from the formation of the solar system. Beyond them, there are four gas giants, each a miniature solar system of its own, with satellites aplenty. Farther out, we have the Kuiper Belt, more leftovers, of unknown makeup, although we know that some comets, generally short-period ones, come from there. Even farther out, we have the Oort Cloud surrounding the solar system on all sides, also a source of comets, mostly ones that are very long period.

With all this diversity, how do we decide that something is a planet or not? What makes a large Kuiper Belt object not a planet? What is “large”? At one time, four asteroids were called planets. Eventually, as the asteroid belt was mapped and the true size and composition of the asteroids became better known, it was clear that these objects were somehow different.

There's no solution that satisfies everyone, because the issue is not about what a planet is. The issue is over whether Pluto should continue to be called the ninth planet. Moreover, it's become a purely emotional question, either because people just love Pluto or because some egos became bruised by the method the IAU used to set a standard.

Let's set all the emotions aside for the moment. What Pluto is, almost certainly, is a Kuiper Belt object, more like a comet than like any of the traditional planets. What we need to do is properly define asteroids and Kuiper Belt members. We've got a pretty good idea about the asteroids; New Horizons will help, at least to some extent, in identifying Pluto's characteristics.

Astronomers need to stop concocting definitions to include or exclude Pluto or Xena (now officially known as Eris; sorry, Mr. Brown) or any other object. Instead, they should be looking at the characteristics of the solar system's constituent parts and classifying them accordingly. I'm sorry if some children will be disappointed if Pluto isn't a planet any more, but, if that's where the science leads, then that's where we should go.

We just may have to wait until a lot of people die off to get there.

Tuesday, September 12, 2006

Frog Marrow, A Bulging Moon, and A Round Table

The cure for boredom is curiosity. There is no cure for curiosity. ~Dorothy Parker

I seem to have accumulated some interesting items over the last few weeks, so let's lay them out before they get too awfully stale.

Where are all the Jurassic Park references?
Preserved bone marrow has been found in frog fossils that are over 10 million years old. This is potentially exciting because it is just possible that DNA could be extracted from such material. I applaud the author of the piece for refraining from painting pictures of prehistoric frogs leapfrogging out of a lab somewhere.

The idea of resurrecting prehistoric beasts seems to strike a chord in some people. When a mammoth was found, potentially intact, in a block of ice in Siberia, one television program went on at great length about how a breeding program could be undertaken to create a new generation of woolly mammoths. What we would do with a generation of mammoths when we can't seem to live with the elephants we have is debatable.

It's important to understand that, unless one could directly clone a creature from the past, any breeding with modern animals would result in an interesting animal, but it would not be a true representative of the ancient one. DNA recovery, even if it is possible from the marrow, has not reached a stage where a clonable sample could be obtained. We should recognize how much we may be able to learn from the preserved marrow, but we should keep it in perspective.

So far, writers have.

The Moon is bulgy.
I don't know that most people realize how important the Moon is to life on Earth. It generates larger tides than we would have without it, it helped sweep space in our vicinity to cut down on the number of large rocks hitting our planet, and it helps stabilize the planet's axis wobble (it doesn't eliminate it, but it does make it smaller).

Scientists would be interested in how the Moon came to be even without all that; considering that life might not have made it on Earth without the Moon makes the question of its origin that much more important.

I think it is generally agreed these days that a Mars-sized rock smacked the proto-Earth about 4.3 billion years ago. The stuff that was ejected settled into an orbit around the planet and eventually coalesced into the Moon. Its composition has been a source of discussion for years. It's not so much what it's made of (no, not green cheese) as how it's put together that has puzzled scientists.

The data from unmanned and manned missions to the Moon seemed to indicate that, rather than having a single rocky core like our own planet, the Moon has lumps known as “mass concentrations” or “mascons” (which sounds like an office at the Defense Department). The Moon was sort of a “plum pudding” with large masses distributed through its interior.

(Ever notice how food seems to be a favorite analogy for astronomers and physicists? The Moon is a plum pudding; the universe's expansion is like raisin bread baking. These people are not getting enough nourishment. Send a cookie to a scientist today.)

The other theory is that the Moon has a bulge due to the extreme stresses that occurred during formation. This brings to mind an old B.C. cartoon, where B.C. and Peter are contemplating the Moon, when Peter says that no one has ever seen the “backside” of the moon. B.C. says, “Gee, I wonder what it looks like.” The last panel is of the moon as seen from space ... with buttocks.

Moving along ... A group at MIT has run simulations to see what could cause a bulge in the Moon's shape during formation. While they came up with several possible scenarios, the one they favor would have the Moon's initial orbit being highly elliptical, rather than nearly circular, as it is today. This would have made for a remarkable set of views had there been someone around 4.3 billion years ago to see it. The Moon would at times appear huge in the sky, moving so fast that it would run through all its phases in one night. The next night it would be smaller as it moved rapidly away. The tidal effects would have been remarkable as well, not on water, since there was no liquid water at that point, but on the rock itself, causing the crust to flex and buckle.
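The timescale isn't as crazy as it sounds. Kepler's third law says orbital period scales with the 3/2 power of orbital distance, so a newborn Moon only a few Earth radii away would whip around in hours rather than weeks. Here's a back-of-the-envelope sketch; the 25,000 km starting distance is purely an illustrative number I've assumed, not a figure from the MIT study:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg

def orbital_period_hours(semi_major_axis_m):
    """Kepler's third law, T = 2*pi*sqrt(a^3 / (G*M)), for a small satellite."""
    t_seconds = 2 * math.pi * math.sqrt(semi_major_axis_m**3 / (G * M_EARTH))
    return t_seconds / 3600

# A hypothetical newborn Moon ~25,000 km out (assumed for illustration)
print(orbital_period_hours(2.5e7))    # roughly 11 hours -- phases in a single night
# The Moon today, ~384,400 km out
print(orbital_period_hours(3.844e8))  # roughly 659 hours, i.e. the familiar ~27-day month
```

An eleven-hour orbit would indeed carry the Moon through its whole cycle of phases while you watched.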

Of course, it's not the only scenario, but it certainly does create some remarkable mental images, none of which involve “backsides.”

Is it THE Round Table?
The Time Team (which used to be shown on History International) has been digging around Windsor Castle in England and found a structure that could be linked to the legend of King Arthur and the Knights of the Round Table. Now, if you've ever seen Time Team, one of my favorite programs, you know that, if they have a fault, it's that they have a tendency to draw rather large conclusions from small amounts of evidence. At least, host Tony Robinson does; his team of experts tends to be a bit more controlled.

In this case, what they think they've found is a circular room dating back to Edward III, a 14th century king of the realm. It was said to house a large round table around which the original 300 Knights of the Garter sat. The thinking seems to be that the setup may have helped give rise to the Round Table portion of the Arthurian legends.

Well, maybe. More interesting is that the Royal Family gave the Time Team unprecedented permission to dig around various royal dwellings, and the team didn't disappoint, finding pre-Roman flint and Victorian jewelry at Buckingham Palace, a 17th or 18th century seal used to stamp documents at the Queen's Scottish residence, and the aforementioned round building near Windsor Castle.

Most of these things may seem like small finds, but the Time Team has always done a great job of using their digs to paint a picture of the time that they are investigating. And small finds accumulate to tell us a great deal about what happened in the past.

I hope the publicity will give the H-I folks the thought that it might be time to bring the Time Team back to U.S. television. Their programs are wonderfully energetic and give the lie to the idea of archaeology as some sort of dull, dry pursuit.

Besides, any show hosted by the Black Adder's sidekick has to be worth watching.

Thursday, September 07, 2006

Bursts and Flashes

If we would have new knowledge, we must get a whole world of new questions. ~Susanne K. Langer

Years ago, when I was starting out in college, I tried majoring in physics, which didn't work out so well. After a considerable struggle, I ended up majoring in Management Science, a rather strange amalgam of economics, finance, behavioral science, and computer science (which left me with an impressive-sounding major that no one has ever understood, but it looked good on a resume). Physics took its toll on a lot of my fellow classmates, and for some reason, many of them gave astronomy a try, as if that were something easy. Now, I had taken a course in radio astronomy, so I knew that thinking astronomy was easier than physics was like thinking a volcano was less dangerous than an earthquake. Either one can get you if you don't watch out.

One thing that makes astronomy difficult is that you are trying to understand very complex systems that are very far away. Yet astronomers, along with astrophysicists and physicists, construct convincing theories to explain what they observe. Unfortunately, since the universe is large and our means of observing it grow more sophisticated all the time, astronomers keep finding things that upset the applecart, causing major reappraisals in thinking and intense (and sometimes acrimonious) debate. In science, this is called “fun.”

One of the current areas of “fun” involves Gamma Ray Bursters (GRBs). GRBs are titanic explosions that propel huge amounts of gamma rays in our direction. Originally, they were thought to be events occurring relatively nearby, in astronomical terms at least. For that to be the case, the GRBs should have fallen in the plane of the Milky Way, since nearly everything close to us lies in that plane. As more data was gathered, though, it was found that GRBs were scattered all over the sky, which meant that they were coming from a long way off.

Ultimately, thanks to timely observations, the distance to one of these things was determined, and it turned out to be very, very far away. This was disconcerting because, to generate the strength of gamma radiation detected over such huge distances, the energy being released had to be so enormous as to seemingly violate Einstein's famous equation, E = mc².

Now you can mess with a lot of things in science, but the mass-energy equivalency is not something to be tinkered with lightly. Yet, if the GRBs were viewed as traditional explosions, radiating uniformly in all directions, then some, if not each one, would have to be converting an amount of mass equal to all the visible mass in the universe into energy.

Somehow, that seemed unlikely.

But, there was an observed phenomenon that would explain the intense bursts. If, instead of a uniform blast, the gamma rays were ejected as “beacons” or “jets”, say from each magnetic pole, then the mass-energy conversion works just fine. Just such jets of radiation have been observed for years, first by radio astronomers, later by visual observations from Hubble. If the Earth lies in the path of the jet, we see a brilliant outpouring of gamma rays. So one problem was solved. But the question still remained: What are GRBs anyway?
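The arithmetic behind the beaming argument is simple: if the gamma rays are confined to two narrow, opposite cones instead of radiating over the whole sphere, the true energy is the isotropic-equivalent figure multiplied by the fraction of the sky the cones cover. A minimal sketch, where the jet half-angle and the energy are purely assumed illustrative values, not measurements from any particular burst:

```python
import math

def beaming_fraction(half_angle_deg):
    """Fraction of the full sphere covered by two opposite cones of the
    given half-angle: solid angle 2 * 2*pi*(1 - cos(theta)) out of 4*pi."""
    return 1.0 - math.cos(math.radians(half_angle_deg))

# Assumed illustrative numbers:
E_iso = 1e47                      # isotropic-equivalent energy, joules
theta = 5.0                       # jet half-angle, degrees
E_true = E_iso * beaming_fraction(theta)

print(beaming_fraction(theta))    # about 0.0038 -- a ~260x reduction
print(E_true)                     # about 3.8e44 J, a far less absurd figure
```

Seen straight down the barrel of a jet, we infer the huge isotropic-equivalent energy; the beaming correction brings the true energy back within what a stellar-mass event could plausibly supply.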

There has been no shortage of theories, including colliding neutron stars, matter falling into a massive black hole, even colliding black holes. Now, a new observation has astronomers thinking in a more traditional direction: Supernovas.

Supernovas would make excellent candidates for GRBs in a number of ways. First, supernovas are known to explode in a symmetrical way but not in a uniformly radiating manner. The best way to describe supernova explosions is to think of them as expelling matter and radiation in the shape of an hourglass. At the equator of the star, a disk of matter is ejected, but the main burst is from the poles (Eta Carinae is an excellent example).

Secondly, since there are a lot of stars, there are a lot of supernovas, which would explain why we see so many. Thirdly, early stars were bigger, shorter-lived stars, so we'd expect to see more GRBs at great distances, which we do. Things come together nicely.

However, not everything is so rosy. The supernova that was observed in conjunction with the GRB event lasted longer than normal. It also was less energetic than most GRBs, so, of course, scientists have created a sub-class of GRBs called “X-ray flashes”. I don't know if the IAU had anything to do with this (sneaky reference to Pluto; sorry).

Also, not all supernovas create GRBs. While this could be due to the directional nature of GRBs (that is, we see the light but the gamma ray burst is aimed in another direction), it's probably more a sign that not all GRBs can be explained by supernovas. Since GRBs are associated with black holes and not all supernovas result in black holes, it makes sense that there is yet another source or sources for the mysterious outbursts.

What makes this observation so interesting is that the star doesn't appear to be massive enough to collapse into a black hole. In addition, the GRB actually preceded the stellar explosion. So X-ray flashes may be a precursor to certain types of supernova events, but not all GRBs are necessarily associated with such events.

So, we have a new piece of information that, as is often the case, raises more questions than it answers. We are still a ways from knowing what GRBs are and how they fit into the great cosmic story, but we've come a tiny step closer. A lot more analysis will need to be done of the current data, and many more observations will need to be made. And, in the meantime, there will be much theorizing, debating, and arguing about what has been found.

As I said, this is what scientists call “fun.”

Tuesday, September 05, 2006

Respecting History

In the Cleveland Museum of Art, they have a mummy. When did the ancient Egyptians bury people in Cleveland? ~Zoltan

Oetzi and Kennewick Man have much in common. They are both important to understanding the period in which they lived, and they've both spent time in court.

Kennewick Man died around 9000 years ago in what is now Washington state. When he was found, scientists were excited about what they might learn about early inhabitants of North America. Native Americans, however, declared that he was an ancestor, and, by law, his bones had to be given to them for burial. After several years of court hearings, a federal court has finally determined the obvious: No present-day tribe can prove kinship to a 9000-year-old man, therefore, he can be studied by scientists.

Congress, in its ongoing attempts to avoid tackling real issues, is actually attempting, for the third time, to amend the law so scientists would have to return the remains to the current tribes living in the area. How Congress figures it knows who his descendants are is a little mysterious, but then most things Congress does are pretty strange. At any rate, one enlightened individual, Doc Hastings, representative from the district where Kennewick Man was found, is planning to introduce a bill to specifically protect scientists' rights to examine the bones.

None of this would be necessary were it not for the disregard and disrespect that has been shown to peoples around the world by legitimate scientists, museums, and “collectors” for years.

Take mummies. In the nineteenth century and earlier, mummies were routinely removed from Egypt to be ground up as “medicine” or to be part of parlor show unwrappings. Some were even burned as fuel. I often wonder how those folks would have felt if a bunch of Egyptians had shown up and started digging up English or French cemeteries to harvest bones to make fertilizer.

It's bad enough that the Egyptians themselves looted tombs and destroyed the mummified bodies of kings and queens, but for others, including scientists who should have known better, to have simply removed them to their own museums to dissect is even worse. The cost in information about Egyptian history is immense.

There is also the general removal of artifacts. Parts of the Parthenon frieze sit in London, England. The Greeks are not at all happy about this and never have been. Now, I imagine that some upstanding Englishman decided that the Greeks of the time weren't doing a very good job of keeping the Parthenon intact, so he decided that this portion, at least, would be better off in London than in Athens. At the time, he may have been right, but it is long past time for the pieces to be returned.

Unfortunately, the history of archaeology is filled with the removal of artifacts from their home countries to be studied and/or displayed far away. In the latter part of the twentieth century and continuing into the twenty-first, a good deal of restitution has been made, but numerous artifacts removed long ago will probably never be returned.

Zahi Hawass, Egypt's main man of Antiquities, has made it a priority to recover as many of Egypt's treasures as is practical. Recently, an exhibit of wonders from King Tutankhamun's tomb was loaned to the Field Museum in Chicago. One helpful soul, wanting to impress Dr. Hawass with the love the Field's director had for things Egyptian, told the doctor that said director had an Egyptian sarcophagus in his private collection. Dr. Hawass promptly went ballistic, threatening to pull the Tut exhibit unless the piece was returned to Egypt immediately. Ultimately, cooler heads prevailed, and the exhibit stayed. I imagine that some sort of negotiations have been made regarding the disposition of the sarcophagus, and that the helpful soul is now a janitor.

The problem is that such articles are almost always obtained illegally. The director may have purchased the sarcophagus in good faith, but he should have been aware that it could hardly have been removed from Egyptian territory legitimately.

The illegal sale of artifacts has been a major problem for years. Museums purchase items this way, claiming that they are saving them from being hidden in a private collection, but, since they know full well that they are supporting the thieves, this is a specious argument. If they were buying them to return them to the land from which they came, that would be one thing. Buying them to display is quite another.

Such actions merely encourage the looting of archaeological sites. Some sites look like the surface of the moon, with craters scattered across the landscape where looters have dug for treasures to sell on the antiquities black market. One site, the tomb of the Lord of Copan, was saved only because it happened to be under a corral where a bull was kept. The surrounding area looked like an artillery range.

Many of these looted items are bought by private “collectors”. I've never understood the idea of buying some ancient find and hiding it from the world in a private hoard. What can the motivation be for keeping knowledge from the rest of the world? What good is a wonderful object if you can't show it to anyone?

In North America, too, there is a long history of digging up burial mounds with no respect for those who might be buried within. That's why the laws regarding Native American ancestors are in place. Because of the actions of these despoilers, we lose our ability to gain insight into the peoples who roamed the continent in the centuries before Europeans, Vikings, or Chinese sailors (depending on your favorite theory) found it.

Some countries are fighting back, though. Italy, for example, has a task force that does nothing but track down illegal artifact dealers (and artifact forgers). More and more museums are working with the countries of origin to work out deals for returning long-removed treasures. There's light at the end of the tunnel.

And, some scientists have found that native peoples can be a help rather than a hindrance in hunting for artifacts and the remains of early North American settlers. “Lost World” is a fascinating book by Tom Koppel laying out new theories (at least as of 2003) about the earliest settlers of North America. In the book, he talks about the concerns of dealing with native groups about digging in tribal areas. Rather than being confrontational, the team asked the Tlingit Indians to participate in the discovery process. Remains were treated respectfully in accordance with tribal customs, but significant information was gathered that would never have been obtained otherwise. Everyone wants to know where they came from and who they were. Everyone also deserves respect.

And so do their ancestors.

Saturday, September 02, 2006

Pluto-nium

One cannot make an omelette without breaking eggs – but it is amazing how many eggs one can break without making a decent omelette. ~Prof. Charles P. Issawi

The revolt has begun. The metaphorical gates of the IAU are being stormed by angry astronomers, all because of a ball of ice and dirt called Pluto.

I first mentioned that the International Astronomical Union was going to take up the issue of what makes a body a planet here in a passing matter-of-fact reference. Before too long, I penned a more expansive piece dealing with the increasing emotionalism that seemed to swirl around the first proposed definition. This initial definition, developed by a committee chaired by Owen Gingerich, who is no slouch of an astronomer, increased the number of planets to 12. The trouble is that this definition had multiple categories (both official and unofficial) including “plutons”, a term that turned out to already be in use by geologists. The definition also promoted Charon to planethood, which didn't seem like such a good idea.

Another astronomer, Julio Fernandez, was so displeased with the committee's effort that he offered his own proposal for discussion. A press conference to present his idea practically turned into a revolution itself.

Finally, as I chronicled here, the IAU came up with and passed yet another proposal that demoted Pluto to something called a “dwarf planet”, dropping the number of “real” planets to eight but adding a flock of “dwarf” planets, led by Pluto and including Ceres (but not Charon), and the recently-discovered Xena (as it is unofficially named). This resulted in howls from many astronomers, who are now in open revolt against the IAU.

I tell you, this has been a blogging bonanza.

There have been stories all over about this, but my favorite headline came from the New Scientist site: "Astronomers Plot To Overturn Planet Definition". Can't you just picture a bunch of white-coated astronomers meeting in the middle of the day (they work nights, remember?) to concoct a scheme to pull a coup d'état against the evil overlords of the IAU? Can't you imagine a crack team of mutant ninja astronomers sneaking into IAU headquarters to steal the definition?

Well, no, neither can I, but it would be pretty funny.

What astronomers are doing is going very public about their dissatisfaction with the IAU. In the lead, as if he hasn't caused enough trouble (I'm joking, already), is Owen Gingerich, who, if you recall from a few paragraphs back, chaired the committee that created one of the definitions that wasn't accepted. Led by Mr. Gingerich, astronomers will hold a new conference to once again tackle the issue of what a planet is. These guys are serious; they have also prepared a petition, by golly, to show the IAU that they mean business.

All right, perhaps I'm a little overboard here (so what else is new?), but there is something comical about the fuss. I am reminded of the quote I used in one of the earlier articles from an astronomer at Johns Hopkins: "I think the whole debate is absurd. The fact (in my opinion) that Pluto is in a different class from the eight planets does not make it less interesting." Well said, sir.

Instead of solving anything, the IAU has created chaos. Some scientists are saying they will simply ignore the definition; some museums and planetaria are saying they will not change displays or presentations to conform with Pluto's devalued status. And we are liable to be looking at a competing definition in 2007, when the conference organized by Mr. Gingerich occurs. All of this distracts astronomers from the real work they should be doing, and that is a bad thing.

At the root of this whole brouhaha is the way the voting happened in Prague. While the IAU has thousands of members, only between 400 and 500 actually were left at the conference by the time the proposal came to a vote on the last day of the session. Even Mr. Gingerich had been forced to leave because of other commitments. He has called loudly for Internet voting to avoid such controversies in the future, which is a sensible enough concept.

The trouble is that, even if thousands of astronomers had voted, they would have had poor choices. Any vote would have been very close, ensuring even more controversy. I also suspect that, like me, there would have been a lot of astronomers who didn't like any of the definitions, for reasons I have discussed previously (see those links above, if you're curious). To make matters worse, as far as the IAU is concerned, the next meeting is in 2009. No one wants to wait that long to achieve some sort of consensus.

So, essentially, we're right back where we were before the IAU meeting in that there is no clear agreed-upon definition of a planet. What bothers me is that we will soon have at least two, if not more, definitions to choose from. This is also not a good thing.

We can make light of it, but teachers are going to have a genuine problem here, trying to explain to students how a concept as seemingly obvious as a “planet” can get so completely muddled. Textbook writers are going to have to waste time and effort explaining the same thing or, worse, taking one definition over another. Now this isn't earthshakingly serious; no one will build bridges that fall down because they haven't got a clear concept of what Pluto is. But, it demeans science itself if its practitioners can't settle an issue this basic.

At the bottom of this, the fundamental problem is that science took a back seat to politics. The idea of voting on what constitutes a planet is like voting on what constitutes a photon. This is not some legalistic exercise; it's a description of real objects that impacts how we define theories of planetary formation. We don't need committee meetings, conferences or votes. We need for astronomers (and perhaps astrophysicists and physicists) to step forward with definitions that are backed up with data. Let's have none of this “largest by far” and “cleared the neighborhood” nonsense.

Use things like size, shape of orbit, distance from the sun, physical makeup, and so on to put a specific set of parameters on the table for what a planet is. Publish it, and let the comments flow. Will we get bunches of definitions and even bigger bunches of comments? Probably, but that's basically the way most of our definitions have come about. Someone says that a certain class of stars has common qualities, so we say they're all of a type. Perhaps later we learn more about them that causes us to subdivide the classification or dispense with it altogether.

When Clyde Tombaugh discovered Pluto, we didn't know Kuiper Belt objects existed; now we do. Should all of them be planets or should Pluto be one of them? Personally, I lean toward the latter view (yes, that would mean eight planets; deal with it), but I could live with either if someone defines the difference between planets and the Kuiper Belt objects.

Or did I just start a whole new argument?