There should be no articles of faith in science, unless it be the faith that no discovery, no law, is so absolute that it cannot be superseded. ~Anthony Storr
Time to tackle a few odds and ends once again.
One of the intriguing things about anthropology, paleontology, and even archaeology is that discoveries that have been lying about in museums and university collections for years can, when found or looked at a second time, yield interesting new findings. Part of the reason for this is that there are a lot of bones and artifacts, and priority is given to the most immediately promising. In the case of bones, there is often a great deal of work to be done just to prepare them for study. Sometimes an item simply gets lost in the shuffle, cataloged away to be analyzed further when time permits. Then other, newer finds come along, and the earlier ones get pushed back.
This is not because these scientists are somehow callous or forgetful. Newer finds generally have better provenance and more complete dating information than older ones. They also tend to be better handled in the field, taking advantage of the lessons learned as the various sciences have matured.
Take a bit of hominid upper jaw that was lying around in the Torquay Museum collection in England. The jaw, with its three teeth, was dug up in 1927 and sat until the 1980s, when it was radiocarbon dated and found to be 31,000 years old. Recently, though, bones found in the same layer as the jaw were dated to between 37,000 and 40,000 years old.
That time frame raises the interesting proposition that the jaw could have belonged to a Neanderthal who had come to live in the British Isles. As I've said before, Neanderthals are a hot item these days, which is probably one reason for renewed interest in this particular jaw and its three teeth.
Current investigators are trying to determine whether the teeth are in the right place and whether they have been damaged, since teeth are normally an excellent source of DNA. But, as noted in the referenced article, even museums didn't take the care with bones in the 1930s that they do today. Contamination is a distinct possibility.
But if DNA could be extracted, it would definitively establish whether the teeth belonged to a modern human or to Homo neanderthalensis. If the latter, it would be evidence that the Neanderthals got to Great Britain much earlier than previously thought.
You just never know what you might find in the back room.
Cosmic Rays Again?
It seems that when Neanderthals aren't in the news, cosmic rays are. I recently wrote about their possible impact on the weather. Although that link seemed rather tenuous to me, it seems that some scientists are taking it very seriously. Basically, the theory holds that cosmic rays are a factor in cloud formation, and the more cosmic rays you have, the more clouds you have. If clouds increase, the amount of sunlight reaching the surface of the planet is diminished. Simply put, it gets colder. In fact, it can get cold enough to create ice age conditions that inhibit the chances for life to flourish.
It turns out that, based on determinations of the amount of carbon-13 at various points in Earth's history, a correlation has been found between periods of star formation and periods of cold climate. Heavy star formation generates large amounts of cosmic rays, which supports the theory of cosmic ray cooling.
The correlations are supposed to be so good that, according to the linked article, “the odds are 10000-1” against a coincidental relationship between star formation, the attendant cosmic ray increase, climatic cooling, and declines in living organisms. That a colder climate sustains less life is a fair statement, but I think the jury is still out on the star-formation-cosmic-ray-clouds connection.
For example, an alternative explanation might be that intense star formation (and subsequent star death) increases the amount of dust in our corner of the galactic neighborhood. Such an increase would also decrease the amount of sunlight hitting the surface of the planet and lead to planetary cooling.
I am not counting out the cosmic ray connection to climate, but I do think we're just beginning to understand this relationship.
It Was the Pencils That Were Expensive?
One of the enduring stories of early manned spaceflight concerns writing instruments. Astronauts needed to write things down, but a conventional pen uses gravity to deliver ink, so, according to the story, NASA had special pens made that would work in zero-g, each of which cost several thousand dollars. The Soviets solved the problem more simply: They used pencils.
According to the Scientific American article linked above, it just ain't so. It turns out that the American astronauts started out using pencils, too. Eventually, though, both sides began to use pens whose ink was pressurized with nitrogen to force it out the right end. Said pens cost only $2.39 each. So the pens were cheap, but why did both the Americans and the Soviets switch in the first place?
It seems that pencils are not very safe for use in space. They break and flake off tiny bits of graphite that would be no problem on Earth but would float around in a space capsule and could end up in an astronaut's eye or cause a short circuit somewhere. Oh, and graphite burns, too, which, after the Apollo 1 fire, was something NASA was sensitive about. So pens made good sense.
But there's a little kicker to this story. The mechanical pencils that U.S. astronauts used were pretty special. They were so special that they ran $128.89 apiece. When this purchase became known, NASA got some serious knocks for blowing that kind of cash on a pencil, so they started looking very hard for a replacement. It turned out that the Fisher Pen Company had spent $1 million of its own money to develop a "space pen." Actually, it had been designed to write upside down, underwater, and in frigid conditions, making it ideal for use in space. And it sold for $3.98. The U.S. and the Soviets each bought a case, got a quantity discount, and both paid $2.39 per pen.
So there was an element of truth to the myth; the story just got it backwards.