Monday, September 28, 2009

Climatology: The Black Art

Climate is what we expect; weather is what we get. ~ Mark Twain

Okay, class, it's time once again to sum up the current foolishness about the global warming situation.

This article says we can fix everything by building ships that create clouds, which will be a whole lot cheaper than sending sun shades into space (I'm not making that up).

If you don't like the boat thing, there's a genius who wants to re-forest the Sahara. Forget that not too many months ago, global warming enthusiasts were warning against planting trees because they would increase the amount of carbon dioxide in the atmosphere. Instead, try to imagine the number of ways the climate of the planet might be completely screwed up. Hold that thought.

Here we are told it's the sun that is driving our climate. In fact, thanks to an incredibly quiet solar cycle, there are those suggesting we may be looking at a new Maunder Minimum, the seventeenth-century lull in sunspot activity that coincided with the Little Ice Age.

Some scientists seem to think we're in for a spell of global cooling. Oopsie. And that arctic ice melt-off? Postponed for the next 20 or 30 years.

The U.N. has pushed arctic melting out another 10 years, but goes on to say that cutting US CO2 emissions by 80% won't help. Well, that's a fine kettle of fish.

For our final bit of humor, we find the noted climatologists at Clemson decided that we have more hurricanes but they're not as severe (despite having heard the "more and meaner" drumbeat for years now). However, the problem is that they just have their geography messed up; it's Japan that's getting the stronger storms.

And I didn't even bring up the cosmic ray nonsense.

The simple fact is that no one knows what the climate is going to do. Even if they had a reasonable forecast of the climate, they wouldn't know what to do to change the future climate without creating a situation that's far, far worse.

The bottom line is that no one knows how the climate works. That's nothing to be ashamed of, because we don't even know all the factors that make the climate do what it does. We've had warm periods on the planet that are beyond even the wildest predictions of global warming fanatics. In fact, for much of the planet's history, there hasn't even been an arctic icecap during the summer. Yet during those times, like the Cretaceous, there wasn't a single factory or Hummer in sight.

What's frightening is people talking about geo-engineering the Earth. If we don't know how the climate works, how can we reliably judge how to fix it? The potential for creating totally unpredictable consequences is so high as to be nearly a sure thing. Look back at those stories and note the various disagreements.

Two years ago, I wrote about the chinks in the CO2 fortress. The reaction of the global warming guys, who had the upper hand with governments and the media at the time, was to simply ostracize those who disagreed with them (the same sort of thing happens with physicists who think string theory is bunk, but that's another article). It's time for some serious evaluation of data.

It's also time to recognize the role of the sun, the state of the Earth's magnetic field (which is moving and may flip polarity, as it has done many times in the past) and even cosmic rays on our weather. Global warming nuts simply pretend that data doesn't exist or warp it to fit their own preconceptions. And, in global warming, preconceptions are the name of the game.

In the 1970's, the prevailing climatological thinking was that we were heading for a new ice age. The hope was that the additional CO2 in the air would mitigate the effects. I am willing to bet those guys had some terrific data, too.

We don't need any cockamamie schemes for changing the climate. We need to improve our predictive abilities, quit relying on computer models that can't predict the motion of a tropical storm more than 15 minutes in advance, and quit using scare tactics to get hold of government grants.

If, in fact, global warming is real, we aren't going to be able to change it. Sure, it's a good idea to cut down CO2 emissions, if only because the reason they exist is because we need to quit using inefficient energy generation methods. We should get rid of fuel-guzzling vehicles, because they don't make sense in any way, shape or form when there are alternative methods of getting from one place to another.

Oh, and we should be careful about those alternative propulsion methods. People are hyping hydrogen-powered vehicles because their only emission is water. Well, if those cars become the standard, we'll be putting massive amounts of water vapor into the air.

Care to speculate on what that might do to the climate?

Sunday, September 20, 2009

The Trouble With NASA

I understand that NASA reported that there's new evidence of water on Mars. I'm here to report that we still don't have any evidence of affordable gasoline in Michigan. ~ David Bonior

It was inevitable that reality would set in with respect to the US space program. The reader may recall when George W. Bush, trying to distract everyone's attention from the various disasters of his administration (including that lack of affordable gasoline), came up with his "plan" for space exploration. The "plan" called for a return to the moon and then a mission to Mars, all conveniently to occur after he was out of office. To perform all these wonderful things, NASA was to get more money. Also, the shuttle and the ISS were to be scrapped.

One would think that there would have been some sort of outcry about shutting down a space station that would take 20-plus years to build after just 5 years of useful life, but this bit of twisted logic seemed to escape the notice of most people. Some did wonder how exactly the ISS would be supplied without a shuttle, especially given that NASA didn't have anything that would be ready in time.

Oh, there was the nonsense about "commercial" space flight picking up the slack. "Commercial", of course, means private companies taking taxpayer money without NASA oversight, which in turn means getting not a damn thing for our money. Events so far have shown that so-called "commercial" space ventures beyond Proton and Ariane (both funded by various governments) have been so much moon dust.

None of this would have mattered, of course, because if the space station couldn't be supplied, it would just be brought down a couple of years sooner.

I've never thought much of the ISS because it's never seemed to have a purpose, other than to suck up money that could have been used for space science and exploration. To turn around and trash the thing, though, is a complete travesty. I mean, as long as it's there, let's get something out of it.

Now a Presidential committee has come along and admitted what we all knew to begin with, that the Bush "plan" was a joke all along.

The trouble is that NASA has been a bit of a joke for a long time now. Take a look at this little slide show of abandoned NASA projects. There's a theme that runs through all of them. Cost overruns and various changes of mind caused most of them to have their funding yanked.

Now part of the problem has always been that when Presidents and Congress are looking to pretend to worry about spending, the first thing they cut, even before education and human services, is NASA's budget. After the heyday of space exploration that took us to the moon in the first place, NASA has watched one President after another promise all sorts of things while whacking its funding.

But, that doesn't really explain why NASA can't stick to a budget once they start on a project. The problem here is more complex, just as it is in business. Part of the problem in developing any project is trying to think of everything that you need to consider. In the euphoria of a new objective, most people, whether corporate or government, will understate the costs that will ultimately be incurred. NASA, though, has made cost overruns standard practice.

Partly, this is because the head of NASA is an appointed official. Presidents come in and put their genius into the director's seat. The new director, either under orders from the boss or just because he can, proceeds to shuffle positions, change procedures, revisit all current projects, and redo all the objectives. As a result, projects that were killed a few years ago are suddenly resurrected while ongoing projects are dumped, only to be reanimated when the next guy comes in.

In the unlikely event a project actually survives the change at the top, the contractors (like Lockheed) will think up all sorts of little goodies that ought to be added, or the project managers start changing the mission ("Y'know as long as we're out there we might as well ..."). As a result, costs keep going up.

The problem is that the US still has no national science, technology, or exploration policies. Bush's flight of fancy about going to Mars had no rhyme or reason, no stated goal. When Kennedy dictated that we would go to the moon, it was part of a program to improve US science and engineering education (begun by Eisenhower, when the Soviets embarrassed the crap out of the US). But even Kennedy, and later Lyndon Johnson, never seemed to have an idea of what would follow. That made it very easy for Nixon just to call the whole thing off.

President Obama's committee has shown we don't know what we're doing or why. Now the challenge is to come up with a real plan.

You'll excuse me if I don't hold my breath waiting for one.

Sunday, August 09, 2009

A Cure Worse than the Disease?

The trouble with weather forecasting is that it's right too often for us to ignore it and wrong too often for us to rely on it. ~Patrick Young

The Copenhagen Consensus Centre, whoever the deuce they are, commissioned a study to see how much we could totally screw up the climate. Well, not really. What they said they were looking for, I guess, is a way to combat global warming. What the group came up with is a fleet of ships spritzing salt water into the air to create clouds which would cool the planet. The cost is a mere bagatelle, under $10 billion. It is especially cheap compared to the other idea they were given, to deploy lots and lots of little sunshades into space, which would have pushed $500 trillion.

There are only two little problems with this kind of thinking. First, it assumes that man-made emissions are the evil causing global warming. Second, it assumes that, since we created the warming, our puny efforts can reverse it.

Odds are the think tank and other "mankind is evil" supporters are about as wrong as they can be. I've written before (here and here, for example) about how there's more to the global warming picture than a lot of people like to consider. Since I raised those points, more has come to light.

A recent study has raised significant issues as to whether our current climate models are accurate. It turns out they don't do a very good job of accounting for the last big warming episode, which occurred about 55 million years ago. Basically, the study shows that carbon dioxide can't account for the amount of warming that was observed. Therefore, there was some other agent involved, but, as yet, no one has determined what that could be.

(There's a great opportunity for a joke about dinosaur flatulence here, but they had been gone for 10 million years when this warming event occurred, so I'll have to let it pass. Durn it.)

Then there's this article in which we learn that ice ages are quite probably caused by one of the wobbles in the Earth's tilt. What does the wobble do? It changes the amount of sunlight that certain parts of the Earth receive. The changes in warming patterns generate the ice ages. In between the ice ages, of course, we have warming.

One of the things most people like to ignore is that over most of the Earth's history, the planet has not had polar ice caps of any significance. The reason we have them is that we are at the tail end of an ice age.

Okay, you say, but global warming appears to be a fact, so a fleet of cloud-making ships still might be a good idea. Well, it might be if we actually had a clue how the climate worked. Given that we can't even make a decent prediction of how many hurricanes will occur in a given year, it's hard to see how we can actually be sure that a) global warming really is happening, and b) our cloud making will not create some worse situation.

The fact is that, when it comes to the climate and weather, meteorologists are taking stabs in the dark. Take hurricane track prediction. NOAA and others will go on about how great their predictions for where a hurricane will land have become. They get their predictions by averaging around 30 computer models. One of the weather web sites, a couple of years ago, added a map showing the predictions of all the models. It was comical.

The majority of the models would have the storm tracking in generally the same direction, although "generally" meant the storm could land anywhere along a 1000-mile stretch. But what was hilarious was that some models predicted totally outrageous paths (including one that showed the storm remaining a hurricane after passing over the entire midwestern US and entering Canada). This is dart board stuff.
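To see why averaging a spread-out ensemble is dart-board stuff, here's a toy sketch. It resembles no actual NOAA product, and every number in it is invented for illustration: a couple of outlier models leave the "consensus" looking precise while the spread covers a huge swath of coastline.

```python
# Toy ensemble average of hypothetical model landfall predictions.
# Positions are made-up longitudes (degrees west) along the Gulf coast.
model_landfalls = [88.0, 89.5, 90.2, 91.0, 89.8, 90.5,  # broad agreement
                   97.5, 75.0]                           # two outlier models

# The "consensus" is just the mean of all model predictions.
consensus = sum(model_landfalls) / len(model_landfalls)

# The spread is the full range the models actually cover.
spread = max(model_landfalls) - min(model_landfalls)

print(f"consensus landfall: {consensus:.1f}W")
print(f"model spread: {spread:.1f} degrees of longitude")
```

With this invented set, the eight models "agree" on roughly 89°W while spanning 22.5 degrees of longitude, which is exactly the dart-board problem: the average looks like a forecast, but the ensemble behind it barely constrains where the storm goes.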

When a science is so weak that it can't accurately predict the path and strength of a hurricane over a 72-hour period (or a 24-hour period if you mean really accurately), you don't want its practitioners offering suggestions to alter the climate.

The Earth wobbles on its axis (a motion known as precession), its orbit around the sun wobbles, and the sun itself varies in brightness. Oh, and don't forget that the continents are moving around, altering ocean currents and atmospheric circulation. All of these things affect our climate. I suspect no model ever created has taken all of these effects into account.

Chances are good that whatever cloud-making ships would do, it wouldn't be what the weather-guessers would be expecting.

It's clear that we need to be aware of climate change and that we should be taking action to prepare for, say, the rise of the oceans in coastal areas. We should also be conducting research into the effects on crops, working to create drought-resistant, heat-tolerant food crops. Even if we don't need them, there will always be areas on the planet that could benefit from such research.

If we want to reduce carbon emissions along the way, that's okay, because there are good reasons to do so, most of which have to do with less dependence on expensive (not scarce, just monopolistically expensive) fossil fuels.

But we should quit kidding ourselves that we can modify the climate through our puny efforts. All I suspect that we're capable of is creating catastrophic weather conditions that none of those inaccurate models have predicted.

Until they can pass the test of predicting the past, climatologists should quit mucking with our future.

Sunday, July 26, 2009

Nitpicking the First Person

Some editors are failed writers, but so are most writers. ~ T. S. Eliot

In the other blog, I recently wrote a little ditty discussing a story about The Elements of Style, which has been the semi-precious-metal standard for writing for some years now. One of the elements I mentioned was that the use of the first person singular should be avoided unless the writer's subject was purely personal. Evidently, this style point has ceased to be taught in professional or journalistic writing classes.

I regularly read three magazines: Archaeology, The Smithsonian Magazine, and Biblical Archaeology Review. I have noticed an annoying trend that shows up in all three publications, so I must assume it's showing up in other non-fiction writing as well. What I'm talking about, if you haven't got my drift yet, is the ever-increasing intrusion of the use of the first person singular in articles that are not personal in nature.

An example of what I'm trying to describe (in my fumbling way) appears in the current issue of one of the aforementioned magazines (name withheld to protect the guilty). In one article, the writer leads with "Shielding my eyes from the glare of the morning sun, I look toward the horizon ..." and goes on to tell us how she is at a particular historical location. A little later, she continues that she has arrived at the site (as if we were worried she wouldn't make it), saying "I drive partway up the mountain where I will meet" the person who is the expert on the site. "I am the sole visitor," she tells us for no good reason. Then: "At a kiosk, I buy a ticket that lets me ascend on foot ..."

I allow the reader to ponder why someone who is coming to do an interview should have to buy a ticket to get to the interviewee.

At any rate, the travelogue continues for a few more paragraphs before the author relents and actually gets to the subject of the article.

Now the critical reader might complain that I am certainly no slouch at using the first person myself. But, this is a blog involving my opinions. There is a lot of me involved here, as one would expect. However, if I were to write an article about, say, a dig site where a major discovery had been made, I would certainly not spend time telling you how hot the day was when I got there, how tiresome the hike to the area was, and how scenic I found the views to be. I would spend a lot of time telling you about the dig and what had been discovered, quoting extensively from those involved in the dig and from other experts in the field. The only personal intrusions would be if I wished to express an opinion on the findings, and then only to make it clear that it was my opinion and not someone else's.

That's what you'd expect, and you'd be right to do so.

To be fair, the article I've been discussing does eventually get to the point and provide some interesting information, but the writer takes her sweet old time getting there. And she's not alone. I've seen this trend in article after article, and, frankly, I'm tired of it.

Now, there are lots of good articles that need the first person. For example, The Smithsonian Magazine has a series called "My Kind of Town", where well-known authors talk about their home towns. I'd expect a lot of first person in that; after all, the stories are as much about the influence of the town as about the bricks, mortar, and people who make it up.

Or, say the article is written by the person who has made a particular discovery. It would be hard to avoid the use of the first person, although most such articles tend to use the first person plural, because many discoveries involve the work of a team. The only time the use of the first person gets in the way in such stories is when the author gets into the same travelogue mode I discussed above.

It's rather like what Sherlock Holmes said to Watson one time when Watson was describing the results of an investigation Holmes had asked him to do. As Watson began to wax eloquent about the foliage and scenery, Holmes cut him off with, "Cut the poetry, Watson, and get to the point."

That's exactly what I'm trying to say to these writers who insist on telling us how hot they are, how amazed they are, how dirty they are, or how cold they are. I don't really care, and I suspect the average reader of these periodicals doesn't care either. The reason folks like me subscribe to these magazines is to learn new stuff. The fact that it's hot in the desert or cold in Alaska isn't new.

So, cut the poetry, guys, and get to the point. We'll all appreciate it.

Monday, July 20, 2009

From the Earth to the Moon -- and Back

Heroes abound, and should be revered as such, but don't count astronauts among them. We work very hard; we did our jobs to near perfection, but that was what we had hired on to do. In no way did we meet the criterion of the Congressional Medal of Honor: 'above and beyond the call of duty.'~ Michael Collins, command module pilot, Apollo 11

[This is a bit of a memoir that I also published on Gog's Blog, but it sort of belongs here, too. Well, they're my blogs and I can do what I want, so here's the piece.]

Do I remember where I was on July 20, 1969? You bet your Aunt Fanny's bloomers I do. I was at a friend's house where we huddled around his old TV set watching astronauts land and walk around on the moon.

I've been watching some of these programs over the last few days about the Apollo mission. Some of them have been pretty good, but I do get tired of the many shows that harp on the "primitive" technology available and how fallible all the equipment was and how it's amazing we got there at all.

Well, here's a bulletin for all those young producers: The technology was a quantum leap over what NASA had in 1961 when the project started. Those pitiful little computers were more powerful than anything that had ever been used before. And the people involved were the most remarkable aggregation of genius and determination since the Manhattan Project -- and this time a city didn't have to be vaporized in the process.

Yes, we lost three astronauts, the price of the builders not listening to what the geniuses, in this case the guys who were going to fly the thing, were trying to tell them. When they did listen, NASA ended up with a flight system that could overcome an explosion in a fuel cell and return its crew intact. The Saturn rocket, which, if you believe the current shows, was just waiting to blow up at any second, is the only U.S. rocket never to have had a failure.

Now, we can't keep the toilet working on the ISS.

But, I'm not going to crab about the current space program, because I prefer to remember when we had inspired and dedicated people working toward a concrete goal. Sure, you can argue it was done because of cold-war politics. But it was still a magnificent example of what people can do with a purpose.

Michael Collins, the often forgotten man of Apollo 11, offers a collection of thoughts. I was struck by the quote that starts this article. Mr. Collins is making a point lost on so many people. There are brave people doing their jobs every day, but calling them heroes devalues the term. The astronauts came into the program with eyes wide open; this was their job, and they did one hell of a fine piece of work.

I suspect he feels the same way about the way the word "great" is thrown around as well.

He has no use for "celebrity" either, calling it an "empty concept". For a man who journeyed a half million miles through space, he has his feet planted firmly on the ground.

I also like his idea, shared by Buzz Aldrin and Neil Armstrong, that we should quit piddling around and commit to a dedicated mission to reach Mars. We've been to the Moon; going back serves no great purpose. You want a launch pad in space? Build a proper space station and launch from there. Going all the way to the Moon to launch rockets to Mars is absurd.

Collins, by his own admission, is a bit of a grumpy old man, as is Buzz Aldrin. But that astronaut's optimism and drive still sneak out. In answer to the question, "Don't you have any keen insights?", he says:

"Oh yeah, a whole bunch, but I'm saving them for the 50th.

I'll be looking forward to reading them.

Saturday, July 18, 2009

The Endangered Neanderthals

Neanderthals are three times as different from us as we are from each other ... ~ Chris Stringer

It has become standard fare in the pop science media to portray the extinction of the Neanderthals as being, in large part, due to the arrival of Homo sapiens, because we sapiens were just so much smarter and cleverer and more ruthless and better looking. Okay, maybe not that last. Now, it seems that detective work on Neanderthal mitochondrial DNA has indicated that they were pretty much doomed without our assistance.

There is no doubt that Neanderthal got a bad rap for years, based mostly on conclusions drawn from one set of bones. Those bones conjured up an image of a stooped, thick-browed caveman who was barely smart enough to get out of his own way. It turned out, of course, that the bow-legged, bent frame suggested by the bones in fact belonged to an elderly Neanderthal afflicted with arthritis. Later discoveries showed that Neanderthal most likely was put together more like a body-builder.

It also appeared that Neanderthal society was a little more complex than first thought. They buried their dead with grave goods, for instance, not something you'd expect from a stupid cave man. This sort of burial implies a belief in an afterlife, which takes the beginnings of a searching mind.

Along comes Cro-Magnon man, and Neanderthal, after chugging along for a quarter of a million years, drops off the face of the Earth. Conclusion: Cro-Magnons kicked Neanderthal butt. Or, if you prefer, Neanderthals and Cro-Magnons interbred to lead to the mess that we are today.

Well, maybe not. To begin with, studies of Neanderthal DNA have pretty much ruled out that they are in any way our ancestors. As to getting wiped out by Cro-Magnon, Chris Stringer is quoted in the linked article as pointing out that there weren't many Cro-Magnons yet, so it's entirely possible the two never even made contact. If they did, it's unlikely there was any large-scale war of extermination.

Because the populations were so small, it's also unlikely that Cro-Magnon somehow was using up scarce resources. In other words, there were plenty of animals for both of them. The conditions were changing, so Cro-Magnon may have adapted better than Neanderthal, but the newcomers weren't taking the food out of the mouths of the cavemen.

Population size, however, is the nub of the matter. According to the study, Neanderthal populations were small. The mitochondrial DNA shows that Neanderthal was more prone to harmful mutations than modern humans; normally these would be weeded out as population grew. But Neanderthal population wasn't growing, so the mutations had a deleterious effect, perhaps affecting their immunity to ailments or their ability to process nutrients. Whatever it might have been, Neanderthal was doomed because their numbers were always relatively low.

No explanation is offered for why they never reproduced rapidly enough to be able to overcome the negative mutations, but low population may account for another thing that has always bothered anthropologists.

Neanderthal, as I said, was around for some 250,000 years. At the end of that time, they were using basically the same tools and methods that they were using in the beginning. Compare that to the progress made by Homo sapiens in a 100,000-year span, and I'm not talking about space flight. Early modern humans discovered farming, domesticated animals, developed spear-throwing implements, and continuously improved their stone tools.

Here's Neanderthal, with a brain as big as ours (in fact, slightly larger), which appeared to be basically wired the same as ours, yet they stagnated. One theory used to claim that Neanderthal didn't have the power of speech, but recent discoveries of hyoid bones would indicate that they could have spoken. Whether they did or not is still open to debate, but it's hard to imagine that they did not. They appeared to be good hunters, which implies some sort of communication.

A new mathematical model of population versus innovation might provide a clue. According to the model, an increase in population and an intermingling of different communities can spur innovation. Now, I've often expressed my concerns about mathematical models and computer simulations, so I would suggest that this one be taken with a grain of salt. But it does promote a plausible scenario for the demise of Neanderthal.
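I won't vouch for the published model's details, but the intuition behind it is easy to sketch in a few lines. This toy simulation is entirely my own invention, with made-up parameters: innovation scales with how many heads are working on problems, while knowledge loss scales with how much there is to forget, so a small, isolated band plateaus at a low level of know-how.

```python
import random

random.seed(42)

def run(population, generations=200, innovate=0.02, forget=0.05):
    """Toy cultural-accumulation sketch (not the published model).
    Each generation, every member has a small chance of contributing a
    new technique, while the group loses a fixed fraction of what it
    already knows. The group's knowledge settles near gains/forget."""
    techniques = 10.0
    for _ in range(generations):
        gains = sum(1 for _ in range(population) if random.random() < innovate)
        losses = techniques * forget
        techniques = max(0.0, techniques + gains - losses)
    return techniques

small_band = run(population=20)    # an isolated, Neanderthal-scale group
large_net = run(population=500)    # a larger, intermingling network
print(f"small band ends with ~{small_band:.0f} techniques")
print(f"large network ends with ~{large_net:.0f} techniques")
```

The design point is that nothing about the individuals differs between the two runs; only headcount does, and the big network still ends up with an order of magnitude more accumulated technique. That's the flavor of argument the modelers are making.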

Consider this: Neanderthals were spread across Europe in small numbers. They probably seldom crossed paths to exchange ideas. As their numbers dwindled, such interactions, if they occurred at all, became almost non-existent. Because of their small numbers, they didn't stumble across Cro-Magnon groups much either. Then the climate chilled and dried, causing game to move from the accustomed areas. Okay, Neanderthal was probably smart enough to move with the herds, but the worsening weather caused disease to have a greater effect on the health of individual Neanderthals, making it harder to muster a long hunt.

Basically, Neanderthal got weaker and more isolated, a recipe for demise. It's a very plausible scenario.

Because Neanderthal lasted so long, they're often referred to as a very successful species. But they didn't prosper; they merely survived. While that's not exactly chopped liver, it's not a long-term recipe for success.

We modern humans, who have been around for half as long as the Neanderthals were, should keep that in mind.

Sunday, July 05, 2009

Any Color As Long As It's Black

Innovation! One cannot be forever innovating. I want to create classics. ~ Coco Chanel

Porsche says electric cars are not ready for prime time. Charging systems aren't adequate, batteries are too heavy, and so on. Of course, Porsche does happen to have a prototype floating around, but it's the same old story: 100 or so miles per charge, forever to charge unless you have a 220 outlet available, in which case it's only half of forever. What Porsche really means is that their own feeble attempt at an electric car isn't any great shakes.

Porsche does make a hybrid, the Cayenne, but it's a joke, getting a rousing 24 mpg (although this is considered a significant improvement over the 13 mpg the model usually gets). In this, Porsche matches the American automakers who tout hybrids that get marginally better mileage while costing an arm and/or a leg more.

Only two manufacturers that I know of make for-real hybrids that are actually fuel-efficient and eco-friendly: Toyota with the Prius and Honda with the Civic Hybrid. [Truth-in-blogging notice: I own a Civic Hybrid.] Both are rated in the upper 40s for mpg. The Prius gets its best mileage in short-hop, around-town driving, while the Honda is best for open-road driving. The difference is in the way they use the electrical assist: the Prius actually runs fully on electric power in slower-speed, stop-and-go situations, while the Honda uses its batteries as a horsepower assist.

I've never driven a Prius, but I can vouch for the fact that I get over 50 mpg in my Civic, which is rated at 45/45 by whoever certifies those numbers on the sticker. That's because I drive 150 miles per day, 60% of which is on the freeway. Most of the rest is over a country road that has no stops and light traffic.
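For the curious, combined mileage over a mixed route isn't a simple average of the two segments' mpg figures; it's a distance-weighted harmonic mean, because what actually adds up is gallons burned. The per-segment numbers below are my own rough guesses, not measurements:

```python
# Combined fuel economy for a mixed commute. What sums is gallons,
# so the blend is a distance-weighted harmonic mean of the mpg figures.
# Segment mileages here are guesses for illustration.
freeway_miles, freeway_mpg = 90, 52   # 60% of a 150-mile day
country_miles, country_mpg = 60, 48   # light-traffic country road

total_miles = freeway_miles + country_miles
gallons = freeway_miles / freeway_mpg + country_miles / country_mpg
combined = total_miles / gallons
print(f"combined: {combined:.1f} mpg")
```

With those guesses the blend works out to a shade over 50 mpg, which is how a 45/45-rated car can beat its sticker on a freeway-heavy, stop-free commute.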

By the way, no hypermiling techniques are involved in that. I suspect that if I were driving a Prius in its best situation, I could do the same.

So, the first question is: Why aren't all the automakers providing hybrids, at reasonable prices, that can do at least 40 mpg?

When it comes to electric cars, the situation is even worse. The first electric car was built sometime between 1832 and 1839. You read that right; we're talking around 170 years ago. That's one hundred and seventy, one-seven-zero, almost two centuries ago. Electric cars were doing quite well for around 100 years, until roads got to be good enough and gasoline engines got reliable enough for people to want to drive longer distances. And that was that until the EV1, first built in 1996 and off the roads by 2003. Fortunately, for the few of us who actually want to quit supporting Exxon executive bonuses, the Toyota and Honda hybrids appeared soon afterward.

So the second question is: Why haven't the automakers, or even one automaker, started turning out electric cars at all?

And, while we're at it, the third question is: Why hasn't any automaker made any serious attempt to produce a hydrogen, natural gas, fuel cell, solar, or other alternative-fuel car, with or without the backup of a gasoline-powered engine?

I have seen endless stories over the last five years about new battery designs that allow for quick charging and longer life. Five years ago, Honda announced a cheap way to produce solar cells. Yet we still talk about 100-150 mile range, 18+ hour charging times (at 110 volts), and ridiculously high prices when it comes to electric cars.

You wanna know why? Well, I'm gonna tell you why, buddy-row.

There's a story related by Robert Lacey in Ford: The Men and the Machine (1st Edition) that pretty well sums it up. In the first quarter of the twentieth century, automaking was an industry slopping over with innovation. Companies sprang up like weeds with new styles and new designs. But, the king of the hill was Ford, run by Henry Ford, who had created the assembly line, and produced the first affordable and reliable car, the Model T. Henry loved the Model T.

In 1912, when the Model T was four years old, some Ford execs decided that some updating was in order. So while Henry and family were vacationing in Europe, they put together a new, improved Model T which looked positively sleek compared to the boxy model they were currently providing. When Ford returned, the team proudly showed him their handiwork. Ford listened to them, then inspected the car closely, walking round and round it. After a pause, he tore it apart with his bare hands.


It took almost 20 years and a 50% reduction in market share for Ford to realize that maybe, just maybe, he should update the vehicle.

That is the attitude that prevails in the auto industry, pretty much worldwide. There comes an occasional burst of activity, caused by one manufacturer finally realizing they've lost enough sales. But fundamental changes have come seldom, very seldom. We'd still be looking at fleet averages of 15 mpg if the government hadn't legislated better fuel efficiency in the 80's. The main reaction to that by the U.S. automakers was to make some very fuel-efficient (relatively speaking) cars and to get trucks and SUV's exempted from the fleet mpg calculations. In truth, there's been no improvement in fuel efficiency since the mid-80's.

The auto industry hasn't cared for innovation. Rather than spend the money on R&D, they spent it on lobbyists and marketing. And now, two of the three are bankrupt, and the third ain't feeling so good. Oops.

Meanwhile, the oil companies that helped push them over the edge are laughing all the way to the bank, while companies like Porsche moan that alternative technologies aren't practical and everyone but Honda and Toyota turns out hybrids that are overpriced while providing minimal improvements in mileage.

Just to show that the Japanese aren't immune from the disease, two years ago Honda was planning to dump its hybrids. They have since thought better of that idea.

Now, if they'd do something with those cheap solar cells they were promising ...

Saturday, June 27, 2009

Geocentrists, UFO's, and Other Stupidities

Only two things are infinite, the universe and human stupidity -- and I'm not sure about the former. ~ Albert Einstein

The reader may have heard of the Flat Earth Society. These folks insist that, in the face of all evidence to the contrary, the Earth is flat. You don't hear much from these people any more, not so much because we have all manner of pictures and scientific evidence showing that the Earth is in fact round, but because their founder passed away some years ago.

I always thought the Flat Earthers were actually being a bit tongue-in-cheek about their beliefs, but evidently some of them were quite serious. When presented with photos of the Earth taken from space, they announced that these were actually pictures of some of the many "non-luminous" bodies between the Earth and the moon. They didn't bother to explain how something non-luminous could show up when viewed from space but never be visible from Earth.

At any rate, they have pretty much faded away, but it seems like a new group has leaped into the fray to take their place.

I received a brochure in the snail-mail a couple of weeks ago from a group that asks, "Have scientists been wrong for 400 years?" Now the obvious answer to such a question is "no." Even without knowing what these people are talking about, the answer is "no", because no one poses a question like that when they're about to say something reasonable. So what have scientists been wrong about?

The Copernican view of the solar system. Yes, friends, it seems that ever since Galileo (the group ignores Copernicus altogether in the brochure), scientists have erroneously had a "theory" that the Earth orbits the sun. Well, these geocentrists are here to set us straight. The sun goes around the Earth. Not only that, but the earth doesn't rotate at all. It is a fixed point in the universe.

Remember, you read this revelation here first.

Their proof? Why, the Bible, of course.

Now it would be easy to lump these guys in with Creationists and others who use the Bible as their source of scientific information, but that's not my point here. It's not just that people take the Bible literally. The problem is that people are willing to simply ignore physical evidence and the reality of the universe around them. In fact, within the same week, I saw another example of this that has nothing whatsoever to do with Bible-toters.

NASA recently launched a dual-satellite lunar mission which, in a few months, will monitor the impact of its rocket stage into the moon. This isn't the first time humans have crashed things into the moon. The Ranger series, the first photographic surveys of the moon, were designed to crash into the surface, taking pictures all the way to impact. Lunar Prospector and, very recently, a Japanese satellite were deliberately crashed into the lunar surface to study the material that was kicked up (in Lunar Prospector's case, to look for possible water ice). The current mission is the first where orbiting satellites will be in a position to study the impact in very close detail.

So, along comes this guy who decries the NASA "bombing" of the moon, because it will upset the extraterrestrials living on the moon.

Just chew on that one for a moment.

I have nothing against the idea of the existence of ET's. Nor do I argue that everything people have seen in the sky over the years has been adequately explained. However, I don't believe aliens drop in and kidnap drunken backwoods yokels and take them on joyrides around the planet before dropping them off again. While I don't believe every UFO sighting can be explained away, I can't accept any of them as alien spaceships.

And, no I don't believe in Bigfoot, Nessie, OgoPogo, or any other fairytale monsters either.

As Carl Sagan once said, "Extraordinary claims demand extraordinary evidence." What bothers me is that people will not accept the genuine evidence before them while accepting non-evidential sources (like the Bible or mythological sources) or dubious evidence (fuzzy pictures and transcripts that can't be documented).

The Bermuda Triangle has been debunked over and over again, but people will still quote the transcript statements of the Flight 19 flyers that appear nowhere in the Naval records. They will ignore the fundamental fact that more aircraft have been lost and never found in the continental US (where they're sitting on the ground, for crying out loud) than have been lost over the Bermuda Triangle. Heck, they can't even agree where the stupid triangle is, as it gets moved around a lot depending on what ship disaster or remnant of Atlantis you want it to include.

And don't even get me started about Atlantis.

I've never understood the desire of people to ignore the validity of factual evidence or theories supported by reason and observation while buying into the wildest, weirdest belief systems. The Internet has been no help in this regard, because many people will believe anything if it comes from the Internet. There's no other way to explain all those folks duped by Nigerian 419 scams.

But, the Internet isn't to blame here. It just isn't helping make things any better. I don't know what the solution is, other than to do a better job educating our kids. With better education, it's possible that the people who are out of touch with reality will be outnumbered by those who have a clue. Trouble is, people don't want to spend money on education or support teachers who want higher standards (and the authority to discipline). If we continue down the path we're going, the geocentrists, the creationists, and the ufologists are going to get the upper hand.

On the other hand, the debate between the geocentrists and the UFO crowd over whether the alien planets don't rotate either ought to be a hoot.

Sunday, June 21, 2009

Not Worth the Paper ...

Science too often trivializes the profound, answering questions that are very different from the ones that were asked. To formulate a question suitable for scientific research too often requires us to forget what it was that we really wanted to know. ~Earon Davis

A couple of French astronomers with time on their hands and some computing power available have determined that if you tweak Mercury's orbit by very small amounts, chaos will ultimately occur in the solar system. Well, OK, and if I drop another planet in between the orbits of Earth and Mars, some bad things will also occur.

I'm not at all sure what the point of their exercise is supposed to be. Is this another one of those "had there been a slight variation in starting conditions, we wouldn't be here" scenarios? What is the mechanism for changing the semi-major axis of Mercury? And, since they were working in 5-billion-year intervals, did they take into account that the sun will probably have gobbled up Mercury before it could go pinballing around the inner solar system?

But the ultimate question is: What the heck was it they were trying to do in the first place?

There are lots of interesting questions about the solar system's orbital mechanics. For example, how did the asteroid belt come to be? Why didn't everything get kicked out of the belt by Jupiter's gravity? How did Uranus end up on its side, and why is Triton orbiting in the wrong direction? Why do all the gas giants have rings while no rocky planet has them?

In other words, if you're going to spend a ton of time developing a program to model the solar system, why waste time fiddling with Mercury's orbit when you could be trying to determine the conditions that got the solar system to where it is now?

Well, maybe, just maybe because a couple of French astronomers were trying to get published in a prestigious magazine to enhance their own reputation. Of course, they submitted the information as a letter, not a paper, which I imagine avoids peer review.

Which brings us to a paper submitted by the Center for Research in Applied Phrenology (CRAP).

The paper, entitled "Deconstructing Access Points", was submitted for publication to The Open Information Science Journal by CRAP researchers David Phillips and Andrew Kent. And it was accepted for publication "after peer review", as long as Phillips and Kent supplied an $800 publication fee.

None of this would be hugely unusual except that Phillips and Kent were actually Phillip Davis and Kent Anderson, and their CRAP paper was actually a pile of nonsense generated by a computer program designed to generate phony research papers. The name of their "research" outfit was deliberately designed to send a huge hint to Bentham Publishing, the outfit responsible for the journal, that maybe their collective legs were being pulled.

Evidently, Bentham editors didn't care, as long as the check was good.

Bentham, of course, now claims that they knew it was a gag all along, and they were just stringing these guys along to find out who they were. Interestingly, the editor of the journal subsequently resigned. Evidently, he wasn't in on the investigation by his own staff.

Now, add these to the incident of the "missing link" fossil that was sort-of-but-not-very-peer-reviewed, and you have the makings of a disturbing trend.

The pressure to publish, as I've said before, is huge. There is a lot of competition for research dollars, and getting published is one way to get hold of them. The problem is that getting published may not require that what is published have any particular scientific merit. Or, as in the case of the fossil, it may have merit but not justify the hype-filled conclusions.

Then there is the reputation factor. Jorge Hirsch, self-proclaimed genius, has determined that scientific reputation is determined by where you get published and how often you get cited. Interestingly, the h-index, as Hirsch has dubbed it, gives Hirsch a very good rating. Even more interestingly, the h-index takes no account of the quality, validity, or originality of the publications.
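
The h-index itself is a simple computation: a researcher has index h if h of their papers have at least h citations each, and raw counts are all that go in. A minimal sketch (the citation counts are invented for illustration):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for five papers:
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

Note that nothing in the calculation asks whether any of those citations came from a good paper, a retracted one, or the author citing himself, which is exactly the complaint.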

What's frightening is that there appear to be scientists who take this nonsense seriously.

At least the attitude that publishing anything is all that matters goes a long way toward explaining some of the whacko theorizing that has become the hallmark of the 21st century so far.

It might even explain dark energy.

Saturday, June 13, 2009

Making Excuses for Economists

Economics has never been a science - and it is even less now than a few years ago. ~Paul A. Samuelson

Any article that starts out with the sentence, "According to classical models of economics, financial crises don't happen," is bound to be a knee-slapper. Back in the Neolithic, after flopping at physics, I moved into Management Sciences, a very impressive-sounding way of saying "Business Major." Along the way, I managed to pick up a minor in economics, so I have actually studied the subject. And I can tell you, my professors could define any number of ways financial crises would happen.

The article goes on to say that the study of economics could benefit from methods developed in other sciences. Presumably, they're thinking of string theory, which also has no ability to predict anything.

They're missing the point. It's not that classical economics can't function as a guide and make general predictions. It's not even that the assumptions of classical economics are all goofy. It's that the actual economic systems in existence today have nothing to do with real market principles. And until the economists and the policy-makers get that through their thick heads (or stop accepting bribes from the market-wreckers), economic predictors will make the weather forecasters look like psychics by comparison.

For example, one assumption of classical economics is "perfect knowledge." That is, investors know everything there is to know about the supply and demand of a product. In the olden days, someone could create a market panic just by falsifying a report of floods in Paraguay destroying some crop or another. If anything, the knowledge level today should be better than it's ever been. However, when markets are controlled by a few large players, like, say, oil, then all bets are off.

The funny thing is that people all blame OPEC for their control of the oil supply. At one time, their control was probably impressive. When one looks at the profits now generated by Exxon, BP, and Dutch Shell, it should be clear that OPEC is just along for the ride. In fact, if anyone ever decides to actually investigate the books of those giants, I suspect they'll find that they were the big players in oil futures speculation.

In case anyone wasn't paying attention, it was overpriced fuel that began the so-called financial crash. After all, someone had to be buying those futures that someone (most likely the oil companies) was selling. When fuel prices got out of hand, consumers had to start cutting back on purchases and began defaulting on credit and mortgages. Meanwhile, the financial folks who had bet wrong on oil futures going up indefinitely, which would be most of them, suddenly got cash-strapped when oil began to fall, and they found themselves on the wrong end of the oil companies' short-selling.

Then there's the general business of monopolistic markets. Everyone recognized for years that, if you didn't regulate monopolies, they would put the screws to everyone. Now, regulated utilities worked fine for years. The "breaking up" of AT&T into little unregulated regional monopolies was the biggest absurdity of modern economic times. Most of those pieces are back together, and rates are going up while service goes down. Almost anywhere in the civilized world you can get better internet service than you can in the United States. With AT&T pretty much allowed to do what they want, it's not hard to understand.

Note: AT&T recently shed the last bit of regulatory control Alabama had imposed on them. They celebrated by raising dial-up internet access by 50%. Meanwhile, they do nothing to improve the availability of ADSL, meaning that they have a captive market. Doesn't exactly sound like a free market to me.

Another assumption of classical economics is that investors will act rationally. I'll wait while you stop laughing.

There are two things wrong with that. First of all, a huge amount of investing is actually being done by computer models that don't do anything rationally. The programs just follow rules. In many cases, those rules allow for a market to enter a death-spiral because, for instance, a drop in price can trigger sales, which trigger lower prices, which trigger more sales and so on.
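
That feedback loop is easy to sketch. Here is a toy model (all numbers invented, and real market-impact dynamics are far messier) in which each automated seller dumps its position when the price falls below its stop level, and each sale pushes the price lower still:

```python
def stop_loss_cascade(price, stops, impact=0.02):
    """Toy model: each automated trader sells once when the price falls
    below its stop level; each sale knocks the price down a bit further."""
    sold = set()
    fired = True
    while fired:
        fired = False
        for i, stop in enumerate(stops):
            if i not in sold and price < stop:
                sold.add(i)
                price *= 1 - impact  # the sale itself depresses the price
                fired = True
    return price, len(sold)

# A dip below just one stop level ends up triggering all four sellers.
final, sales = stop_loss_cascade(price=99.0, stops=[100, 98, 96, 95])
```

One seller's rule firing is enough to drag the price through every other seller's threshold, which is the death-spiral in miniature: no one in the loop is acting "rationally", they're all just following rules.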

There are so-called "witching days" where all the financial computer models trigger an activity based on whatever conditions are programmed into them. There are times that several of these witching days coincide, with all sorts of volatile results, almost none of which ever favor an individual investor.

Back in the 1920's, everybody was speculating in the market. Worse, they weren't investing for the long term; they were looking for the quick kill. The result was margin-buying and hot-tip buying, both of which are recipes for disaster. Now, we have endless ads from online investing outfits that purport to give you sophisticated tools to do quick-kill investing. If there's anything that a rudimentary study of economics reveals, it's that constantly going for short-term gain is a recipe for disaster, whether for individuals or for businesses.

The other thing economics teaches us is that the small investor is not a rational investor. They got into trouble in 1929, which is why the SEC came about and financial institutions got heavily regulated. The idea was to keep the small investor from betting the farm based on whatever any huckster tells them. Then the de-regulators started having their way. The banks could be stockbrokers, the stockbrokers could be banks, and small investors could go nuts once again on their own. Which is another reason 2008 looked a lot like 1929.

We don't need more science in economics because we haven't paid attention to what it already has taught us. If we won't learn from economic history, applying some theory from physics to economics won't help.

Dark matter is bad enough; can you imagine "Dark Money"? It positively makes the blood run cold.

Monday, June 08, 2009

Missing Linkage

The evolution of the brain not only overshot the needs of prehistoric man, it is the only example of evolution providing a species with an organ which it does not know how to use. ~ Arthur Koestler

An interesting new fossil has been discovered that should have prompted a lively discussion about possible hominid ancestors. Instead, thanks once again to the desire to publish quickly and the desire to make a buck by getting it on television, we have flawed science surrounded by a bloody circus.

The fossil is called Darwinius masillae, and it's 47 million years old. It is remarkably well-preserved and has been surrounded by hype worthy of a Simcha Jacobovici production. David Attenborough, who should know better, threw together a special program for the BBC, which I guess is the one that was shown on the History Channel. The National Geographic, who really should know better, whipped out this article, which was pretty typical of the hype surrounding the discovery made by Jørn Hurum. The operative phrase in all of the articles was "missing link."

The funny thing is that no one could get hold of the paper written by Hurum's "dream team", to use his phrase, to actually have it checked out by some experts in the field. Once it was, the hype began to die a prickly death.

Let's get something straight up front. As is pointed out quite correctly in this piece, there is no such thing as a "missing link." When people use this phrase, they are either talking about transitional species or, more likely, an early common ancestor of two or more species. But people, including many learned types, have been tossing around the phrase forever. The "discovery" of Piltdown Man really pushed the "missing link" idea into the forefront of public thinking, and it's been stuck there ever since.

Now, the use of an inaccurate phrase could be forgiven if the basic conclusions about the possible common ancestry to hominids and lemurs could be justified. The trouble is that it can't, no matter how many times they said it in the TV show.

What really stinks about this whole deal is that, once again, science has been folded, bent, stapled, and mutilated for the sake of television exposure. Hurum, the star of the show, clearly has designs on becoming a media star. His employer probably got all glassy-eyed at the thought of grant money flowing in to support his coming research. Sadly, a perfectly good discovery is now going to be ignored since it couldn't live up to the hype.

What is really sad is that at least one member of this "dream team", Phil Gingerich, had misgivings about the whole process. He is quoted in this story, which, by the way, does a good job summarizing the whole mess. Gingerich says that there was time pressure because a TV company was involved. He says, "It's not how I like to do science." Well, sir, there was a simple solution to that problem: Don't do the science that way.

I have frequently complained about the nature of science and history programming. Sensationalism is what sells, not knowledge, so the more outrageous the claim that can be made, the more likely to garner viewers. The trouble is these viewers are going to leave this sort of program with misinformation, which will be retained because they don't take the time to check things out further.

Worse, it's almost a sure thing that the program will migrate from History to Discovery to the Science Channel, being shown over and over. If you don't think that's likely, just watch the endless shows about King Tut that still continue to prattle about assassination plots, despite the well-publicized (on TV, no less) research that pretty much settles that Tutankhamun died as a result of a severe injury to his leg. Few if any assassins normally tried to kill someone by breaking his or her legs.

And yet, not half an hour ago, History or History International was rebroadcasting some canard about "King Tut's Curse", one of those half-baked "investigative" reports where some guy you've never heard of claims to have "solved" something that didn't need solving. In that program, they once again rolled out Tut's "murder", possibly making him the first victim of his own curse.

That's so stupid as to be laughable.

It's not that these channels don't come up with some good programming; they do. But, when you're claiming to impart serious knowledge, you have a responsibility to ensure that the information is accurate or at least clearly delineates which information is fact and which is speculation. Also, the programs need to be peer-reviewed as much as the research that inspired them.

There is quite enough ignorance and misinformation in the world already.

Tuesday, June 02, 2009

Mars Attacks - AGAIN!

We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology. ~Carl Sagan

Note: I published this originally in August, 2006. However, it appears that the marching morons have begun sending that stupid e-mail around again. So, in what is probably a vain attempt to keep at least one person from telling all their friends that Mars will be as big as the moon, here it is again.

The other day the son says to me that a friend of his has told him that, on August 27, Mars will be the closest it's ever been. It will be so close that it will appear as big as the Moon in the sky. Fortunately, the son said it sounded like bull to him, restoring my faith that he has some semblance of intelligence.

Of course this is bull. To begin with, if Mars came close enough to the Earth to look as big as the Moon, we'd be too busy dealing with earthquakes, volcanoes, and high tides rolling inland about 20 miles to notice what Mars looked like. Secondly, right now, Mars is on the other side of the Sun, so, unless Mars knows a shortcut, it's going to be about as far away as it gets from us on August 27.

How does this nonsense get started? In this case, we can probably trace the misinformation back to some correct information.

A Martian year is about two terrestrial years long. This means that, about every two years (every 26 months, actually), we overtake Mars. At this point, Mars is as close as it will get for that period. But, planetary orbits are elliptical, so the spacing between them varies. If the Earth overtakes Mars at the right point, the two planets will be much closer than at other times. It turns out that on August 27, 2003, Mars and Earth got to their narrowest separation in 60,000 years. This was a boon to amateur astronomers with small telescopes (like me) because it was possible, even with a 4-inch reflector, to make out patterns and blotches on the Red Planet. Very cool.

At any rate, an e-mail began circulating a few months before the close approach which gave the information above and added that Mars would be very bright, second only to the Moon. Now Mars was bright, although it wasn't as bright as, say, Venus, or probably even Jupiter, but I don't recall if either was visible at the same period, so the statement may well be accurate. Somewhere along the way, as the e-mail made the rounds, “almost as bright as” became “almost as big as.” And, boy, did this get legs.

I don't know how many people, aware that I like astronomy, stopped by to tell me about the huge Mars that was going to be hanging in the sky. I would patiently explain that, even at closest approach, we're talking a long way off. Mars would be nice and bright, but not very huge. Science sites all over the Internet explained this endlessly, yet the e-mail outranked the science.
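
A back-of-the-envelope calculation shows just how far off "as big as the Moon" is. Using the small-angle approximation and round figures (diameters and distances below are approximate), even at the record 2003 approach Mars subtends a tiny fraction of the Moon's apparent diameter:

```python
import math

def angular_size_arcsec(diameter_km, distance_km):
    """Small-angle approximation of apparent diameter in arcseconds."""
    return math.degrees(diameter_km / distance_km) * 3600

mars = angular_size_arcsec(6792, 55.8e6)    # Mars at the 2003 close approach
moon = angular_size_arcsec(3474, 384_400)   # the Moon, at its average distance

# Mars comes out around 25 arcseconds; the Moon around 1860 --
# roughly 75 times larger in apparent diameter.
```

In other words, at its very best Mars is a bright dot, and you need a telescope to see it as a disk at all.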

August, 2003, came and went, but the e-mail carries on. All that changes is the year. It's still floating, but now it claims August 27, 2006 is the big day. You'd expect that by now, some of the folks who get this thing would walk outside and wonder where the huge planet is.

This sort of ignorance is not new, but, thanks to the connected world, it certainly spreads farther and faster than ever before. What amazes me is that people are willing to believe an e-mail, which contains those deadly words “send this to everyone you know”, is somehow going to tell them about a near apocalyptic event when normal news and science outlets have nothing to say about it.

What makes the son's friend's ignorance more poignant is the announcement from Michael (Launch that sucker!) Griffin's NASA people that, to make up for budget shortfalls, all science on the ISS should be shut down. Frankly, I haven't heard about a lot of science coming from the ISS, since most of the time the astronauts are repairing things and trying to stay alive. But, apparently, there is actually some research being done, and NASA wants to dump what little is being done to concentrate on manned missions to the Moon to build our launchpad to Mars.

What for? If you're not going to do science, spending billions to go somewhere just to say you got there is a waste of time and money. There is so much to learn, especially if the idea is eventually to escape to Mars ahead of a climatic catastrophe on Earth. Yet the guidance from our leadership is aimed at going and planting the flag.

An oft-used theme in science fiction involves the fallen “galactic empire” where the technology still exists to flit from planet to planet, but no one knows how it works or how to fix it. Isaac Asimov's “Foundation” series tells it best, but others have dealt with the possibility as well. There are times I think we're heading in that direction, without ever even having a galactic empire.

I don't know why people think research is useless, even though they use things every day that came from pure research conducted at Bell Labs or that were developed to get us to the Moon in the first place. It's unsettling to see a trend to try to paint the original Apollo missions as just lucky, but that's what's happening these days. Much is made of the lack of computer power available to the mission, as though there was something else available. In fact, the computer technology used by NASA to make the missions work was beyond state-of-the-art at the time. Yes, it's primitive compared to today's desktop computers, but, thanks to the work done in the 1960's, we have those desktops today.

And let's not forget that the Internet was originally created for researchers to share information, first Defense Department researchers, then a more open network of university and corporate researchers. Today, however, the researchers have left the Internet to form their own network away from the spam, shopping sites, online lonely hearts clubs, and porn that the Internet has become.

So science is taking a back seat to technology. Worse, the efforts that got us here are downplayed at times on what little science programming is available. We're a society that loves whiz-bang toys and believes anything we read in our e-mail, but we don't want to have a basic grounding in the sciences that got us where we are.

This continuing ignorance of the importance of scientific inquiry and research is going to kill us yet.

Thursday, May 28, 2009

A Small Error of Fact

If the television craze continues with the present level of programs, we are destined to have a nation of morons. ~ Daniel Marsh, 1950

We have fulfilled our destiny.

As I have said on many occasions, my television viewing is pretty much limited to the Discovery-History axis of channels, with an occasional side trip to sports, very old movies, or cartoons. I have made it clear that I thought that channels that claim to be presenting programs about history or science have a certain duty to get stuff right. History International blew that one magnificently.

The show was Mega Movers, which as far as I can see should be on HGTV or something, since it doesn't have a lot to do with the history of anything. A lot of the programming these days on the Discovery-History axis seems to have strayed a long way from the original focus of those networks. What are we discovering from Survivorman, exactly (which is now boring us for entire evenings on Discovery and the Science Channel)? Do they really expect us to believe that the guy is in real danger and completely out of communication? I don't believe that for a second.

As if that wasn't bad enough, Discovery found some character named Bear Grylls, who was such a phony that he and his crew were camping out at local motels.

Then there are the reality shows like Deadliest Catch, Ax Men (and an almost identical show whose name I forget), and Ice Road Truckers. These programs do appear to show real events, but the real attraction seems to be the endless bleeping of the participants' dialog. The other night, I wondered why the Son was watching a program on telegraphy, since all I heard coming from the set he was watching was Morse code beeps. It wasn't Morse code; it was just an endless stream of bleeped expletives coming from some idiot on a crab boat.

What meaningful programming.

Remember The Learning Channel? It became TLC, because when you turn a channel into a freak show, there's not a lot of learning going on.

At any rate, the Mega Movers were moving a full-size copy of a space station module to one of the space centers so they'd be able to train astronauts and replicate problems that might be occurring on the real article. To do this, NASA uses an aircraft designated the 377SGT turboprop, better known as the Super Guppy, for reasons that should be apparent. Well, sort of. Personally, I'd have called it the Beluga whale (which is what Airbus calls their own updated version).

Originally, there was simply the Pregnant Guppy, which was pretty darn big, but needs changed, and the Pregnant Guppy was replaced by the Very Pregnant Guppy. At this point, someone decided that pregnant airplanes didn't cut it, so the name was changed to Super Guppy.

During the Mega Movers episode, the difficulties of flying a whale were made obvious. After all, you have a massively loaded aircraft with the cross-section of a watermelon. Crosswinds at takeoff or landing are potentially deadly for such a monster, so the pilots have to be a skilled bunch. The link to the Pregnant Guppy is worth a read because it details how close a little town in the Mojave Desert came to being obliterated by the maiden flight of the expectant fish.

Yet, the show went on to say, the plane was so well built that it survived a near catastrophe. It seems that during a test flight in the Mojave, part of the cargo canopy got ripped off, yet the pilots managed to land the plane. Now that's some mighty fancy flying, Wilbur.

Except that it evidently didn't happen, at least not that I can find. Read the extremely detailed article on the Pregnant Guppy again. Go ahead, I'll wait. Not a single mention of the canopy getting ripped off either the Pregnant or Super Guppies. I looked at several sites, yet not a one, including the Wiki articles, mentioned landing a Guppy with major damage to the canopy. The Super Guppy link has, buried way down in the page, a picture of some wing damage sustained in a test, but no references to canopy damage.

I don't know where Mega Movers got their information, but it would appear that the photo they showed may not have been in-flight damage. It's easy to imagine a lot of ways that the canopy could be torn up on the ground, perhaps during a loading operation.

Now, you might be inclined to ask why I even bothered to take the time to fact-check such a small item from an otherwise fairly dull program. Well, when they described the incident, they said the canopy came off during a test flight in the Mojave Desert. They then presented an old piece of film showing a plane in the distance slowly descending. To begin with, the plane's profile didn't look the least bit pregnant. But there was a more telling issue. As the plane flew over the Mojave, which is in California, for those of you unfamiliar with North American geography, it passed behind some familiar structures.

It seems that Mega Movers think that the Mojave Desert is home to the Pyramids of Giza.

For those of you who have fulfilled your destiny, Giza is in Egypt.

Friday, May 22, 2009

Still Looking for the Smoking Gun

The dinosaurs became extinct because they didn't have a space program. ~ Larry Niven

When last we met -- well, I was here, where were you? -- I was discussing a new theory that pterosaurs, at least the big ones, may not have been able to fly. Now, that is the sort of discussion that can get a group of paleontologists reasonably worked up, but if you really want to see a bunch of upset scientists, start talking about the last great extinction event, when the dinosaurs ceased to walk the earth, 65 million years ago.

Ever since dinosaurs were discovered, people have been wondering where they went. For years, the prevailing theories were some sort of disease or significant climate change. Then, in 1980, a geologist named Walter Alvarez got to wondering about this thin black stratum he kept finding at the end of the Cretaceous. When he and his father, physicist Luis Alvarez, analyzed the material in the stratum, they discovered an abnormally high amount of the element iridium. The most likely source for a lot of iridium was a meteor impact. Since the iridium layer (known as the K-T boundary, from the German form of Cretaceous-Tertiary) was found all over the planet, it had to be a big impact. They theorized that this could have been what did the dinosaurs in.

Now, scientists don't really like outsiders telling them their business, and paleontologists are no different from anyone else in this regard, so the Alvarezes' theory was met with polite skepticism at best and outright derision at worst. Then someone found a big hole in the ground.

Actually, the big hole was mostly under water, off the coast of the Yucatan peninsula. It was dubbed Chicxulub and seemed to settle the issue once and for all, at least for most people. It was now generally assumed that most, if not all, dinosaurs were wiped out by the catastrophic event.

Well, maybe not. Some scientists got to wondering if the dinosaurs weren't already on the decline because of climatic changes or maybe because of the eruption of the Deccan Traps, volcanic activity on a massive scale. And then there was Gerta Keller, who didn't think Chicxulub had anything to do with it at all. First she announced that another meteor was responsible for the extinction, which is a pretty fine point. If two meteors hit the Earth close enough in time to have resulted in one K-T layer, asking which one killed the dinosaurs is like asking whether the fall or the sudden stop is what killed a guy falling off a cliff.

Then a little while later, Ms. Keller came back and said it wasn't meteors at all. It was the Deccan Traps in India that did the deed.

Then, recently, there was an article titled "New Blow against Dinosaur-killing Asteroid, Geologists Say." It turns out that the "geologists" are actually a team led by -- wait for it -- Gerta Keller, who is basically rehashing her theories of three years ago. She announces unequivocally that not a single species went extinct because of the Chicxulub impact. She doesn't mention if any went extinct because of the other impact she once hypothesized.

One of the problems here is the popular picture that, when the dinosaurs went extinct, they did so in one afternoon, geologically speaking. Meteor hits, worldwide catastrophe, no more velociraptors. The thing is that it is becoming generally accepted that, while a Chicxulub-size meteor would not be pleasant, it would not have created the planet-wide fires and other global disasters originally predicted. That said, it would have altered the climate significantly for a lengthy period, possibly long enough to starve a lot of sauropods because of a lack of plant life (thanks to global cooling), which would in turn deprive a lot of theropods of their sustenance. Add the Deccan Traps outburst, and you have a very difficult time for dinosaurs.

So it's unlikely that any one cause killed the dinosaurs off, but it's patently silly to deny the effects of the Chicxulub impact. It can't be pinpointed whether it occurred before, after, or during the Deccan eruption, but it would have been a serious blow. Whether it was a killing blow or just one more body shot is the question.

Then there's the whole issue of how long it actually took the dinosaurs to vanish. Generally, as I said, the thinking is that they were in decline and some catastrophe (take your pick) finished them off. But a new theory holds that some of them hung around for about 500,000 years. Of course, the theory is controversial, and not many people are buying into it. In fact, unless the Deccan Traps can fit into this new timeline, not even Gerta Keller is going to be buying in.

Personally, I don't find it hard to imagine isolated pockets of dinosaurs hanging on for some slightly extended period. I doubt any of the large beasts that normally come to mind when someone says dinosaurs are among those that survived (the story doesn't say what sort of bones were found). But it is easy to imagine smaller saurians, of which there were many, eking out an existence for a little while longer.

Of course, one thing that will come of this most recent theory is speculation that there still could be dinosaurs roaming around, an old sci-fi standby. Worse, some news reader or writer is going to misunderstand the time frames involved and boldly announce that this proves that dinosaurs and humans actually were alive at the same time.

The Fred Flintstone syndrome lives on.

Saturday, May 16, 2009


Scientists are complaining that the new Dinosaur movie shows dinosaurs with lemurs, who didn't evolve for another million years. They're afraid the movie will give kids a mistaken impression. What about the fact that the dinosaurs are singing and dancing? ~ Jay Leno

For a bunch of critters that went extinct around 65 million years ago, before even I was born, dinosaurs are always popping up in scientific news. On the one hand, new species are regularly turned up, which is not surprising, really. We find new species of living animals all the time. Given the variety of life on the planet now, the creatures we've dug up can only amount to a tiny percentage of all the beasties that walked the planet millions of years ago.

On the other hand, scientists keep developing theories about the dinosaurs we do know about. Dinosaurs used to be lumbering tail-dragging lizards, clomping across the landscape. It's now generally agreed that there were a lot of very agile dinosaurs, including some of the big ones. And few if any of them dragged their tails; the tails actually streamed straight out behind the dino, providing balance and, in some cases, a defensive weapon. And some were warm-blooded, not lizard-like at all.

So the vision of lumbering giants spending their time half-submerged in some swamp has been replaced by mobile herds of sauropods grazing their way through entire forests, while being bushwhacked by the occasional allosaur or T. rex, depending on the geological era. Of course, there seems to be considerable disagreement these days over Tyrannosaurus rex himself. He was fast, he wasn't fast, he was an accomplished pack hunter, or he was just a scavenger. Oh, and he may have been covered by feathers, at least as a juvenile.

But, no matter how you envision the dinosaur age, one part of the picture never changed: The sky was always filled with flying pterosaurs. Not so fast, says Katsufumi Sato.

If you're like me, you've always had a bit of a disconnect between imagining the soaring pterosaur and imagining one on the ground. In particular, if you really thought about it, you had the nagging feeling that it had to be pretty hard for a pterosaur to get off the ground. Trying to imagine Quetzalcoatlus, a pterosaur with a wingspan the length of a school bus, getting airborne was difficult. According to Prof. Sato, it was probably impossible for the largest specimens.

One could speculate that these were cliff-dwelling animals that could launch themselves from the lofty reaches and soar around with impunity. The trouble is that any large pterosaur that landed on the ground would have a serious problem ever getting airborne again. That would not bode well for their continued existence. Sato even debates whether their fragile wings could have supported them in the air at all.

Of course, not all paleontologists agree with this view. They point out that studies of modern birds may not be a good model for the more reptilian pterosaur. Perhaps the atmosphere was more dense (a distinct possibility), or gravity was lower (not very likely). At any rate, it is quite possible that the rules for pterosaur flight were different from those governing an albatross, just as the flight rules for a bumblebee are different from those of an eagle.

One rather weird suggestion is that perhaps the pterosaurs were flightless and used their wings for swimming, like penguins. Unfortunately, as one scientist points out, the wings "do not look very efficient for swimming." In fact, it's hard to imagine the thin membrane being able to hold up against the rigors of underwater propulsion.

Some years ago, someone actually built a life-size model of Quetzalcoatlus, which was about the size of a decent ultralight aircraft. They equipped it with motors to make the wings flap and actually got the contraption airborne. Once in the air, the model performed quite well, soaring along nicely, fitting our classic view of the magnificent pterosaur ready to swoop down on its prey. Lovely image, but they did not try to get the thing flying from a standing start. They actually towed it like a glider and then released it into flight. Since we can rule out Quetzalcoatlus having friends with towing vehicles, that still leaves open the issue of how he got into the air in the first place.

Prof. Sato has his critics and is by no means the last word on the subject, but we just may have to give up that image of the majestic flying reptile.

At least we still have the feathered theropods.

Monday, May 11, 2009

A Matter of Deflection

O poor mortals, how ye make this earth bitter for each other. ~ Thomas Carlyle

Google seems to be in hot water again. For a company whose motto is supposed to be "Don't be evil," they certainly get accused of it often enough. Except this time, I'm not so sure they're the guilty party.

It seems that the "evil" Google has done this time is to publish a historical map of Japan as part of its online collection of maps. Now keep in mind that this is not some sort of secret map. It's been published elsewhere and was even part of a historical display in Tokyo a few years ago. Yet, Google's publication has created a huge stir in Japan.

What Google did was publish a centuries-old set of woodcut maps which showed, among other things, the location of "burakumin" communities. If this means as little to you as it meant to me, some additional explanation is in order. In the time of the shoguns, Japanese society was caste-based. At the bottom of the system were the burakumin, evidently similar to the Untouchables of India. The burakumin did jobs related to death, like butchering, leather-making, and burials.

Okay, so Japan had a caste system, and, like the poorest and lowest of other societies, lived in segregated areas. You can find maps and descriptions of Jewish ghettos and American slave dwellings anywhere. What's the big deal in knowing where the burakumin lived?

It seems that the Japanese haven't exactly kicked their upper-caste repugnance of these people. In fact, some Japanese employers will not hire someone if they have burakumin ancestors or live in the communities that were once solely inhabited by these people. So, the evil thing Google has done is to make it easy to determine where those communities were in relation to modern Japanese locations. This makes Google guilty of racist agitation.

That is one tortured bit of logic.

Google illuminated a bit of history. Evidently, that illumination makes it easier for bigoted Japanese to discriminate against people whose only "fault" is to have a connection, which may be tenuous, to a group who were once shunned by the elite classes, except, of course, when their trades were needed. We have laws against that sort of thing in the U.S., and I suspect that the Japanese do as well. However, we have people who ignore those laws or circumvent them; evidently the same thing goes on in Japan as well.

I'm not surprised.

I've mentioned the reptilian part of the human brain on a couple of occasions. Whether one buys into that theory or not, it is difficult not to recognize the overwhelming tendency human beings have toward bigotry. Human history demonstrates that people will always categorize each other based on skin color or religion or social class. Having done that, people will proceed to employ discrimination, segregation, or even genocide to eliminate those who are different.

Of course, the "different" are always perceived as "inferior," thus providing an excuse for the reprehensible actions.

Everyone is a bigot. Everyone. The mark of a civilized human being is being able to overcome that built-in reptilian reaction to other groups. There have been occasional moments in time when a society has demonstrated an ability to do just that, but it seldom lasts. The reptile is strong.

Google decided to remove the map after learning of the reaction. This promptly drew a reaction from the Buraku Liberation League, which had been upset over the publication of the map. Now they were upset at its removal, as though such removal made the burakumin into "unpersons." Apparently the League wanted the maps kept, but with an historical explanation.

This misses the point. It is evident that enough Japanese are familiar with the burakumin and where they lived because active discrimination goes on. The problem is not whether Google publishes or doesn't publish an ancient woodcut. The problem lies with the Japanese who insist on discriminating against the group.

The Japanese Ministry of Justice is now "gathering information" on the matter. I believe that justice would be better served if the Ministry gathered information on the organizations engaging in discriminatory practices. It can't be that hard; the author of the article had no apparent difficulty in finding someone in a company willing to talk about the company's discriminatory practices.

Evidently, the Japanese would rather create a fuss about Google's actions to deflect from the actions of their own people against their own people. It's an old tactic, used over and over to divert attention from the real problem.

Score another one for the reptile.

Wednesday, May 06, 2009

Why People Don't Flock to Linux

The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it. ~Linus Torvalds

PC World had an article listing the seven reasons people quit using Linux. Since the piece is written by a Linux expert, it is intended to debunk these reasons, marking yet another attempt by a Linux person to tell the rest of the world why they're all stupid for not using Linux.

Well, that's a little harsh, but not by much.

To set the record straight, I have been a fan of Linux. I've run Red Hat, Mandrake (now Mandriva), Debian, Suse, and Ubuntu (both Gnome and KDE versions). I've set up Linux servers for e-mail and proxy services. I like Linux. I just don't use it much, for reasons that will become clear as we go along.

Let's take a look at those reasons and see if they're legitimate.

1. Linux doesn't run a program the user needs. Frankly, we could stop right here. This is the single biggest roadblock to large-scale enterprise deployments of Linux. Even the author admits that there's not much of an answer to this one. In fact, he doesn't even suggest using WINE or other Windows emulators, probably because a) they don't work all that well, and b) you've got to have a legal copy of Windows for the emulator to work.

Okay, there's lots of pirated copies of Windows out there, but we're talking legally running software here. So if you're wondering why Linux comprises 1% of all operating systems in use, you don't need to go much farther. But we will carry on.

2. After installing Linux, some piece of hardware doesn't work. Well, says the author, the same thing happens with Windows, which it does when new versions come out. However, when you've got a two- or three-year-old video card in your system, Windows will have a driver. If it doesn't, a quick trip to a search engine will locate one that you can install in one step. If you're missing a Linux driver, you've got to hope someone has written one that will work with your Linux distro. If you can find one, then you may have the fun of compiling it, not a task the average user is going to be familiar with. Granted, there's been some improvement on this score in the Linux community, but there are still plenty of gaps.
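Just to illustrate the sort of detective work involved (this is my own sketch, not anything from the PC World piece, and the module name is purely hypothetical), the first step in the driver hunt is usually confirming that the kernel hasn't already loaded one:

```shell
#!/bin/sh
# Sketch: check whether a given kernel driver module is loaded.
# "ath_pci" is a hypothetical example; substitute your device's driver name.

module_loaded() {
    # /proc/modules lists every loaded kernel module, one per line,
    # with the module name in the first column.
    grep -q "^$1 " /proc/modules 2>/dev/null
}

if module_loaded ath_pci; then
    echo "driver loaded"
else
    echo "no driver -- time to go hunting (and maybe compiling)"
fi
```

Harmless and read-only, but already two steps past where most Windows refugees are comfortable.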

After installing SUSE 10.0 some months back, I found that my PC Card wireless modem didn't work. After a significant amount of time searching, I found a rather lengthy procedure for installing a driver and then tweaking configuration files to make the thing actually connect, at half its normal speed. So I could run SUSE as long as I didn't want Internet connectivity.

Oh wait, the card worked in my Windows machine, so I set up Internet Connection Sharing. I then spent a couple of hours tweaking the wireless adapter, which also didn't work so well with SUSE, to connect to the ICS network.

I got it working but had a tough time imagining the average user doing any of that.

3. Linux can require the use of the command line. Okay, I sympathize with the author here. It's ludicrous that people are so thoroughly intimidated by typing a simple command in a DOS box, if we're talking Windows, or a terminal session. But, friends of Linus, that's the way it is, and distros like Ubuntu virtually advertise themselves as easy-to-use windowing environments, not windowing environments that require knowing a lot of command-line syntax.
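For the record, the "command line" that terrifies people usually amounts to one-liners like these (harmless, read-only examples of my own choosing):

```shell
#!/bin/sh
# A few harmless, read-only commands of the sort forum posts hand to newcomers.

uname -r        # print the running kernel version
df -h /         # show how full the root filesystem is
id -un          # show the current user's name
```

None of that breaks anything, but try telling that to someone who has never seen a blinking cursor on a blank screen.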

You want to spread to the masses, you've got to live with their frailties.

4. Something strange happened that doesn't happen in Windows. I'm with the author here, because things are going to break in any OS, and they'll break differently in Linux than in Windows. In fact, things break differently in Vista than they do in Windows XP. Actually, I've never heard this reason for quitting Linux before, but this guy writes user guides, so he's probably heard weirder ones than this.

5. I tried to get help online and got kicked in the teeth. Tough rocks, says the author. Well, the snobbishness of experienced Linux types has been legendary. Back in the days of the Usenet, Linux newsgroups were places newbies went to die, or at least suffer a lot of humiliation. If they got any advice at all, it was usually either couched in technical language beyond the user's skill level or simply, "Read the man page, moron."

Ironically, these same newsgroups would contain endless threads complaining about how Linux wasn't spreading like wildfire to the desktop. No one could understand why, but it had to have something to do with Microsoft's dastardly stratagems.

6. Some people just don't like it. This is another reason that the author pretty much says, well, if you don't, you don't. Personal likes are always going to enter into a user's decisions.

Hey, I don't like Vista, and I'm not all that crazy about XP, but I use it because that's what I need to run the apps my organization uses (see 1, above), and I want to be able to get drivers for my hardware (see 2). XP also works with little mucking about as long as one employs good security practices and keeps the junk software off the PC.

The trouble for Linux is that, if the user doesn't like it, he falls back to Windows. If the user doesn't like the latest version of Windows, though, he falls back to his current version until Microsoft comes up with something he can stomach.

7. Sometimes installations of Linux just go totally bonkers. Yes, this can happen with Windows, and it has. But, having installed all those previously mentioned distros, plus three flavors of BSD, I can state based on experience that weird ju-ju pops up more with Linux than with Windows. Ubuntu and SUSE have become pretty painless, but even those can act strangely at times, usually because of hardware issues.

I have had endless discussions over the last 10 years with colleagues about what it would take to move an organization to Linux. The same roadblocks always come up. We have software that is dependent on .NET or Microsoft SQL Server. It would cost tons to migrate it to Java and a Linux-based SQL database. We'd have to hire a squad of internal programmers to do what we could buy off-the-shelf in a Windows environment. We'd have a massive retraining program for users. We'd have compatibility problems with other organizations sending users Microsoft PowerPoint presentations, Word documents that wouldn't format properly in OpenOffice, spreadsheets with VB macros that wouldn't work, and on and on.

I'm not saying that Linux will never get into the enterprise. There are places where it has, but they are few and far between. Linux has had far greater penetration on the server end, particularly in the realm of web and FTP servers. Linux is also popular in the appliances used to provide proxy services, search services, and e-mail scanning. That smaller footprint and truly modular design make Linux a really good server OS.

But, if Linux is going to win the hearts and minds of the ordinary user, they're going to have to deal with the problems above, especially the driver and installation issues. Like it or not, sons of Torvalds, you're going to have to win over home users, and you're only going to do that by making Linux as easy to install and use as Windows.

Oh, and it wouldn't hurt to be nicer when responding to newbie questions.