Sunday, January 15, 2006

Farr Farr Away

Farr Ago News is taking a vacation. Posting will resume February 1st.

In the meantime, here's some food for thought:
False facts are highly injurious to the progress of science, for they often endure long; but false views, if supported by some evidence, do little harm, for every one takes a salutary pleasure in proving their falseness: and when this is done, one path towards error is closed and the road to truth is often at the same time opened.

- Charles Darwin, The Descent of Man, and Selection in Relation to Sex, 1871.

Amazing Stardust

Sometimes we shake our heads and wonder how humans could ever be so blindly destructive - of this planet and of each other. And other times we shake our heads in just the same way, and wonder at the truly extraordinary things we accomplish.

Early this morning, somewhere in the middle of a muddy Utah desert, a container from the Stardust explorer touched down and delivered its cargo of comet and star dust. This treasure trove of scientific data represents mission accomplished for the $212 million Stardust mission.
How did NASA spend that $212 million?
After its launch in 1999, Stardust circled the Sun three times and even flew by the Earth in 2001 for a gravity boost to rendezvous with the comet Wild 2 near Jupiter. The spacecraft came within 149 miles of the comet on Jan. 2, 2004, deploying shields to protect itself from cometary dust while extending a collector filled with a material called aerogel. This low-density silica material, called "glass smoke" because it is composed of 99.8 percent air, gently slowed and trapped particles without significantly altering or damaging them.

I confess that the process, even reading it a second time, inspires awe (no doubt in part because it's so far beyond my scope of technical understanding). In the blink of an evolutionary eye Homo sapiens has emerged from a small corner of Africa and left its fingerprints on this planet and this solar system in some truly incredible ways. The Stardust mission is only one gaudy example of the ingenuity and achievement of which we are capable.

In this space I spend more time than I would like bemoaning humanity's missteps, shaking my head in dismay. Every now and then it's nice to take a step back, look up at the sky, and shake your head in amazement.

Evolution vs. Intelligent Design: Distinguishing Criticism from Critical Analysis

The evolution vs. intelligent design debate flared up again this week in Ohio where the state school board narrowly approved (9-8) a science lesson on evolution that critics argue inappropriately and inaccurately questions evolutionary theory.

The Situation in Ohio

For those who want the full blow-by-blow account of what has happened to date in Ohio, a comprehensive version is available online, as is a shorter and somewhat differently partisan account from the Akron Beacon Journal.

What I’d like to draw attention to, however, is not the details of this particular case, which is being comprehensively covered on other blogs, including the Panda's Thumb. It’s indisputably important that intelligent design be kept out of science classrooms nationwide, from Dover to Ohio to California. But it’s perhaps equally important that, in the post-Kitzmiller fury to eradicate creationism and intelligent design from science curricula, proponents of evolutionary theory don’t push too hard in the other direction. And that is the focus of this post.
The Value of Critical Analysis

From the Akron newspaper account describing the controversy we get the following summary:
The state school board approved lesson plans last year that say students should be able to "describe how scientists today continue to investigate and critically analyze aspects of evolutionary theory."

This is, by all accounts, an incomplete and less than wholly accurate description of the curriculum in question. But that's not the instant point. What's useful about the passage above is that it illustrates just how delicate the current intelligent design vs. evolution controversy really is.

Consider: if the entire curriculum consisted of a requirement that students “critically analyze aspects of evolutionary theory” it should (and probably would) be acceptable to even the most ardent evolutionary biologist. Quite simply, mandating critical analysis is not the same thing as mandating criticism.

Science thrives on the push and pull of competing ideas, and while intelligent design has been thoroughly discredited as lacking any basis in empirical science, that does not mean that evolutionary theory, in all its nuanced complexity, is a sealed book.

The theory of evolution is well understood, but it is not yet fully understood in all its details and implications. And it is precisely the critical analysis (as opposed to blind criticism or, equally bad, unthinking acceptance) of evolution that biology courses are meant to engender in high school students who are (hopefully) budding scientists themselves.

The dangers posed by intelligent design, and other equally unsupportable examples of bad science, are real and demand that we be watchful to halt their spread. But arguing that intelligent design is bogus need not, and should not, entail the additional proposition that evolutionary theory is to be accepted without question or comment.

Darwin was right, but Darwin’s theory of evolution was also incomplete. It is only through critical analysis – which will often include some legitimate criticism – that students can hope to thoroughly understand, and one day even contribute to, the theory of evolution.

Science at the Frontier: Not Meant for the Textbooks, and that's OK

Nicholas Wade’s latest piece in the NY Times does a nice job distinguishing between “frontier science” and “textbook science”:
Textbook science is material that has stood the test of time and can be largely relied upon. It may include findings made just a few years ago, but which have been reasonably well confirmed by other laboratories.

Science from the frontiers of knowledge, on the other hand, is wild, untamed and often either wrong or irrelevant to future research. A few years after they are published, most scientific papers are never cited again.

In the post-Hwang cleanup there have been repeated calls to tighten the screws in the screening and review protocols at peer-reviewed scientific journals, including Science, where Hwang published his fraudulent results.
But as Wade aptly points out, frontier science is distinct from textbook science for good reason: it is designed to push the envelope. Though the filtering is imperfect, “this rough screening serves a purpose. Tightening it up, in a vain attempt to produce instant textbook science, could retard the pace of scientific advance.”

It’s important to take appropriate steps to weed out fraudulent science from legitimate and honest scientific research. But that does not (and cannot) mean that every paper published in journals like Science or Nature will one day contain scientific truth within the meaning of “textbook science.”

Wade, a longtime science reporter himself, argues that the media and the public, so eager to grasp hold of “the next big thing”, deserve some of the blame for what happened in South Korea, and share some of the responsibility for preventing it from happening again:
Tightening up the reviewing system may remove some faults but will not erase the inescapable gap between textbook science and frontier science. A more effective protection against being surprised by the likes of Dr. Hwang might be for journalists to recognize that journals like Science and Nature do not, and cannot, publish scientific truths. They publish roughly screened scientific claims, which may or may not turn out to be true.

What happened in South Korea is a black eye for scientific research, and in some respects it is tragic, but it is by no means fatal for scientific research at the frontiers of our knowledge.

Urging prudence is appropriate. Setting the initial threshold at the level of established textbook science would be imprudent in the extreme.

Saturday, January 14, 2006

One "Big Lie", One Little Victim

In the aftermath of the stem cell fiasco in South Korea, the real losers are not the stem cell and cloning researchers or the broader scientific community. The real victims are the countless patients worldwide who found, in the form of Woo Suk Hwang, a new reason to hope, only to see that hope dashed in a few whirlwind days.

Case A1 is the South Korean child known as “Donor 2.” Donor 2’s real name is Kim Hyeoni, a nine-year-old who was hit by a car while crossing the street last year. The story in the L.A. Times details the sacrifices that Hyeoni’s family made on his behalf, as well as the hopes that blossomed with Hwang’s seemingly meteoric scientific progress and were shot dead with the revelation that it was “all a big lie.”
Though it may take some time, the scientific community is robust enough to bounce back from this particular scandal. It is for patients like Hyeoni, and countless more like him who will go unnamed in the newspapers, that we should feel compassion and, more importantly, a sense of responsibility to respond to the lies and failures of Hwang with a redoubled effort to make real progress in developing viable stem cell based treatments.

Bigger Haystacks Require Smarter Needles

Earlier, in the context of reviewing Nature’s study comparing the accuracy of Wikipedia and Britannica, I wrote about the difficulties in distinguishing fact from error, and in labeling a quantum of information as "true." Since then, the Smoking Gun revealed that James Frey falsified large portions of his popular memoir. What is interesting is not that Frey exaggerated, deceived, and outright lied both in his book and to his fans. No, the big news is that nobody seems to care all that much.

The standard apologist line is that this was a book and what really counts is the story that it tells, whether or not the story is true. Ignoring all the obvious counter-arguments (the most damning being that the book, if fictional, should have been published as such), the reaction to Frey's induced revelations provides further indication that truth is not only subjective, but that it may not even be all that important.
Several of my recent posts – evaluating disputed information as either factual or erroneous, the lack of a correlation between popularity and quality in online information, the disclosure that our online first impressions happen far too quickly to give any weight to content or quality – draw on a similar theme: the transition of information to the online environment, combined with an unprecedented surge in the amount of available information, presents us with problems of previously unimagined complexity.

The online environment is dynamic and powerful and it allows information consumers to have more of the world than ever before at their fingertips…if they know what they’re looking for.

The metaphorical haystacks have long overflowed the barn and they’re starting to crowd out the farmers. We need better, smarter needles that will find their own way out of that haystack, and bring themselves to us.

[N.B. – I have reposted the discussion on fact and error in Wikipedia/Britannica to allow it to appear (for the time being) on the main page. The original post date was January 9th]

Intelligent Design, or "Cargo Cult Science"

Catherine Seipp has a very nice piece in the National Review online discussing, among other things, intelligent design and the late Caltech physicist Richard Feynman.

In addition to touting the achievements (and wit) of Feynman, Seipp puts together an interesting analogy between intelligent design and an anecdote from Feynman's writings about “cargo cult science.”
The problem with intelligent design, according to Seipp, is its continual failure to recognize that it cannot be validated other than by faith:
None of the “debates” about evolution vs. intelligent design I’ve encountered seem to be aware that a theory by definition cannot be scientific if its proponents will only accept one conclusion or result. Naturally, any human being has hopes and preferences, which is why the double-blind test (in which experimenters as well as subjects don’t know whether they’re getting medicine or a placebo) is the gold standard in drug research, even though of course it’s not always practical.

But however much researchers might hope they’ve found, for instance, a new cure, they do not set out to prove that a specific one has to work because their religion requires them to have faith in it. Not if they expect to be taken seriously as scientists.

How does this relate to Feynman, or to something as intriguing as South Seas “cargo cult science”? I’ll let Seipp finish with her analogy:
The title comes, as Feynman explains, from primitive people in the South Seas who’d experienced airplanes landing with useful things during World War II and wanted this to happen again. For years afterward, they would station a man in a wooden hut next to an abandoned runway, with wooden pieces on his ears like headphones and bamboo sticking out like an antenna. But even though he looked just like an air-traffic controller, and fires burned as guide lights just like they did before, still no planes came.

It’s not a new point. But as long as intelligent design proponents keep attempting to inject unscientific, religious-based “theories” into public schools (the latest attempt is in California), it’s a point that must continue to be made. And Seipp does an excellent job.

First Impressions Matter, Especially Online

More tangential support for the notion that, especially on the web, popularity does not strictly correlate with quality comes from a new study suggesting that potential readers form a lasting opinion about a webpage in just 50 milliseconds.

The study, by researchers at Carleton University in Ottawa and published in the journal Behaviour &amp; Information Technology, demonstrates that the brain is capable of evaluating a single webpage nearly as fast as the eye can take in the information.
While the researchers were surprised that humans could observe and process information in only 50 milliseconds ("My colleagues believed it would be impossible to really see anything in less than 500 milliseconds," says Gitte Lindgaard), it’s clear that there is no substantive evaluation happening at these speeds.

The article points out that:
The lasting effect of first impressions is known to psychologists as the 'halo effect': if you can snare people with an attractive design, they are more likely to overlook other minor faults with the site, and may rate its actual content (such as this article, for example) more favourably.

This is because of 'cognitive bias', Lindgaard explains. People enjoy being right, so continuing to use a website that gave a good first impression helps to 'prove' to themselves that they made a good initial decision. The phenomenon pervades our society; even doctors have been shown to follow their initial hunches, Lindgaard says, relying heavily on a patient's most immediately obvious symptom when making a diagnosis. "It's awfully scary stuff, but the tendency to jump to conclusions is far more widespread than we realize," she says.

The study lends further support to the hypothesis that, especially online, ranking and voting sites (such as Digg, or Google) are more representative of popularity contests than anything else, including the quality of the content that is voted to the top.

Just like in high school, it’s looks that really matter.

Indigo Children: Fact or Fiction?

Indigo children represent the newest big thing in parapsychology. The story (courtesy of the New York Times) relates that these children, with their indigo aura,
share traits like high I.Q., acute intuition, self-confidence, resistance to authority and disruptive tendencies, which are often diagnosed as attention-deficit disorder, known as A.D.D., or attention-deficit hyperactivity disorder, or A.D.H.D.

And they're also here to save the world.
Of course to skeptics the issue is not when these children are going to rid the world of terrorists and reverse global warming. It is whether the notion of indigo children, and for that matter parapsychology, has any basis in reality.

In an attempt to avoid any further backsliding into epistemic uncertainty (especially after my recent confusion concerning fact, error, and even "truth") allow me to assume, for the purposes of this post, that we have an adequate degree of competency in distinguishing those beliefs or assertions which have compelling evidentiary warrant from those that do not.

Evidentiary warrant is distinct from “truth.” For instance, we have good inductive evidence in support of the belief, often framed as a form of knowledge, that the sun will rise in the East tomorrow. By contrast, most of us could posit little support for the proposition that on Sunday, when the sun appears over the horizon, it will be wearing a giant pair of sunglasses. The former has a basis of some sort in reality; the latter is just amusing to think about.

So into which category do "indigo children" fall? All evidence points squarely in the direction of giant solar shades. Which is to say, there is nothing that a scientific empiricist would accept as support for the claims made on behalf of indigo children.

I understand, of course, that many people believe in claims for which there is decidedly little empirical support -- a belief in a certain instantiation of god chief among them. And I'm not suggesting that all beliefs held in the absence of a proper evidentiary basis are ultimately and fundamentally wrong. Quite to the contrary. It is (very) occasionally the case that some such beliefs are later discovered to have quite a firm footing in reality.

For instance, the theory of the origin of species by way of natural selection is attributed, more or less, to the work and thoughts of Charles Darwin. Which is as it should be. But Darwin was certainly not the first to propose a theory of evolution, or to suggest that human beings did not appear on Earth in their present form.

The pre-Socratic philosopher Anaximander (c. 611 BC – c. 547 BC), for instance, offered the theory that man and the other species had come to exist through a process of transmutation. Man himself was thought likely to have evolved from some other, aquatic, species of animal. Needless to say this notion did not gain popular acceptance for more than two millennia.

Am I suggesting, even for a moment, that this business of indigo children, and colored auras and whatnot, is well-grounded in reality? In a word, definitely not.

But history suggests that there are perhaps a handful of ideas and beliefs floating around out there that most rational people would find to be entirely incredible and yet, when all is said and done, will prove to have a much more substantial grounding in reality than would have ever been suspected.

Is this anything more than an accident of statistics? Probably not. But it's interesting to ponder which conspiracy theory, which pseudo-science, which mysterious story will be vindicated with time.

Friday, January 13, 2006

Accumulated Errors

The safest road to hell is the gradual one - the gentle slope, soft underfoot, without sudden turnings, without milestones, without signposts.

- C.S. Lewis

A recent article looks at the South Korea cloning controversy from a slightly different angle. Rather than focusing on the one blockbuster example of unethical scientific research that (fleetingly) captures the public's attention, the piece focuses on the system behind it.
In particular, the distribution system for scientific research grants is examined and found to be wanting. While the piece is short and correspondingly light on examples of, and solutions for, the problems with grants, it's a provocative read.

Thanks for the pointer.

Sorting the Wheat from the Chaff: The Wisdom of "Digging"

That’s the biggest challenge for an individual crawling through the modern World Wide Web. With the explosion of available information on almost every topic imaginable the modern information consumer must be efficient and discerning. There’s far too much to read it all, and we’d all like to productively invest the time we do spend reading and digesting quality information.

Earlier this week I wrote about the tension between large, active user bases and traditional notions of “truth,” in my discussion of facts and errors in Wikipedia and the Encyclopedia Britannica. Today, in a nice open-ended post, Gene Smith raises a related issue. Rather than worrying about the “truth” of online information, a la Wikipedia, Smith wonders how we can discern between the “quality” and the “popularity” of information.
The Problem

Smith gently criticizes Digg, one of the most popular social bookmarking and blogging sites, as being overly attuned to popularity, and not focused enough on quality. You can argue that sites like Digg represent the ascension of quality information – users vote up quality stories and commend them to the attention of the time-strapped information consumer – but I, like Smith, am skeptical of this claim because I believe there is a difference between quality and numeric popularity (i.e., a large number of votes).

But therein lies the rub: How do you distinguish between the two? Smith proposes an admittedly vague “betting” scheme that would pit different stories against each other. Another solution is the one that Google employs – effectively weighting certain votes (in the form of links) more heavily than others, a highly undemocratic system but one which works in many situations.
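
Google's link-weighting can be sketched in miniature. The toy ranker below is my own illustration in the spirit of PageRank, with an invented link graph; it is not Google's actual algorithm, and all the page names are made up.

```python
# A toy version of link-weighted voting: a page's score depends not just
# on how many links (votes) it receives, but on the scores of the pages
# casting them. The link graph below is invented for illustration.

def rank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page gets a small baseline, plus shares from its linkers
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * score[page] / len(targets)
                for t in targets:
                    new[t] += share
        score = new
    return score

links = {
    "a": ["hub"], "b": ["hub"], "c": ["hub"],  # three pages endorse "hub"
    "hub": ["story_x"],       # the well-linked hub endorses story_x
    "minor": ["story_y"],     # an obscure page endorses story_y
    "story_x": [], "story_y": [],
}
scores = rank(links)
```

Note that story_x and story_y each receive exactly one incoming link, yet story_x ranks higher, because its single vote comes from a page that is itself heavily linked. That is the undemocratic weighting in action.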

But neither of these solutions addresses the problem that the Wikipedia / Britannica debate makes clear. Any label, whether it be one of “fact” or of “quality”, that derives its warrant largely from strength of numbers (whether they are links, or votes, or bets, or anything else that can be tabulated) is inherently threatened by what John Stuart Mill termed the “tyranny of the majority.”

Do You Vote Before You Think?

In his essay On Liberty, Mill points out that the democratic majority is dangerous just because it does not require a ruling tyrant in order to act tyrannically. “Society can and does exercise its own mandates; and if it issues wrong mandates instead of right…it practices a social tyranny more formidable than many kinds of political oppression…”

While it is true that this risk – the runaway majority – is run by every committedly democratic institution, the online democracy, embodied by sites like Digg, poses additional dangers above and beyond those that attend more traditional democratic activities, such as political elections. First, and most obviously, is the problem of anonymity and the attendant risk of cheating. Much of the voting that happens online is far from transparent and, unlike in political elections, there are no recounts.

Second, and more debatably, is the relative prevalence online of the “herd” mentality, which Smith’s post draws attention to. At first blush it appears to me that, in an environment of hyperlinks, dynamic content, and instant voting, there is a tendency for votes to be cast with less consideration. The effort of voting (pasting a link or “digging” a story) is minimal and the consequences of “mis-voting” are seemingly inconsequential.

But the aggregate result of unconsidered “votes” may yet be substantial. “Popularity” is frequently a self-fulfilling and self-referential phenomenon and the ease of voting, combined with the “herd” mentality (“if 463 people think this story is cool then it must be cool. I’m going to vote for it too”) might make a substantial impact on the ultimate popularity of a quantum of information, independent of its quality.
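
That self-reinforcing dynamic is easy to sketch as a toy simulation. Everything below – the quality scores, the fraction of herd voters, the voting model itself – is an invented assumption, meant only to illustrate how popularity can decouple from quality.

```python
# A toy simulation of herd voting: each arriving voter either follows the
# crowd (votes for the current leader) or votes independently, in
# proportion to the stories' underlying quality. All parameters invented.
import random

def simulate(qualities, voters=10_000, herd_prob=0.7, seed=1):
    random.seed(seed)
    votes = [1] * len(qualities)  # each story starts with its submitter's vote
    for _ in range(voters):
        if random.random() < herd_prob:
            # "463 people dugg it, so it must be cool"
            choice = max(range(len(votes)), key=lambda i: votes[i])
        else:
            # independent judgment: weighted by actual quality
            choice = random.choices(range(len(qualities)), weights=qualities)[0]
        votes[choice] += 1
    return votes

# Three stories; the last has the highest quality.
votes = simulate([0.2, 0.3, 0.5])
# Whichever story grabs an early lead attracts the herd votes and
# snowballs, so the final tally can bear little relation to quality.
```

The lock-in is the point: with enough herding, the winner takes an overwhelming share of the votes, and which story wins is largely an accident of the first few ballots.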

I’m not suggesting that the majority of Digg users vote multiple times, follow the crowd, or are anything less than considerate community members who vote for quality, however they personally define it. But, certainly, some voters do fit this profile.

The question is how many? If the number is significant – and even a relatively small minority can be significant where the tolerance for error is small – then we have reason to share Smith’s concern that there may be a disconnect between popularity and quality.

The Solution (?)

The United States does not possess anything like a voter qualification test. The only thing we do have is a law that requires, in most political elections, that the voter be 18 years or older in order to vote. One (but not the only) rationale for this requirement is that individuals, having reached the age of majority, can by and large be expected to vote intelligently and responsibly. You can take issue with whether or not people in this country vote at all “intelligently”, but that at least is the claim.

In addition almost any election that requires you to walk into a voting booth and fill out a ballot is also going to require proof that the election is applicable to you in some meaningful way. Frequently, but not always, this is established through a residency requirement. For instance, citizens of Massachusetts can’t vote in the school board elections in Kansas, no matter how much they might wish it. Similarly, citizens of foreign countries can’t vote in American presidential elections. And so on.

Online we have nothing of the sort. Anybody who can use a mouse and has an internet connection can vote on sites like Digg. While there are (necessarily incomplete) protections in place to discourage multiple voting and other forms of cheating, there is nothing remotely resembling a voter qualification requirement, either of the residency or the age of majority variety.

Does this ultimately matter? I’m not sure, nor am I sure what form such an online voter qualification would take. Screening or barring voters because they are “unqualified” would be an incredibly tricky proposition and one that I’m uncertain would do much to improve the situation.

But in a firmly democratic environment such as Digg, where the “popularity” and “quality” of the “elected” information diverges, it’s certainly some food for thought….

At Least We Still Have Snuppy

Snuppy, the cloned Afghan hound produced by South Korean researcher Woo Suk Hwang, is actually a clone. Unfortunately, that's about the only piece of Hwang's once celebrated breakthroughs left standing after the flurry of retractions and admissions of guilt that have been coming fast and furious for over a month now.
I've left this topic largely untreated as it's developed for two reasons: 1) it's getting plenty of coverage everywhere else, and 2) it was (and still is) a rapidly developing story that often rendered statements, and their attendant commentary, irrelevant almost immediately.

For those who want a complete blow-by-blow account of the entire Hwang controversy you'll find one here. And for those who wonder what will remain of stem cell research after the dust has finally settled in South Korea, I commend your attention to a recent piece in Forbes magazine (thanks for the pointer).

The basic premise of the piece, with which I agree, is that the state of stem cell research now returns to its pre-Woo Suk Hwang state. Which is to say, there's still a lot of work to do. What the cloning controversy almost certainly won't do is cripple funding or public support for stem cell research. While increased skepticism is sure to greet future breakthroughs in the field, the potential promise of therapeutic cloning is too great to allow one unfortunate incident to derail the entire field.

As David Magnus, director of the Stanford Center for Biomedical Ethics put it: "It's certainly disappointing that we're back to where we were, but it's not a disaster."

Update: The journal Science has formally retracted both of Hwang's articles.

The Future of Nanobiotechnology

The Interdisciplinary Center for Technology Analysis &amp; Forecasting (iCTAF), an Israeli think-tank at Tel-Aviv University, has forecast the future of nanobiotechnology. The survey (available only as a .pdf) summarizes the predictions of 139 experts from 30 countries who were asked to assess the prospects of 20 future developments in the field.
The experts were asked, among other things, in what year they envisioned specific technologies becoming available. Among the developments we can expect to see in the short term (before 2010) are biodetection with smart adaptable nanosurfaces, and nano-agents for analysis inside cells.

The value of such predictions is debatable but they're intriguing nevertheless. Plus, it's fun to imagine that by 2016-2020 we'll be able to engage in in vitro construction of artificial human organs.

Grading Wikipedia: A Closer Look at the Scores

Last month the journal Nature published a study comparing the accuracy of the online, user-edited encyclopedia Wikipedia with the traditional research staple the Encyclopedia Britannica. In a recent article in the New York Times, George Johnson goes behind the Nature numbers and reveals that the competition between the two is simply too close to call.

The Nature study found an average of four errors per article in Wikipedia, compared to three in Britannica. But a close examination of those errors leads Johnson to ask, "Just what counts as an error?"
In many of the situations that Johnson investigates, the error represents little more than a matter of judgment. For instance, Britannica referred to "Croton" as the home of the Greek mathematician Pythagoras. Other options included "Crotone", "Crotona", and "Kroton".

Without digressing too far into the metaphysics of truth it seems to me that, if not now then someday, tools like Wikipedia may legitimately act as their own authority. A trivial example: What is the proper pronunciation of the town "Edinburgh"? The answer, of course, is it depends. If you live in Scotland it's Edin-burrow (well, it is if you're an American in Scotland) and if you live in Indiana it's likely to be Edin-berg. Which is "right" and which one is an "error"? You can argue all you want about historical roots and proper pronunciation but I don't suspect there really is a "right" answer here.

What's the point of all these burrows and bergs? For years institutions like the Encyclopedia Britannica have operated as a factual authority, lending the descriptor of "truth" to what were really nothing more than discretionary judgments. Measured and well-supported, but judgments not truths nonetheless.

Wikipedia will not fundamentally change that. There is never going to be one "true" spelling for the home of Pythagoras, or one "true" pronunciation of "Edinburgh." But what there will be is a vote by a majority, rather than a fiat issued by the authority.

Johnson titles his article "The Nitpicking of the Masses vs. the Authority of the Experts." What the Nature study really represents is an evaluation of the Authority of the Masses vs. the Authority of the Experts. It won't be long until resources like Wikipedia can lay legitimate claim, in matters of factual judgment, to a more persuasive authority than even a journal like Nature.

Thursday, January 12, 2006

Spam Smarts?

Charles Arthur, writing in today's Guardian, points out that spam is down and argues that the reason is two-fold:

1) Anti-spam technology has become more sophisticated.
2) People are now less likely to be fooled by spam, making it less profitable.
The first reason I can buy. And for many people, although not all, spam blocking software (whether you know it's there or not, it is, and it's stopping spam from reaching you) has made an enormous difference in what their inboxes look like on a day to day basis.
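
The sophistication in question is largely statistical. Here is a minimal sketch of the "Bayesian" word-frequency filtering popularized around this time; the tiny training corpus is invented for illustration, and real filters train on thousands of messages.

```python
# Score a message by how much more often its words appear in known spam
# than in known legitimate mail ("ham"). Training messages are invented.
import math
from collections import Counter

spam_train = ["cheap stocks boom now", "easy money fast cash now"]
ham_train = ["meeting notes attached", "lunch plans for friday"]

def word_counts(messages):
    return Counter(w for m in messages for w in m.split())

spam_words = word_counts(spam_train)
ham_words = word_counts(ham_train)
spam_total = sum(spam_words.values())
ham_total = sum(ham_words.values())
vocab = len(set(spam_words) | set(ham_words))  # distinct words seen

def spam_score(message):
    """Log-odds that the message is spam, assuming equal priors and
    add-one smoothing so unseen words don't zero out the product."""
    score = 0.0
    for w in message.split():
        p_spam = (spam_words[w] + 1) / (spam_total + vocab)
        p_ham = (ham_words[w] + 1) / (ham_total + vocab)
        score += math.log(p_spam / p_ham)
    return score  # positive leans spam, negative leans legitimate
```

Words like "stocks" and "cash" push the log-odds toward spam, while ordinary office vocabulary pushes the other way; production filters layer blacklists and header analysis on top of the same principle.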

But the second reason I find harder to swallow. Arthur writes that savvy emailers have become adept at filtering the credible mail from the spam.
Similarly, common sense suggests that unheard-of stock probably won't boom; and I think that after the stock market deceptions of the 1990s, people won't get fooled again. (The volume of spam touting stocks doesn't necessarily indicate that people are buying them - only that people trying to pump them have hired a spammer to spread the word.)

I'm sorry, but I disagree. As long as there are people offering to make you rich for a few easy mouse clicks or a small up front investment there are going to be people willing to accept (and to empty out their wallets in the process).

While it's true that we may be slowly developing a form of institutional knowledge that allows us, as email users, to implicitly identify and reject spam in our inboxes, I can hardly agree with Arthur's conclusion: "Spam hasn't been solved. But I think our attitude to it could be."

To quote the great American showman P.T. Barnum, "there's a customer born every minute." Barnum is often incorrectly believed to have said "there's a sucker born every minute." From the point of view of the spammer, it's not clear that there's all that much difference.

Ants Go to School Too

Members of the ant species Temnothorax albipennis engage in teacher-pupil interactions. Researchers at the University of Bristol, who are publishing their findings tomorrow in the journal Science, claim that such instruction has never previously been documented outside of human beings.
The ants engage in a form of teaching known as "tandem running" whereby one ant teaches another how to find newly discovered sources of food. According to the researchers it is the give and take of this behavior, which is costly to the individual "teacher" ant, that distinguishes it from mimicry, found in many species.

For now the ants' educational offerings appear confined to "Introductory Tandem Running." No word yet on whether there are any plans to introduce "Intelligent Design" to the curriculum...

The Norwegian "Doomsday Vault"

The Norwegian government, like many people in this country, has been contemplating doomsday. However, rather than worry over trivialities such as heaven, hell, and the state of their immortal (national) soul, the Norwegians have decided to build...a seed bank.

What were you expecting, a bigger studio for Pat Robertson?
Next year, on an Arctic island, Norway will begin construction on its "doomsday vault." Housed inside a bunker, inside a mountain, the vault will contain all known varieties of the world's crops.

The Arctic site was chosen on the rationale that permafrost will keep the temperature below freezing and ensure the longevity of the seeds. Norway, unlike the United States, has ratified the Kyoto Protocol, so I assume they have acknowledged the reality of global warming and factored in rising Arctic temperatures.

Although the BBC article gives no indication of who thought up this idea, it's not a bad one. In case of a global calamity, provided that life is not totally obliterated, such a reserve could prove invaluable.

But the idea is also mildly unnerving. Does the Norwegian government know something we don't, or, more likely, are they simply reading the same newspapers and drawing their own (not implausible) conclusions about the future of humankind?

Scientists: "Mad, bad, and dangerous to know"

That is the public perception of the modern scientist. Drawn from 1960s iconography, it is one part Albert Einstein, one part Dr. Frankenstein, and zero parts cool.
There is, perhaps unsurprisingly, a rather wide disconnect between the public's perception of scientists and scientific research and reality. "The image of the mad scientist, free to do his own thing in a laboratory near you, is a far cry from the reality of scientific life, which is dependent on rather more mundane concerns."

Overcoming this gap will require a commitment from the scientific community to communicate with the public, and to make themselves, as well as their research, accessible. And, in turn, the public needs to make an effort to purge itself of what sometimes appears to be a willful ignorance of scientific evidence.

I wonder, more than half seriously, whether this isn't one information disconnect that neither side is overly anxious to correct. The image of the scientist, mad as a hatter, running around inside a laboratory as beakers bubble over and lightning flashes, is a clear myth. As is the belief that scientific research has become so esoteric and specialized that it is entirely inaccessible to the public.

But if the public isn't willing to put forth the effort to learn, and if the professional scientists would prefer to be left to themselves (preferably in relatively clean, sunny labs), then who is there to put a stop to the perpetuation of these myths? And does anybody even really care?

Identity Theft: What if you lose your genome?

Israeli researchers are investigating techniques to encapsulate an individual's entire genome on a computer-swipeable card. The idea isn't a new one, and your individual genetic sequence isn't going to be showing up on a specially-issued AmEx card anytime soon.

Still, if you thought losing your wallet was bad before, imagine what identity theft will be like when your wallet contains literally every piece of (genetic) information about you...

Wednesday, January 11, 2006

Intelligent Design: Back from the Dead?

Intelligent design is back in the courts. Again.

The setting this time is the rural California town of Lebec, just north of Los Angeles. In Lebec a group of parents are seeking a temporary restraining order to prevent Frazier Mountain High School from teaching a course on intelligent design, creationism and evolution that they allege is a thinly disguised vehicle for the promotion of intelligent design.
Do the parents have a valid complaint or are they overreacting? While it's impossible to evaluate from afar the intent of the school district or the proposed teacher, who, incidentally, is married to the pastor of the local Assemblies of God church, the syllabus did promise to bring two evolutionary experts in as guest speakers for the class.

One expert, a local parent and scientist, had previously refused the invitation. He is now one of the parties suing the school district.

The other scheduled guest expert was Francis H.C. Crick, renowned biologist and one of the original discoverers of the structure of DNA. Unfortunately, Dr. Crick died in 2004, otherwise he might be suing the school district as well.

Creative Math

A Phoenix municipal judge has ruled that fetuses do not count as passengers for purposes of determining carpool lane eligibility. Unfortunately, the ruling came too late in the afternoon for the senate to work up any questions for Supreme Court nominee Samuel Alito on the matter.

Monday, January 09, 2006


British secret agent, member of the genus Mus, and sacker of homes.

Brokeback Mormons

A Utah movie theatre owned by the owner of the NBA's Utah Jazz has pulled the film "Brokeback Mountain" from its schedule. While the theatre management refused to comment, "Gayle Ruzicka, president of the conservative Utah Eagle Forum, said not showing the film set an example for the people of Utah."

Indeed. And that message is: 'We see the narrow-mindedness of those people in Kansas, and we will not let it go unanswered. We can be narrow-minded too!' While I'm frankly surprised that something like this didn't happen earlier (probably because the general conservative strategy has been to ignore the film and hope that it goes away), that doesn't make it any better.
As an advance apologia for my indictment of the entire state of Utah: it's a useful shorthand, and I recognize its limitations. I certainly don't believe that Ms. Ruzicka represents the views of everyone in the state of Utah, just as I don't believe Sam Brownback represents the views of everyone in Kansas. Nevertheless, it is these kinds of people who are acting as the spokesmen for their states. And they will be treated as such.

For those who don't share their views and object to such de facto spokesmanship: don't worry, you're undergoing the same thing on a much larger scale all around the world. Remember, for many people in this world, whether you are a resident of Utah or California, Kansas or Massachusetts, you are an American. Which makes George W. Bush your leader, spokesman and, for many people, your ideological and philosophical mouthpiece.

You know - Mr. Brownback actually is starting to look not half-bad.

Sunday, January 08, 2006


The American Dialect Society today voted "truthiness" its word of the year for 2005. We can thank Stephen Colbert, of the Colbert Report, for coining this term, which means "the quality of stating concepts or facts one wishes or believes to be true, rather than concepts or facts known to be true."

An appropriate choice to memorialize a year (really, it's been longer) gone by in which facts became increasingly optional, and nobody in Washington appeared particularly concerned. Now we turn our sights to 2006 and wonder what this coming year will bring.
Here's an early entry to the field: "circumlegally."

The term refers to "the ability to skirt or bypass existing legal requirements by successfully arguing that the behavior or action was appropriate because either a) the law, in the interest of national security, doesn't apply to you, b) the law doesn't mean what the rest of the country thought it meant or c) it is what the law ought to allow anyway."

FAA: Terrorists not to be allowed in outer space

Draft regulations issued by the FAA include the suggestion that space tourism companies screen potential passengers with the Department of Homeland Security to avoid carrying any potential terrorists into outer space.

Although I find it slightly ludicrous to be worrying about "space terrorism," the FAA regulations are another encouraging (and legitimizing) step along the road to full-fledged space tourism.

More Google, More Questions

The behemoth that is Google, now that it's publicly traded, ripples the waters in both the technology and the financial spheres. A column in today's New York Times wonders if Google is riding a financial bubble that is bound for the bursting. That's more or less what I asked yesterday with respect to Google's technological market ascension.

Bush Proposes Broader Language Training

Educational programs to teach language in public schools are going to get more funding. Why? Because it's a national security concern.

I should've guessed.
Whatever. More money ($114 million) for language instruction is a good thing, regardless of the motivation behind it. Here's how Bush voiced his support for the new initiative:
"You know, when somebody comes to me and speaks Texan, I know they appreciate the Texas culture. When somebody takes time to figure out how to speak Arabic, it means they're interested in somebody else's culture."

It's not clear whether "Texan" is going to be one of the languages receiving increased funding. Frankly, that might not be a bad use of resources: it might help decipher exactly what Bush is saying some of the time. To wit:
"Neither in French nor in English nor in Mexican." — George W. Bush refusing to answer reporters' questions at the Summit of the Americas, Quebec City, Canada, April 21, 2001

If Bush has mastered Texan and dabbled in Mexican, is he going to take an interest in yet another culture through the study and practice of language?

Anyone know how to say "not bloody likely" in Arabic?

Spying On the Cheap

Forget using the NSA for wiretaps. Forget using the Department of Homeland Security to monitor mail. Both are large government agencies with hefty overhead costs. If you really want your spying done economically then, like anything else, you should be doing it yourself.

According to an article in the Chicago Sun-Times dozens of online services are selling complete phone records to anyone who submits a phone number and ponies up $110. Unbelievable, but true.
Perhaps the most disheartening fact is that Congress and the Administration have been aware of this "most powerful investigative tool" for months. Rather than repeat the myriad ways in which such a service can be (and assuredly is being) abused, I'll leave that to AMERICAblog and the rest of the blogosphere.

Boys Being Boys, Hurting Their Sons

The Guardian reports on a study by Marcus Pembrey, a clinical geneticist at the Institute of Child Health in London, which suggests that the behavior of pre-pubescent boys can have a negative impact on the health of their future sons.

Specifically, Pembrey found that men who took up smoking before puberty fathered sons who were overweight compared with other children. While not quite Lamarckian in scope, the study, if confirmed by future findings, is certainly an eye-opener. Boys may be boys, but if Pembrey is right, they may want to wait until after puberty to start smoking.

Saturday, January 07, 2006

Spy vs. Spy You

By and large I haven't had much new to say about the NSA wiretap controversy. Basically, it's a mess - both legally and politically - and still very much in its infancy.

However, if I may, I'd like to draw your attention to Frank Rich's most recent column in the New York Times. Rich asks a question that I'd like to hear answered: Just who did we think we were fooling by keeping the NSA wiretaps secret?
Rich writes thusly:
If fictional terrorists concocted by Hollywood can figure out that the National Security Agency is listening to their every call, guess what? Real-life terrorists know this, too. So when a hyperventilating President Bush rants that the exposure of his warrant-free wiretapping in a newspaper is shameful and puts "our citizens at risk" by revealing our espionage playbook, you have to wonder what he is really trying to hide. Our enemies, as America has learned the hard way, are not morons. Even if Al Qaeda hasn't seen "Sleeper Cell" because it refuses to spring for pay cable, it has surely assumed from the get-go that the White House would ignore legal restraints on eavesdropping, just as it has on detainee jurisprudence and torture.
That's a very good point. Maybe there are (or were) terrorists out there that believe their communications to be safe. But do we really think that's because they thought that the government, though it had the capability to monitor their communiqués, was refraining from doing so because it was illegal?

As Rich points out, that's a bit of a stretch. Actually, it's not even a stretch, it's utterly unbelievable.

So why the secrecy of the NSA wiretaps? Rich has his own list of theories, which is far from exhaustive. While he suggests that the wiretaps might have been put in place to "eavesdrop on American journalists and political opponents," he ignores what strikes me as one of the most likely explanations: the administration and the NSA thought they needed the wiretaps to monitor terrorist activity, but didn't think they could get the courts and the Congress to authorize them. In many respects that scenario fits most closely with the administration's behavior since 9/11.

Regardless, we're unlikely to ever know the real motive behind the NSA wiretaps. But the purported secrecy rationale - that to conduct the wiretaps openly would have tipped off the terrorists - is an absolute joke.

A Googol of Google News

I love Google. They make great software and tools of all kinds, and I've never paid a penny for any of it. And guess what? I'm not the only one. If you look at the main page of /. right now you'll notice that three of the ten articles are about Google. Google's migrating to phones. Google unveils the "Google Pack." Google announces a video store.

What's so unusual about this? Nothing. And that's exactly the point. Google has become such a behemoth that their every move is newsworthy. And so I wonder...
How long can Google's ascension last? And, though it's blasphemous to even speculate, will we ever grow tired of seeing Google starring as technology's White Knight? Not that I am. Like I said, Google makes great products and gives them away. What's not to like, right?

Mark Twain wrote that "Truth is stranger than fiction, but it is because Fiction is obliged to stick to possibilities; Truth isn't." In a fictional world Google would be easing us into a complacent dependence upon all things Google, preparing to take over our technological lives. Stranger things have happened and, if Mr. Twain is right, they probably will here too.

Coming Clean on Domestic Spying

As the hullabaloo over the NSA spying controversy continues to boil, another government agency is trying a different tactic: spying openly. The Department of Homeland Security has acknowledged that it "can, will and does open mail coming to U.S. citizens from a foreign country whenever it's deemed necessary."

Which, I wonder, is worse: spying on the sly or spying that everyone can see, and nobody can stop? With the threat of democracy terrorism in this country greater than ever, it's a prudent administration that covers its bases by spying every which way imaginable.

A Piece of the Pleistocene, in Florida

Last month we reported on two proposals to develop Pleistocene Parks - one in the United States and one in Siberia - where species long since extinct in those parts of the world would be restored.

While it's not quite the same thing as the proposed "Pleistocene re-wilding," Lion Country Safari has been bringing African megafauna face to face with Americans since 1967. Lion Country Safari, and other similar endeavors, are in the news because, after almost four decades of almost totally unencumbered encounters between lions and men, they've been forced to build a fence.
Lion Country Safari cites the need "to keep rule-breaking visitors from becoming lunch" as the impetus for the chain-link fence that now separates lions from auto-ensconced tourists.

Too bad. I'd been not-so-secretly hoping that re-introducing African megafauna to North America would exert a certain selection pressure on those dimmer denizens inclined to imagine that cuddling up to a five-hundred-pound carnivore might make a nice photo op.


There is a new frontier in medical ethics: the morgue. According to an article in the Chicago Sun-Times researchers are increasingly interested in using the recently departed in all manner of trials that could not be safely or ethically conducted using the living.

And with this development several ethicists have been dragged away from contemplating the thorny question "When does life begin?" to consider when it ends, and how science should proceed after it does.
Determining that somebody is dead is not always a straightforward proposition. As Robert Sapolsky writes, "cultures...differ as to when they decide someone is good and dead. And sometimes, individuals whom we would consider robustly alive are considered dead." ("Monkeyluv", 189-190). That's sufficient reason to make pronouncements of death with care, especially before proceeding with scientific research.

But even in those cases where the person is unambiguously dead - don't think Terri Schiavo, think rotting corpse - it goes without saying that the dead should be treated with dignity and with respect. Indeed, the suggestion that "respect for persons, a pivotal principle in research ethics, should be extended to the recently dead," is a key component of a new set of voluntary research guidelines. But with all due respect to both the living and the dead, recently or otherwise, I have to wonder, "why?"

Why is it that the dead are worth - both emotionally and financially - so much to us? We cringe at the thought of their dissection, and spend small (and sometimes very large) fortunes to entomb them in style. Why such a large investment for these no-longer-people? According to the National Funeral Directors Association, the Funeral Industry alone generates $11 billion in annual revenue. Why not save our tears, and our money, to benefit the living (or future generations)?

I strongly suspect that there are colorable anthropological explanations for our treatment of and attitudes toward the dead. I also suspect that, like so many behaviors and attitudes that are so obvious and widespread that they are rarely if ever questioned, many of these may now be anachronistic and, at least in some respects, unnecessary. As long as we're bringing in the ethicists to scrutinize how science handles our dead, mightn't we ask them to scrutinize, at the same time, the unquestioned assumptions underlying our own treatment of the dead?

Friday, January 06, 2006

Stem Cells In Their First Stockings

That's what many newborns received this year, after Smart Cells International began selling gift vouchers for a procedure to extract and store umbilical cord stem cells. While the tiny gift - frequently given by grandparents to their grandchildren - made a novel holiday present, with a price tag of over $2,000 it hardly qualified as a stocking stuffer.

The Birds and the Bees (and the Hornets)

If you've grown weary of the coverage of the war (Lingering conflict? Insurgent uprising? What is it being called now anyhow?) in Iraq, how about a war between bees and hornets instead?

This may not be "newsworthy", but it is incredibly cool. Check out more amazing footage at National Geographic's top ten videos of 2006.

The Ultimate Thrill Walk

Grand Canyon Skywalk Photo
That's what the Hualapai Indians are promising to deliver this summer: a glass walkway extending 70 ft. out over the edge of the Western rim of the Grand Canyon.

Imagine staring straight down at the Colorado River, 4,000 feet below, with nothing but you, some glass, and a whole lot of very empty space. Gives you vertigo just thinking about it.

Among the many other buildings to be located at the Hualapai tourist complex will be a restaurant. No word yet on whether the restaurant will serve meals only after the tourists have completed their walk.

Thursday, January 05, 2006

Thinking Small, Thinking Ahead

That’s what Davis Baird and his team at the University of South Carolina NanoCenter are doing. Baird, a philosopher by trade, isn’t interested in how many angels can dance on the head of a pin – probably because pins aren’t small enough. Baird is interested in nanotechnology, specifically the impact of emerging and imagined nanotechnologies. Baird’s prophylactic mission – the goal is to uncover (potential) social and ethical problems posed by nanotechnology – is an excellent idea, and both he and USC deserve to be applauded for their forward-thinking.

The Most "Exciting" Sport

What makes sports exciting? A recent BBC article tells us that a group of scientists from Los Alamos National Laboratory have come up with a one word answer: upsets.

The researchers concluded that the “upset frequency” of the major sports is highest in soccer (or football, if you’re the BBC) and, therefore, that soccer is the most exciting sport. Any definition of what makes sport “exciting” is bound to be largely arbitrary. Nevertheless, it strikes me that there are good reasons to reject “upset frequency” as a synonym for “excitement.”
From all indications (I haven't seen the actual published study), the study treats an upset as any outcome in which a lower-ranked team triumphs over a higher-ranked one, no matter how close the two teams' records. So, for example, a baseball team with a 79-80 won-loss record that defeats another baseball team with an 80-79 record will record an "upset." This, as almost any sports fan worth his or her giant foam finger will quickly recognize, hardly corresponds with the colloquial usage of "upset."
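A minimal sketch of that metric, as I understand it from the coverage (the function name and the sample records are my own invention for illustration, not taken from the study):

```python
def upset_frequency(games):
    """Fraction of games won by the team with the worse winning percentage.

    games: list of ((wins_a, losses_a), (wins_b, losses_b), winner)
    where winner is 'a' or 'b'. Any win by the statistical underdog
    counts as an "upset", no matter how close the two records are.
    """
    upsets = 0
    decided = 0
    for (wa, la), (wb, lb), winner in games:
        pct_a = wa / (wa + la)
        pct_b = wb / (wb + lb)
        if pct_a == pct_b:
            continue  # no favorite, so no possible upset
        decided += 1
        favorite = 'a' if pct_a > pct_b else 'b'
        if winner != favorite:
            upsets += 1
    return upsets / decided if decided else 0.0

# The 79-80 team edging the 80-79 team counts exactly the same as a
# 55-104 team toppling a 104-55 team:
games = [((79, 80), (80, 79), 'a'),    # "upset" by a hair
         ((55, 104), (104, 55), 'a'),  # genuine upset
         ((104, 55), (55, 104), 'a')]  # favorite wins
print(upset_frequency(games))  # 2 of 3 decided games count as upsets
```

The hair's-breadth "upset" and the genuine shocker contribute identically to the frequency, which is precisely the objection raised above.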

Furthermore, there’s little chance that such a hypothetical game - with those records it was almost certainly a meaningless late-season game between two mediocre teams - proved "exciting" to the spectators in attendance. If the game was exciting it was probably because the final score was 12-11, or because something unusual or noteworthy happened, not because the "underdog" won.

Equating "excitement" with "upset," without considering spectator expectations about the result, is the functional equivalent of rewarding parity (or mediocrity?) by labeling it "exciting." The average spectator recognizes that, in the example above, the outcome is more or less a coin toss; whereas, if the two teams sport 104-55 and 55-104 records, respectively, there is a much more substantial expectation as to the result.

And, perhaps, it actually is the case that sports with a high degree of parity are more "exciting." Casual observation suggests that, in many respects, this is the direction in which several of the major sports in this country have moved in the past decade. But if parity corresponds to excitement, there is undoubtedly more to the equation than mere won-loss records. The closeness of the result and the general improvement of play that increasing parity represents are both much more likely candidates to contribute to a sport's "excitement" level.

So, what's the conclusion? The Los Alamos study has a neat little premise – “what makes a sport exciting?” – but is executed in a meaninglessly simplistic fashion. Better luck next season...

Wednesday, January 04, 2006

What Makes a Good Book?

Ever read a "critically acclaimed" book and wondered what all the fuss was about? Can't understand why "Paul Clifford" didn't snag a Pulitzer? You're not alone in your literary confusion.

As noted in today's New York Times, twenty of twenty-one publishers and agents rejected the opening chapters of two books submitted by the Sunday Times of London. The catch? Both books were Booker Prize winners in the 1970s.

Does this mean that publishers and agents can't tell good writing from bad? Maybe. Is this evidence of corruption and scandal in the awarding of 1970s-era Booker Prizes? Probably.