Wednesday, December 21, 2005

Happy Holidays

Farr Ago News is farr away on vacation.
The news will resume in 2006.

Tuesday, December 20, 2005

An Intelligent Decision

Judge John E. Jones III released his much anticipated decision today in the case of Kitzmiller v. Dover Area School District, otherwise known as the "Intelligent Design Case." In summarily (if you can call anything that is 139 pages long "summarily") dismissing the defendants' claims that Intelligent Design did not violate the constitutionally required separation of church and state, Jones was unsparing in his criticism. Illustrative of the majority of the opinion was a footnote, on page 26, in which Jones noted that the "Defendants' argument lacks merit legally and logically."

Due to time and physical constraints I'm going to withhold most of my comments on the decision for the time being. But I will say that, beyond everything else, I am deeply ambivalent about this opinion. On the one hand, I feel both vindicated and relieved to know that science, at least for the moment, has triumphed over sophistry in our law courts. On the other hand, I find it ridiculous and dismaying that, nearly one hundred and fifty years after Darwin first made his theory known, this sort of question even made it to court in the first place.

Scopes was eighty years ago and teaching creationism in schools has been unconstitutional for decades. Eighty years from now will we still be mired in a dispute between evolutionary theory and the latest incarnation of re-packaged creationism? I sure hope not.

Monday, December 19, 2005

Making Sense of Science

In the broadening wake of the South Korean cloning controversy, The New York Times has asked a pertinent question: just how much scientific fraud is out there? Unfortunately, more than we would like to think. While it's impossible to know exactly how much fictitious research is passed off as legitimate - some frauds must inevitably remain undetected - the Times article makes it clear that in a field as international and variegated as scientific research, cracking down is not easy.
Part of the problem is that there is simply too much scientific research to evaluate:
Contributing to the problem is a drastic rise in the number of scientific journals published around the world: more than 54,000, according to Ulrich's Periodicals Directory. This glut can confuse researchers, overwhelm quality-control systems, encourage fraud and distort the public perception of findings.

These overwhelming numbers are hardly unique to scientific journals. The last several decades have brought several milestone innovations - particularly the personal computer and the world wide web - that have facilitated the dissemination of information in ways that were previously unimaginable. Not surprisingly, this has resulted in the contemporaneous rise and spread of misinformation as well.

So, if an increase of misinformation - fraudulent scientific research, in this particular case - is a predictable byproduct of an increase in information generally, is the problem of fraud in science really worsening, or is it growing at roughly the same rate as honest, rigorous scientific research? An interesting question, but hardly a solution. Science, like any other discipline, has a reputation to protect, and an increase in the perpetration of fraud, even if subject to ready explanation, is a trend that must be addressed.

How will the scientific community respond to this rise in research, of both the honest and the fraudulent variety? One promising avenue may be open source scientific journals. The Directory of Open Access Journals currently boasts 1980 journals, 488 of which are searchable at the article level. While those numbers are certain to rise, the larger problem of managing massive quantities of new information is not going anywhere.

We are living in an era of unprecedented information and data generation. In searching for the proverbial needle - whether a key piece of useful data, or a well-hidden fabrication - how will we respond now that our haystack has become a hayfield?

Is It Happening Here?

Ever read Sinclair Lewis's novel "It Can't Happen Here"? Doesn't matter. You already know the story. Joe Keohane, writing in yesterday's Boston Globe, has an excellent review of Lewis's eerie work in which "A folksy, self-consciously plainspoken Southern politician rises to power during a period of profound unrest in America."
Written in 1935 and recently reissued, Lewis's novel, if it is as Keohane describes it (and I'll know soon enough; I ordered it today), should send chills down the spine of anyone who reads it. In Lewis's fictional America democracy has fallen to despotism - and nobody seems to care. But as Keohane writes, the smooth-talking Southerner with his legions of cronies and disastrous plans isn't the only one at fault. "The blame also falls on the 'it can't happen here' crowd, those yet to realize that being American doesn't change your human nature; whatever it is that attracts people to tyranny is in Americans like it's in anyone else."

Do we marvel at the prescience of Sinclair Lewis seventy years ago, writing words that seem destined for today? Or do we shudder at the realization that we stand today, just as we stood seven decades ago, nearer to the edge between democracy and despotism than we care to admit? I wish I could say 'neither,' but I just don't believe that's the truth.

Choking on their own Smoke

The WHO announced a radical new strategy in their campaign against smoking: they will no longer hire smokers. This isn't banning smoking in the workplace, or promoting a healthy lifestyle for employees, this is bald discrimination. Like it or not, smoking is a legal activity in this country.

Leonard Glantz has responded in the Washington Post with a withering critique of the WHO's new hiring practice. Glantz's article is well worth reading and I won't attempt a summary here. I would like to add, however, that it's not entirely clear to me, at least off the top of my head, whether the WHO's actions are even legal.
The legality, I suspect, depends heavily on the legal status of the WHO which by virtue of its affiliation with the United Nations isn't exactly a private employer but, on the other hand, isn't exactly a governmental organization either. And, while smoking status clearly is not a suspect classification (as race is, and as gender and sexual orientation arguably should be), shouldn't the WHO have to satisfy a "rational basis" requirement nevertheless?

Perhaps that standard is met by the self-stated principle that the "WHO tries to encourage people to try and lead a healthy life." But if that is the case then where, as Glantz rightly asks, do we draw the line? How many behaviors, in addition to smoking, are arguably incompatible with leading a "healthy life"?

Legality aside, the WHO's anti-smoking statement, if there is one, is being obliterated in this instance by a much more prominent display of narrow-minded intolerance. Imagine if I were a smoker myself - then I would really be up in arms.

Sunday, December 18, 2005

A Joke and Two Plugs

The Only Existing Living Will Joke:

A man and his wife are sitting in the living room and he says to her:
"Just so you know, I never want to live in a vegetative state dependent on some machine. If that ever happens, just pull the plug."
His wife gets up and unplugs the TV.

That bit of bad humor comes courtesy of the American Journal of Bioethics blog which, if you're interested at all in bioethics, is quite an interesting regular read.

More Clones

New Jersey, obviously trying to create more good news for the Newark Weekly to report on, announced $5 million in stem cell research grants yesterday. The milestone grants, three of which involve research using human embryonic stem cells, make New Jersey the first state to use public money for such research.

The news is particularly welcome as human cloning researchers and the journal "Science" continue to scramble to come to grips with the increasingly complicated and confused saga of South Korean stem cell researcher Hwang Woo-suk. Hwang has been accused of fabricating data and several other violations of scientific research ethics.

Researching Race

Last week I wrote about the "end of racism", with a big giant question mark on the end of that post title. Here's an interesting new development in that same vein: the discovery of a gene which figures prominently in the determination of skin pigmentation.

While scientists tout the discovery as a further step in understanding the genetic evolution of human traits and, potentially, the causes of skin cancer, there's another dimension to this story, even if nobody is talking about it.
What are we to make of the possibility, entirely remote at this point in time, of cosmetic alteration of skin pigmentation? For starters, tinkering with such genes might at some point be vastly less expensive than taking a weekly trip to the tanning salon. Of much greater interest is the possibility, some day, of embryos genetically modified for skin color.

Such a technology would no doubt be monumentally controversial. But why, exactly? Is race really about skin color, or is it about something both deeper (a suite of genetic characteristics extending far beyond skin pigmentation) and broader (encompassing cultural and social histories and identities)? Is there an independent importance to skin color itself, even detached from questions of race?

While cosmetic pigmentation alteration is a technology that, if it ever arrives, will be years in the making, its mere possibility is an invitation to look at race and, by extension, racism from a novel perspective. If we could alter the skin color of our children, or even of ourselves, how would we react, and what does that reaction tell us about our views on race?

The Holy Grail: Now Required

And we thought our government was ridiculous? Officials in the Brazilian town of Biritiba Mirim have announced a new bill that, if passed, would prohibit residents from dying.

The bill is a novel form of protest against governmental bureaucracy - the city's request for more cemetery space has been held up by red tape - that not everyone is finding amusing. "'I haven't got a job, nor am I healthy. And now they say I can't die. That's ridiculous,' Amarildo do Prado, an unemployed resident, told local media." No kidding.

Pleistocene Park East

If the U.S. version of Pleistocene Park doesn't get off the ground as hoped, you may need to travel a bit further to get your fix of Pleistocene era megafauna. The BBC is reporting on yet another plan to recreate a Pleistocene ecosystem: in Siberia.

With the support of the government of the Republic of Yakutia (That's a federal subject of Russia. I looked it up) the plan's champion, Dr. Sergey Zimov, is pursuing ecosystem experimentation and research, not tourism. Still, with news of the partial sequencing of the Woolly Mammoth genome, one has to wonder...

Dover Debut

Meet Judge John E. Jones III, Pennsylvania federal court judge and the presiding judge in Kitzmiller v. Dover Area School District, the Dover "intelligent design" case.

Nobody expects the upcoming decision, to be released sometime next week, to be the end of the intelligent design controversy. (And, anyway, there's a new theory already waiting in the wings: "sudden emergence.") Nevertheless, many, including myself, are looking forward to hearing much more from Judge Jones, and to what is certain to be a landmark judicial opinion.

Throwing Down the Gauntlet

New York Times columnist Nicholas Kristof took a week off from writing about Darfur to write a column about a proposed Pleistocene Park. This week, Kristof is back to what he knows (and cares) much more about. And this time he's picked a fight with Fox News anchor Bill O'Reilly to boot.

Huh? What do O'Reilly, Kristof, and Darfur have in common? Actually, not a whole heck of a lot.
Kristof challenges O'Reilly to stop ranting and raving about the atrocity of being wished a Merry X-Mas or, Ishvara forbid, Happy Holidays and to take a trip to Africa to see atrocities in action. Does Kristof believe that there's any chance Bill O'Reilly is going to be rubbing elbows with Sudanese refugees in this lifetime? No. Does Kristof believe that slamming O'Reilly's show, views, and questionable motives is going to change any of those things? Nope. So why waste the ink?

What Kristof seems to have realized is that, despite the real atrocities happening far away in Darfur, the American public (and the media) is more interested in imagined atrocities right here at home. Like the War on Christmas. By goading Bill O'Reilly, Kristof is betting that a well-publicized verbal sparring match, with Darfur as the backdrop, will generate more publicity for the situation there than a month's worth of serious, O'Reilly-free columns on the same topic.

It's an intriguing strategy. It's just a shame that Kristof has to pick a fight in order to get anyone to listen.

Saturday, December 17, 2005

Exclusive: More Payola!

Update: After nearly twenty-four hours of exhaustive investigative reporting, Farr Ago News has learned that, shockingly, there are still more examples of payola than previously reported in this space. It's not exactly new news, but it's so brazen that it's worth re-reporting on. And yes, by exclusive I meant "exclusive to this blog." Everybody else in the press is lying, why can't I? Let's just move on.

In October, in the face of mounting financial concerns, the Newark Weekly News found a new, and profitable, partner: The Newark City Council. For more on the $100,000 deal between the city and the paper, which will obligate the Weekly News to publish articles focusing on all that is good and wholesome and un-Newark-like about Newark, you can click here. And, if all the good news out of Newark is causing you to rethink your plans for your (please don't yell at me Mr. O'Reilly) holiday vacation, might I suggest the Newark Tourism Bureau?

Friday, December 16, 2005

This Space for Rent

Lest anyone think that the shrinking freedom of the press was the product solely of shadowy dealings between major newspapers and Washington officials, I'm here to set the record straight. BusinessWeek is reporting that one of indicted lobbyist Jack Abramoff's multitude of misdealings includes the bribing of an op-ed columnist at Copley News Service. And, in case there was any doubt, the columnist (Doug Bandow, now an ex-senior fellow at the Cato Institute) wasn't being paid off in the interest of "protecting our national security." He was talking up Abramoff's Indian tribal clients, while Abramoff was busy fleecing them.
Oh, and let's not forget to mention Peter Ferrara. Ferrara, a senior policy adviser at the conservative Institute for Policy Innovation, also wrote articles supportive of Abramoff's clients in exchange for payments. Amazingly, neither Ferrara nor his boss, Tom Giovanetti, has found this at all problematic. "I've done that in the past, and I'll do it in the future," says Ferrara. Wonderful. At least he's being (somewhat) honest.

Thankfully, as the extent of this troubling practice of paying for opinions is uncovered piece by piece, there is one bright spot in Washington. [your name here], an increasingly influential and well-respected political insider, has openly denounced "pay-for-play," as the practice is more commonly known, and his/her employer, [your company here], is actively supporting his/her position. Said [your company here]'s president, "Mr./Ms. [your last name here]'s work, as an independent and principled writer and thinker is in keeping with our organizational mission. We have no intention of encouraging or tolerating any outside influence whatsoever, financial or otherwise. Our reputation has been built on our integrity, and we intend to keep it that way."

Fine words from the fine folks at [your company here].

A Better Battle for Bush

The United States isn't the only country with a controversial president roundly criticized in today's press. While George W. Bush was busy saying "no comment" about the NSA's secret domestic spying operation, one of his fellow heads of state was accused by the EU of "provocative political moves."

The president in question? Mahmoud Ahmadinejad of Iran. The cause for outcry? Describing the Holocaust as a "myth" and suggesting that Israel be "wiped off the map." Here's a suggestion: rather than an invasion of Iran in early 2007, can we just have George and Mahmoud battle in single combat to the death? The TV ratings would be astronomical, and it'd be a win-win situation for the entire world.

Has the Stem Cell Bubble Burst?

Questions continue to percolate about the science and the ethics of the South Korean stem cell research conducted by Hwang Woo-suk. Earlier this week I wrote that these charges, if proven, would be a blow for the field of stem cell research, especially the cloning of human embryos.

But a blow in what way? Predictably, moral opponents of this research have pounced on the trouble in South Korea, arguing that the entire field is corrupt, overblown, and ripe for abandonment. Not so. One (as yet unproven) setback does not an entire discipline undo.
No, the moral of this story is that if you want something done right, you should do it yourself. Governmental (particularly presidential) feet-dragging on the issue of stem cell research has driven research in the area to the private sector and/or overseas. In the absence of federal funding, as well as agreed upon research and ethical guidelines, stem cells are sought in laboratories that are largely obscured from our view. The surest way to reverse this trend is to create a favorable research climate here in our own country, to help stop the exodus of scientific talent and research dollars overseas, and to ensure that an admittedly delicate field of scientific research is pursued with all reasonable care.

Freedom of the Press?

This paragraph comes intact from a headline article in the New York Times discussing the secret monitoring of domestic phone calls by the NSA:
The White House asked The New York Times not to publish this article, arguing that it could jeopardize continuing investigations and alert would-be terrorists that they might be under scrutiny. After meeting with senior administration officials to hear their concerns, the newspaper delayed publication for a year to conduct additional reporting. Some information that administration officials argued could be useful to terrorists has been omitted.

It just makes you wonder: what stories has the White House asked newspapers, including those with which it is much more ideologically comfortable than the Times, not to publish? The Times, as I would hope and expect, published anyway. What we will never hear about are the stories that, in accordance with political requests or demands, never made it into the paper at all.

I know I sound like a broken record, but the lack of governmental honesty and accountability is depressing.

Wikipedia: The Free (And Mostly Accurate) Encyclopedia

Wikipedia, the oft-used and recently maligned internet "encyclopedia," has been given a vote of confidence by one of the world's premier scientific journals. A recent study, published in the journal Nature, found that Wikipedia compares favorably to the bundle of processed trees that is the Encyclopedia Britannica. That's good news for Wikipedia's reputation, and good news for all of us that link to it with abandon.

Tuesday, December 13, 2005

Falsification on the other hand...

Earlier today I spoke out against the "wisdom of repugnance" argument, so often used as a sword with which to attack advances in the fields of genetics and biotechnology. While that particular argument carries no weight whatsoever, charges of fabricated research are much more serious. South Korean superstar stem cell researcher Hwang Woo-suk, already besieged by allegations of research ethics violations, is now being accused of falsifying results. The charges, if proven, would be a significant blow for Hwang as well as other human cloning researchers, and one that the field, already beset by a chorus of "yuck factor" critics, can ill afford.

No Wisdom in Repugnance

Researchers at the Salk Institute in San Diego have created mice with small amounts of human cells injected into their brains. The breakthrough, a possible step along the path to develop realistic models of neurological diseases such as Parkinson's disease in lab animals, is newsworthy because of the "yuck factor."
This "yuck factor," or "the wisdom of repugnance," as both the President's Council on Bioethics and it's former chairman Leon Kass have termed it, is one of the most specious arguments still retaining traction in bioethics today. Kass writes that "in crucial cases...repugnance is the emotional expression of deep wisdom, beyond reason's power fully to articulate it."

I disagree. Historically, repugnance, or some other form of disapprobation, has greeted most new and substantial changes - whether scientific or technological innovations or shifting social norms. From in vitro fertilization to gay marriage, the "wisdom of repugnance" is little more than a thinly veiled cloak for personal dislikes and a fear of the unknown.

We should be suspicious of viewpoints that demand that we faithfully adhere to them without question, for they are beyond the purview of rational expression. As Stephen Jay Gould pointed out in "Full House",
"our prejudices often overwhelm our limited information. [They] are so venerable, so reflexive, so much a part of our second nature, that we never stop to recognize their status as social decisions with radical alternatives - and we view them instead as given and obvious truths."
There is nothing inherently repugnant about injecting human brain cells into laboratory mice. Objections (and some genetic research is subject to legitimate objections, although I do not believe that is the case in this instance) ought to be recognized for what they are - social and personal decisions and not a form of deep-seated, ineffable wisdom - and accordingly evaluated on their merits. The mere fact that some of us blench in the face of change, without more, is no ground for drawing a line in the sand.

Sophisticated Anthropocentrism

Recent research on animal cognition suggests that chimps and humans may have different learning mechanisms. A New York Times essay, "Children Learn by Monkey See, Monkey Do. Chimps Don't," details research findings that suggest humans learn by imitation, whereas chimps learn by focusing on goals, and ignoring unnecessary actions.
Beyond the clever but unfortunate title (chimpanzees aren't monkeys. They're apes, like us) this is an interesting discovery, made even more interesting by its conclusion. Uncovering this cognitive difference between chimps and humans suggests an immediate and highly anthropocentric conclusion: that imitation, once thought to be "a simple, primitive action compared with figuring out the intentions of others," is actually a much more sophisticated and complex behavior than initially thought. Why? Because humans do it, and chimps don't - and humans are more complex and sophisticated than chimps.

I know I'm only going to get myself into hot water here. I haven't reviewed either the original paper or the follow-up research, and I doubt if I could understand them even if I did. Nevertheless, I'm skeptical of a conclusion that immediately attaches the label of "sophisticated," simply because it crops up in Homo sapiens and not in our nearest relatives. Recently evolved? Sure. Sophisticated? Perhaps.

And, as a final thought, I find the sophistication conclusion problematic on another level. Whether or not over-imitation represents a form of evolutionary and cognitive sophistication, the labeling of a behavior as "sophisticated" implies, at least in common parlance, an increased refinement and worldliness. So does that suggest that imitation is better than ends-driven reasoning? While it may be more commonplace, if I were forced to choose I'd side with the chimps.

Pleistocene Park

Always wanted to see the deer and the antelope play? Or run from lions? Articles by Nicholas Kristof and Alan Burdick, both in the NY Times, comment favorably on a proposal, announced this past August in the journal Nature, for "Pleistocene re-wilding."
The idea, in brief, is to re-populate parts of the United States with the African relatives of Pleistocene era species that once inhabited North America. Is it a good idea? I have no idea, although I'm suspicious. These species haven't existed on this continent for some 13,000 years. While that's a blink of the eye in terms of species evolution, my guess is that when it comes to ecosystems, 13,000 years is a much more substantial number. I'd expect that the introduction of African megafauna into existing North American ecosystems would be a considerable challenge, to say the least.

However, the idea does have a certain Hollywood appeal to it - of both the PG and the R variety. On the one hand, there's the prospect of kids (and parents) clamoring to take a vacation to New Mexico, which would be a nice change of pace for the tourism industry there. On the other hand, there's the possibility of a herd of elephants stampeding through downtown Santa Fe, which would be...bad.

I suspect that, in the end, this will be a political decision. In an era where we can, if we decide it's politically expedient, convince ourselves that global warming doesn't even exist, there seems little that scientific skepticism can do to stop Pleistocene re-wilding if the proposal has the votes.

Finally, just to prove my complete ignorance on this topic, Kristof's column hypothesizes a "Pleistocene reserve on, say, private land in North Dakota." Having never been to North Dakota (or Africa) I can't say for certain, but wouldn't this be a bit of an adjustment for many of these animals? Are lions accustomed to mean temperatures of eight degrees Fahrenheit (Bismarck in January)? Just curious...

Denying the Obvious

"Racial Violence Continues in Australia." That's the headline. The response from Australian Prime Minister John Howard? "I do not accept that there is underlying racism in this country."

I know that earlier today I asked whether racism will ever go away. Mr. Howard apparently thinks it's already come and gone, at least in Australia. All eyewitness accounts and brutal photography to the contrary, Mr. Howard is adamant in denying that racism is a problem for Australia.
Here's a personal plea to Mr. Howard who, of course, stands no chance of actually reading this: learn from the example of our dilapidated nation. Five years of denying obvious truths (climate change, a failing war, an inability to engage in critical self-evaluation) have certainly left us worse for the wear. There's no need for you to follow suit.

Racism is a problem. It is an unfortunate fact of life in Australia, just as it likely is in every other country on this planet. Address the problem now and help convert it into an isolated incident. Don't deny its existence, slap a bandage on it, and allow the infection to fester.

Monday, December 12, 2005

The Hedgehog, the Fox, and the Blog

Isaiah Berlin, in an essay on Tolstoy, categorized human beings as either hedgehogs or foxes. There's a good summary on Kheper.net but, basically, the distinction is between those who pursue a single, grand vision (the "hedgehog") as opposed to many divergent ends (the "fox").
One assertion, made in the Kheper.net summary, is that it is hedgehogs who have made the biggest impact on the world. Maybe so, when it comes to individuals, but what about when it comes to blogs? Are the "best" blogs those that have a clear and consistently expounded theme or idea, or are they the ones that jump from topic to topic, focusing on whatever is most interesting?

I don't yet know how Farr Ago News will ultimately unfold, but I'd be happy to hear your thoughts about foxes and hedgehogs.

Update: If you're curious, there's a hedgehog-fox test on the admissions website for Hobart and William Smith Colleges (New York).

The End of Racism?

This post asks a question that has been lingering in my mind for the past week, prompted first by my re-reading of “To Kill a Mockingbird” and, more recently, the race riots last month in France and yesterday in Australia. The question is this: “Will racism ever go away?”
In the past I’ve seen fit to suggest that given enough time, truth always wins out. What do I mean by “truth”? Certainly nothing like the triumph of good over evil, or morality over whatever suite of behaviors you might find immoral. In many respects I was referring to the vindication of scientific “truths” - the heliocentric model, evolution, and the like.

At the moment I’m not prepared to personally take on the enormous literature discussing truth generally, and the “truth” of racism specifically. But I do have a question: Is the notion of racism – that individuals can be identified as somehow inferior on the basis of their external appearance – one that is grounded in biology? Or is it a socially created myth, designed to perpetuate and reinforce inequalities and narrow-mindedness, that will, eventually, wither away and die as an idea?

To Kill a Mockingbird was written in 1960, which is almost half a century ago, and is set in the mid-1930s. At least from what we see in the papers, and from what I’ve witnessed in my own life, I'm not convinced that we've come terribly far in this country since then. While things have changed at the institutional level - e.g. we don't allow schools to require or permit segregation by law - I'm not sure all that much has changed at the level of individual relations. If anything, I suppose that I’m wary that unchanged attitudes have simply been forced to take refuge and to fester behind closed doors and repressed thoughts.

So, in summary, my questions are these:
- Is racism an idea that will go away?
- If so, is it now in the process of receding and withering?
- Is there biological truth in the idea of racism that, although indubitably harmful, renders us unable to purge it from our society specifically, and from the species Homo sapiens more generally?
- And finally, and relatedly, are primates (or other animals) ever “racist”? Is that even an intelligible question to ask?

While I’m certain there’s a wide body of literature already available on this topic, I’m curious to learn what people think.

The Difference between Wisdom and Ignorance

A short piece by Anna Bernasek in the New York Times yesterday asks "What's the Return on Education?" The piece investigates the economic return on an investment in education, both for the individual and for society as a whole, and comes to the stunning conclusion that we don't really know.

Now let's be clear - concluding "I don't know," when you don't actually know, is worlds better than concluding "I don't know, so I better just make it up as I go." After all, as Socrates pointed out, there is wisdom in the recognition of one's ignorance. Or was that George Bush? I always get them confused.
Actually, Ms. Bernasek refers to Socrates in explaining the merit of an article that asks an important question - What sort of results is our investment in education producing? - and fails to come up with much of an answer. And she was correct to do so - she just needs a little work on the details.

Bernasek writes that "Socrates once said that the more he learned, the more he became convinced of his own ignorance." That's a nice but rather inaccurate summary of Socrates' description of his own wisdom, set forth in the Apology, Plato's account of the trial and sentencing of Socrates.

Without being overly dogmatic, I think it's important to point out that Socrates disclaimed any knowledge from the beginning. He was, he admits, baffled by the Delphic oracle's assertion that no man in Athens was wiser than Socrates. What happened next was not Socrates embarking on a quest for knowledge and coming to the realization, as he learned more and more, that he was indeed supremely ignorant. In fact, it was quite the opposite. Socrates set out to disprove the oracle by engaging in conversation those reputed to possess knowledge. In this process what Socrates discovered was that neither he nor the prominent men of Athens - the politicians, the poets, and the craftsmen - possessed any real knowledge. The only difference was that Socrates realized he was ignorant, whereas the rest believed themselves to be possessed of knowledge.
So I withdrew and thought to myself: "I am wiser than this man; it is likely that neither of us knows anything worthwhile, but he thinks he knows something when he does not, whereas when I do not know, neither do I think I know; so I am likely to be wiser than he to this small extent, that I do not think I know what I do not know." (21d)

So Bernasek, unfortunately, has Socrates backwards. In reality, the more Socrates learned (if we confine "learned" to the very narrow description of uncovering the beliefs held by others) the more he became convinced of his own, albeit limited, wisdom - not of his own ignorance.

So what? Bernasek got the basic point right, more or less, so why do we care if she's playing a little fast and loose with the details of perhaps the most important speech in Western legal history? Maybe we don't. But if Bernasek is to be believed when she says, "Taking our cue from Socrates, the first step may be to recognize what we don't know," that recognition has to start with what Socrates himself actually said.

Sunday, December 11, 2005

Three Drinks and You're Out

"Three drinks and you're out?" That's the plan proposed by John Smith, president of the Royal College of Surgeons of Edinburgh, and one of Britain's leading medical authorities. In the words of Jon Stewart, "Whhaaaaa?"
Following the recent ban on smoking in public buildings, Smith is agitating for a three-drink limit per visit to a pub or bar. Let's forget for a moment that we're talking about a nation that, until last month, still forced its pubs to close earlier than anyplace else in the civilized world. And let's focus on several of the gaping holes in this proposal:

Pragmatics: Three drinks per pub is highly unlikely to mean three drinks per evening. Rather, it sounds like an invitation to pub-hop which, while it might be a big hit with the kids, and get the regulars out of their local and moving about town, doesn't seem terribly likely to reduce total consumption. It does, however, seem like it might be encouraging a fair number of people to go hop into their cars and drive to another bar after three pints. Wonderful.

Paternalism: Smith considers the restrictions on alcohol a logical extension of recent curbs on public smoking. Unfortunately, there are substantial and significant differences. The public smoking bans - a growing trend in much of the world, and on both sides of the pond - are aimed not at individual smokers but at the experience of individuals, typically non-smokers, in public places. While it may be a fortunate side effect if public smoking bans cut down on the actual amount of smoking that takes place, their real import, and the reason they have garnered such popular support, is that they allow non-smokers to go for a drink (or five) at a pub and not have to dry-clean their clothes afterward to get rid of the smell.

The three drink limit proposal shares very little in common with the smoking bans in this respect. While the health of smokers is a secondary goal, the health of the drinkers is cited as the primary purpose of the proposed legislation. And that is where the word "paternalism" comes in.

The smoke from the gentleman at the next table gets in my hair, ruins my appetite, and gives me, eventually, lung cancer. That same gentleman's fourth vodka tonic doesn't really impact me in the slightest. So why, I might reasonably ask, should I take that fourth drink away from him?

Sure it's probably eroding his liver and, perhaps, costing me some money (in the form of higher national medical costs) but it's not going to kill him today. He might get hit by a bus first. Plus, he might be a big-time sky-diver. And that's dangerous, and he might get injured, but I'm not going to tell him to give that up, am I? Unless that fourth drink is going to make him get up, walk over to my booth, and vomit in my lap, as far as I'm concerned he can have it.

Frankly, if we're going to tell people how to live their lives simply on the basis of what is good for them - as opposed to, with the smoking bans, what is good for us, which we're perfectly happy to do - we could aim a lot higher than that fourth drink.

On the other hand, I would find it amusing if the British, shortly after they finally managed to get their pubs to stay open past eleven, enacted legislation that made sure every patron had been cut off by eight.

(Still) Silencing the Naysayers

Apparently nothing has changed in the last twelve hours. Big surprise. This article from yesterday's New York Magazine details how the Bush administration tried to strong-arm the U.N. into cancelling Bill Clinton's speech at the recently concluded Climate Change Conference in Montreal.

On the one hand, it's further (and disheartening) confirmation of what I wrote about earlier today. On the other hand, it's funny that the administration's threat - to pull out of talks on a new Kyoto deal - was taken seriously, given that they have absolutely no intention of signing onto the treaty, regardless of its terms.

Saturday, December 10, 2005

Reframing the Abortion Debate?

Two recent articles, in the New York and the Los Angeles Times respectively, examine the abortion debate in an interesting (although I'd suspect not an entirely novel) light.
The first, by Dalton Conley in the New York Times, wonders whether a man should have a say when a woman decides she wants an abortion. Conley argues in the affirmative, suggesting that a man should have the right to obtain an injunction preventing the woman from aborting the pregnancy, even if the two aren't married.

Meghan Daum, in the LA Times, takes the argument a step farther and wonders whether men don't deserve even more extensive rights. If, she argues, our goal is to promote equality of choice, then shouldn't both adults have the right to influence whether or not the pregnancy will proceed? While Daum doesn't suggest that men should be granted the right to force an abortion, she does speculate that, just perhaps, men should be able to legally sever their financial and legal ties to an unwanted and unborn child, at least up to a certain point in the pregnancy.

What's interesting is that both of these suggestions, in different ways, cut across traditional pro-life / pro-choice battle lines and reframe the abortion rights question as one of equality between the man and the woman. Both Conley and Daum are quick to mention the considerable difficulties with their proposals - and equally quick to point out that a cataloguing of those difficulties shouldn't be the beginning, middle, and end of this discussion.

As for me, I'm not altogether sure what to make of this. I self-identify as pro-choice because, in accordance with Mill's harm principle, I could never conceive of telling another person, male or female, what they could or could not do with their own body.

And viewed from that angle, Daum's proposal actually seems less radical than Conley's. No matter how committed the father is to raising the child, he can't birth it. And, though I can't speak from experience, I suspect that the type of pain inflicted on the mother during childbirth would, in a criminal context, be on the order of cruel and unusual. Conley's position seems fundamentally incompatible with a pro-choice attitude toward abortion, at least when that attitude is derived from a belief in a woman's right to autonomous control of her body.

However, while I don't think I could support a measure that would allow a father to force a woman to bring a child to term, I'm not so certain that Daum's proposal is without merit.

First, and importantly, it would not require women to have an abortion in order to let the man off the hook. While it would certainly alter the landscape of abortion decisions - such a rule would be a substantial consideration when a woman wanted to bring the child to term but felt unable to do so without financial support from the male involved - this limitation of freedom would be one of degree, not of kind, and would be balanced by an increased freedom for the male involved.

Off the top of my head, one particularly problematic element of Daum's proposal (which she does not address) is that it shifts the responsibility for contraception almost entirely to the woman, at least with respect to birth control. If a man can opt out (legally and financially) of an unwanted pregnancy without any repercussions, he has a much weaker incentive to avoid creating such a situation in the first place. The woman, on the other hand, still must carefully consider contraception or risk a difficult (as well as costly, and potentially dangerous) decision about abortion. Of course, this entire concern should be mooted by concern for STDs, but I suspect that would not be the case.

And there are, no doubt, plenty of other serious holes in Daum's proposal. But it's an interesting idea and, as long as we're going to continue to reargue the abortion debate anyway, it might as well receive a full hearing.

Silencing the Naysayers

The Washington Post has written about a troubling new shift in policy at the Department of Justice. According to columnist Dan Eggen, the DOJ "has barred staff attorneys from offering recommendations in major Voting Rights Act cases, marking a significant change in the procedures meant to insulate such decisions from politics."
While frightening in its own right, the DOJ's decision to suppress the analysis of its non-political appointees is truly scary precisely because it is not an isolated event. The current administration has made a habit out of ignoring information that doesn't mesh with its agenda, and of silencing voices that critique its strategies. While no one is suggesting (at least not yet) that this most recent development at the Justice Department is the direct result of White House pressure, it's quite conceivable that this never would have taken place in a different political climate.

Our government has cultivated an atmosphere of political conformity - by promoting dogmatism among its supporters and by aggressively attacking its detractors - which is anathema to a properly functioning democracy. Discourse, dissent, and the free exchange of ideas have been supplanted by the patriotic requirement of producing a unified national message, whether right or wrong.

Why? Why are we as a nation more concerned with appearing right than with being right? As John Stuart Mill wrote in "On Liberty," the "unity of opinion, unless resulting from the fullest and freest comparison of opposite opinions, is not desirable." Our government, in its attempt to bring liberty to the rest of the world, is stifling it in its own backyard.

Thursday, December 08, 2005

Ann Coulter Races CNN to the bottom

CNN reports conservative columnist and pundit Ann Coulter showed off her Ivy League smarts by talking down to a crowd of 2,600 at the University of Connecticut. After growing tired of boos and jeers, Coulter cut off her speech and moved on to the Q&A portion of the event. Said Coulter, "I love to engage in repartee with people who are stupider than I am."
It's not surprising to see Coulter, whose game is to get people riled up by making inflammatory statements, lay into a group of college students and grab a piece of the headlines. Love her or despise her, this isn't likely to change your opinion one bit. What is surprising is CNN's poll which asks, of all things: "Who is stupider, Ann Coulter or jeering UConn students?"

An interesting tactic from CNN, but that's hardly a close call (Coulter currently leads by a 2:1 margin). A much better poll question would be: "Who is stupider, Ann Coulter or CNN?"

"Secular Central" Apologizes for Hating on Christmas

Fox News's Bill O'Reilly recently announced that unless Jon Stewart and all the other secular Christmas-bashers out there stopped being so darn mean, and quit carrying on with their "War on Christmas", he was going to have no choice but to respond by writing to their mothers. Obviously unable to resist the compelling logic and fair-mindedness of O'Reilly's request, the good folks at Secular Central issued this apology.

Mexico City Sinking. Washington next?

According to reports, Mexico City is sinking. Built on a dried, unstable lake bed, Mexico City has sunk almost three stories in the past century. You know, Washington, D.C. sits on a swamp. Any chance it's going to start sinking soon, and take some of its less respectable inhabitants down with it?

Mooting the Disability Debate

Two recent articles discuss breakthroughs in Down syndrome research. The first, published several weeks ago in the New York Times, discusses a recently announced screening technique designed to detect fetuses with Down. The second, appearing Tuesday on the BBC’s website, describes a recent study that identifies a molecule associated with Down that may be a potential target for developing treatments for the syndrome.

Both articles hasten to point out that these new developments do not constitute anything like a cure for Down Syndrome. The piece in the Times goes much farther, discussing the ethical issues surrounding the provision of prenatal genetic testing for such conditions.
One issue that the article touches on, but doesn’t address in detail, is how we determine what traits or characteristics will be classified as “disabilities.” With respect to many conditions, of which Down is just one example, the application of the disability label is highly contested. One of the key concerns, then, is whether the development of targeted genetic screening will render these important debates moot.

How or why would this happen? The reasons are two-fold – one more obvious than the other. The obvious reason is that the very act of searching for and developing treatments, cures, and screens indicates that the condition is likely considered a disability. After all, there aren’t too many research labs actively pursuing a cure for musical ability.

The less obvious reason is that what some people consider a disability others would prefer to consider a normal human trait. Adjudication of this issue requires us to determine which characteristics are part of the normal range of diversity within the human population, and which constitute disabilities. Whether we choose to consider an individual with Down (or one who is deaf, or left-handed, or has poor eyesight) as disabled is a matter of definition and opinion, about which people can and do disagree (although not always reasonably).

What is not open for dispute is that, in the absence of societal support for a particular condition, typically in the form of resources and tolerance, any condition can become functionally disabling. To illustrate, let’s consider the trivial example, mentioned above, of impaired vision. Few would deny that near- or far-sightedness is a form of physical impairment. However, due to the presence of corrective lenses, ophthalmologists, etc., this condition is hardly disabling in any real sense for most members of our society.

Consider, however, what would happen if a genetic test were developed to screen out embryos likely to develop impaired eyesight later in life. Now, all of a sudden, a minor visual impairment has the potential to become a serious disability. As genetic screening reduces the number of individuals in the population with visual impairment, demand for vision correction technologies and services decreases, and may ultimately wither away due to numeric and economic considerations. This loss of support (discussed at some length in the NY Times piece) can help convert an inconsequential impairment into a socially constructed disability.

This particular example is, of course, both trivial and far-fetched. But it illustrates one of the underlying objections to genetic screening for conditions that are not unambiguously disabilities. A genetic screen, though ostensibly neutral, can potentially tip the balance and moot the debate about whether or not to classify a condition, Down syndrome for example, as a disability.

None of this is intended to suggest that genetic screens should not be developed or that, once developed and demonstrated to be accurate, they should not be employed. However, when using such techniques it is important that we are mindful that they possess the potential to produce results far beyond a simple positive or negative probabilistic test. Making this point clear, to those who offer such tests as well as to those who make use of them, is fundamental to obtaining informed consent.

Satire or just plain scary?

Burt Prelutsky, humor columnist and movie critic for LA Magazine, has a new column on the Jews who stole Christmas (posted on the blog Townhall.com).

I admit that, until this morning, I was unfamiliar with Prelutsky's work and, since the description of "humor columnist" doesn't appear until you reach the column's footer, I read merrily along, taking Prelutsky at face value. Without question, the mark of good satire is something that, while ridiculous, can also be believed to be true. And in this Prelutsky has succeeded.
But, in this case, is this really such a good thing? Should we be encouraged that something written to lampoon cries of 'anti-Christian sentiment' may be (mis)used to fuel them, or just depressed? Prelutsky's column brought to mind a similarly confusing experience I had earlier this week:

…perhaps two or three illustrations will sufficiently indicate this disturbing trend which if it remains unchecked will make those who are opposed to traditional religion into second-class citizens. A few months ago, for instance, a sub-committee of the House of Representatives included in a Concurrent Resolution the amazing proposition that ‘loyalty to God’ is an essential qualification for the best government service. ‘Service of any person in any capacity in or under the government,’ the legislators officially asserted, ‘should be characterised by devotion to God.’…Another resolution making ‘In God We Trust’ the national motto of the United States has been passed by both Houses and is now the law of the land.

That was written by Paul Edwards in 1956 as part of the Editor's Introduction to "Why I am not a Christian", a collection of essays (including the one from which the book takes its title) by Bertrand Russell. It wasn't until I reached the final page of the introduction, and saw the date, that I had any inkling whatsoever that it wasn't penned in 2003 (the same year as the preface by Simon Blackburn which immediately precedes it in the Routledge Classics Edition).

To my mind, there's something scary about not being able to distinguish religious satire from reality, or the religious activism of the 1950s from that of today. Perhaps I'm just easily confused. Then again, the official motto of the United States, half a century later, remains "In God We Trust."

(A New) Brave New World?

Gary Rosen recently published an interesting review (originally published in the NY Times Magazine) of Kazuo Ishiguro's 2005 novel "Never Let Me Go." I haven't yet read the book (it's on order and should be arriving this week) but I'm eagerly looking forward to it.

Rosen's review pays homage, of course, to the seminal (and sometimes what seems to be the only) genetic dystopia, Aldous Huxley's "Brave New World." It's amazing the extent to which Huxley's vision, first published almost three quarters of a century ago, has almost completely determined how we imagine a future of reproductive human cloning.
Huxley writes of “Standard men and women…Millions of identical twins. The principle of mass production at last applied to biology.” This characterization has long dominated our imagination and discussion of reproductive human cloning. But is there any real reason to suspect that a clone, even if it possessed an identical genetic profile (and there are a number of reasons why even this might not be true; for one such reason, take a look at this article on microchimerism, originally published in New Scientist), would be truly "identical" to the individual from whom it was cloned?

As Rosen writes, "Though Kathy is a genetic duplicate, she is nobody's double or distorted reflection. She is her own person, indeed a young woman of growing self-awareness and independence."

I have high hopes for Ishiguro's novel. If he can shake our foundational assumption that clones are identical in every way, and not just with respect to the genes they possess, it will help dispel the long-standing myth that it is those genes, and those genes alone, that determine who we are.

While I don't expect that we'll see reproductive cloning in action any time soon, this is an important first step down that path.

Bringin' the Horror

That's what Bill O'Reilly is promising to do (or listen to the rant) if the world doesn't shape up and stop undermining X-Mas. Excuse me. Christmas.

O'Reilly, part of the oft-oppressed Christian minority of this country, is threatening to "use all the power that I have on radio and television to bring horror into the world of people" who "diminish and denigrate" Christmas. That's an actual quote.
I don't mean to continually play the WWJD card but who in their right mind can possibly believe that the appropriate Christian response to "oppressive, totalitarian, anti-Christian forces", whether real or imagined, is to "bring horror" into those people's lives?

Update: The Brad Blog also recently reported that O'Reilly used a 2004 clip from The Daily Show as evidence of secular society's 2005 "War on Christmas." You can read all about O'Reilly's latest blatant fabrication at The Brad Blog.

I'm left to wonder whether O'Reilly is going the way of the 2004 version of Maureen Dowd. She became progressively more hysterical leading up to the November election and then suddenly disappeared and emerged nearly a year later with a new book. The thought of O'Reilly disappearing for a year is a heartening one. Not so much the thought of him authoring another book...

Big Game Hunting in Africa

I was browsing Google images for a picture of a sable (the NY Times article about the looming extinction crisis in Mongolia mentions that marmot fur is often used in sable coats. Since I didn't even know that 'sable' was a species of animal, I wanted to see what they look like. They're kinda ugly) and I stumbled across this African hunting safari website.

Apart from the fact that this is a bunch of rednecks (along with their wives and children - wonderful) smiling smugly after shooting animals with arrows, is this even legal? I'm curious about the hunting of elephants and cheetahs, both of which are listed as vulnerable according to the 2004 IUCN Red List of Threatened Species. Is this sort of activity not covered by CITES (the Convention on International Trade in Endangered Species)?
Update: A conservation biologist wrote the following to me by email:
Hunting safaris like this one are becoming more and more common. In fact, trophy hunting is actually being considered by many biologists as a legitimate conservation strategy. Once studies are conducted to determine population viability statistics, ecologists can provide local hunting agencies with sustainable harvest quotas. It's quite the fundraising technique, and adds tremendous value to otherwise worthless "vermin" (at least in the eyes of Africans). If you can poison a lion so it doesn't kill your cows or have some rich American redneck pay $30,000 to kill it for you, which would you choose? As long as the limits are properly enforced (always a problem in Africa), this could actually work.

Anyway, that's my best explanation of the carnage you have witnessed.

Interesting explanation. I never would've suspected that the sanctioned killing of vulnerable species was a viable conservation strategy.
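
Out of curiosity, here's a rough back-of-the-envelope sketch of what a "sustainable harvest quota" might look like under a simple logistic growth model. Every number and name below is an assumption made up for illustration; real quotas come out of detailed population viability analyses, not a two-line formula.

```python
# Hypothetical quota calculation from a logistic growth model.
# All figures are assumptions for illustration only.

r = 0.08   # assumed intrinsic growth rate of the population (per year)
K = 2500   # assumed carrying capacity (animals the habitat can support)
N = 1800   # assumed current population size

# Logistic model: annual growth = r * N * (1 - N / K)
annual_growth = r * N * (1 - N / K)

# Harvest only a fraction of that growth, leaving a buffer for bad years
# and imperfect population counts.
safety_fraction = 0.5
quota = safety_fraction * annual_growth

print(f"estimated annual growth: {annual_growth:.0f} animals")
print(f"conservative trophy-hunting quota: {quota:.0f} animals per year")
```

The point, as I understand it, is that the quota is pegged to the population's surplus growth rather than to demand from hunters - which is exactly why enforcement of the limits matters so much.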

Wednesday, December 07, 2005

The Unidentified Needle

Interesting article posted on /. describing a new technique developed at Case Western Reserve University for locating the needle in a haystack of data.

It appears that this technique relies on prior knowledge of the needle (in this case a particular signal of interest in the data) for which you are searching. While exciting in its own right, this breakthrough generates a natural and even more challenging problem: How do you sift through a haystack of data when you have no idea if what you're looking for is a needle or for something else entirely?
When it comes to searching through piles of information, search engines and other digital resources often prove very effective...provided that you know what you're looking for. The challenge now is to learn how to uncover the right piece of information when the user doesn't know exactly what it is they are looking for.
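
To make the contrast concrete, here's a minimal sketch of the easy half of the problem - finding a signal when you already know exactly what it looks like. This is just an illustrative cross-correlation in Python; it has nothing to do with the actual Case Western technique, and all of the numbers are invented:

```python
import numpy as np

# Hide a known "needle" (a short sine-wave template) inside a noisy "haystack,"
# then slide the template along the data and score how well each position matches.

rng = np.random.default_rng(0)

template = np.sin(np.linspace(0, 4 * np.pi, 50))   # the signal we know we want
haystack = rng.normal(0.0, 1.0, 5000)              # background noise
true_offset = 3217
haystack[true_offset:true_offset + template.size] += template

scores = np.correlate(haystack, template, mode="valid")  # cross-correlation
best_offset = int(np.argmax(scores))

print(f"needle hidden at {true_offset}, strongest match found at {best_offset}")
```

Knowing the template is what makes this trivial; drop that assumption and you're suddenly asking which of those 5,000 points are "interesting" at all, which is a very different (and much harder) problem.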

Too much science fiction in that idea? Perhaps. Probably not, though. Does Googlezon in 2014 really seem so far-fetched? Either way, the ability to locate the 'needle' in the haystack, despite not knowing anything about the particular needle or haystack, is something that I'm confident some very clever people are hard at work on.

Prepare for the Apocalypse (part two)

If good old fashioned Holy War doesn't do us in, perhaps this asteroid will. Thankfully, at least for those already drawing social security, Apophis (the asteroid is fittingly named for the Egyptian god of darkness and chaos) isn't due until 2036.

As for the rest of us, NASA has decided that 2036 is a long time from now and will wait until 2013 to start thinking up ways to avoid a catastrophic collision. My suspicion is that NASA has run the numbers and calculated that the odds of an asteroid strike (currently estimated at 1:5,500) are substantially smaller than the odds that we'll wipe ourselves out before then, with or without any assistance from celestial bodies.

Who am I to disagree?

Upgrade your Bible

Read the transcript or watch the video: it's 'trade in your bible for porn day.'

The most entertaining part is watching Tucker Carlson declare, apropos of almost nothing, that the law is derived from an independent notion of right and wrong, which implies a higher power. Apart from the logical flaws in this argument (an explanation of which I leave in the capable hands of Mr. Bertrand Russell), many aspects of Anglo-American law descend in substantial part from Roman law. While my knowledge of the New Testament is admittedly terrible, I'm fairly certain that it was those same Romans who crucified Christ. If our legal system does rest upon an independent notion of right and wrong, isn't it more likely pulled from the teachings of Jupiter than Jesus?

I can't believe The Daily Show hasn't picked this up yet...

Prepare for the Apocalypse

Now we know why Christmas is now X-Mas. Jesus has more important things to do.

In a Washington Post article on the president's secular holiday card (Why didn't I get one?), Rev. Bob Edgar, general secretary of the National Council of Churches, explains: "I think it's more important to put Christ back into our war planning than into our Christmas cards." Apparently, in the search for a plausible strategy for victory in Iraq, the son of god has been recruited as a consultant.

Why didn't anybody think of this earlier?