2011-01-02

acroyear: (fof not quite right)
2011-01-02 11:29 am

not that Bennigan's exists anymore, but you get the idea...

Wake Up, Geek Culture. Time to Die:
We needed it, too, because the essence of our culture—our “escape hatch” culture—would begin to change in 1987.

That was the year the final issue of Watchmen came out, in October. After that, it seemed like everything that was part of my otaku world was out in the open and up for grabs, if only out of context. I wasn’t seeing the hard line between “nerds” and “normals” anymore. It was the last year that a T-shirt or music preference or pastime (Dungeons & Dragons had long since lost its dangerous, Satanic, suicide-inducing street cred) could set you apart from the surface dwellers. Pretty soon, being the only person who was into something didn’t make you outcast; it made you ahead of the curve and someone people were quicker to befriend than shun. Ironically, surface dwellers began repurposing the symbols and phrases and tokens of the erstwhile outcast underground.

Fast-forward to now: Boba Fett’s helmet emblazoned on sleeveless T-shirts worn by gym douches hefting dumbbells. The Glee kids performing the songs from The Rocky Horror Picture Show. And Toad the Wet Sprocket, a band that took its name from a Monty Python riff, joining the permanent soundtrack of a night out at Bennigan’s. Our below-the-topsoil passions have been rudely dug up and displayed in the noonday sun. The Lord of the Rings used to be ours and only ours simply because of the sheer goddamn thickness of the books. Twenty years later, the entire cast and crew would be trooping onstage at the Oscars to collect their statuettes, and replicas of the One Ring would be sold as bling.
He later includes this brilliant line:
Can we all admit the final battle in Superman II looks like a local commercial for a personal-injury attorney?
acroyear: (makes sense)
2011-01-02 11:56 am

on science - nothing has changed, and people should stop seeing that as a problem.

Science is not dead : Pharyngula:
But that's where the psychological dimension comes into play. Look at the loaded language in the article: scientists are "disturbed," "depressed," and "troubled." The issues are presented as a crisis for all of science; the titles (which I hope were picked by an editor, not Lehrer) emphasize that science isn't working, when nothing in the article backs that up. The conclusion goes from a reasonable suggestion to complete bullshit.
Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can't bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren't surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.) The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that's often not the case. Just because an idea is true doesn't mean it can be proved. And just because an idea can be proved doesn't mean it's true. When the experiments are done, we still have to choose what to believe.
I've highlighted the part that is true. Yes, science is hard. Especially when you are dealing with extremely complex phenomena with multiple variables, it can be extremely difficult to demonstrate the validity of a hypothesis (I detest the word "prove" in science, which we don't do, and we know it; Lehrer should, too). What the decline effect demonstrates, when it occurs, is that just maybe the original hypothesis was wrong. This shouldn't be disturbing, depressing, or troubling at all, except, as we see in his article, when we have scientists who have an emotional or profit-making attachment to an idea.

That's all this fuss is really saying. Sometimes hypotheses are shown to be wrong, and sometimes if the support for the hypothesis is built on weak evidence or a highly derived interpretation of a complex data set, it may take a long time for the correct answer to emerge. So? This is not a failure of science, unless you're somehow expecting instant gratification on everything, or confirmation of every cherished idea.

But those last few sentences, where Lehrer dribbles off into a delusion of subjectivity and essentially throws up his hands and surrenders himself to ignorance, are unjustifiable. Early in any scientific career, one should learn a couple of general rules: science is never about absolute certainty, and the absence of black-and-white binary results is not evidence against it; you don't get to choose what you want to believe, but instead only provisionally accept a result; and when you've got a positive result, the proper response is not to claim that you've proved something, but instead to focus more tightly, scrutinize more strictly, and test, test, test ever more deeply. It's unfortunate that Lehrer has tainted his story with all that unwarranted breast-beating, because as a summary of why science can be hard to do, and of the institutional flaws in doing science, it's quite good.

But science works. That's all that counts.
Again, this is the great gift of science over any other "ways of knowing" - science is inherently self-correcting. Only the scientific process has built into it the mechanism for correcting an error. Any other form, say "divine revelation," has no external or objective means of supporting a conclusion, or of showing a conclusion to be incorrect.

I don't have "faith" in science: I trust science to eventually figure something out, even through the natural human biases of economics that currently drive most scientific research decisions.