Jonah Lehrer has some more thoughts on the Decline Effect:
The first letter, like many of the e-mails, tweets, and comments I’ve received directly, argues that the decline effect is ultimately a minor worry, since “in the long run, science prevails over human bias.”
Lehrer then quotes Feynman, who discusses the famous 1909 oil-drop experiment and explains why it took so long for scientists to zero in on the correct value for the charge of the electron:
Why didn’t they discover that the new number was higher right away? It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan’s value they didn’t look so hard.
As Lehrer notes, this is yet another example of the “selective reporting in science.” But Feynman was trying to make another point:
he warned the Caltech undergrads to be rigorous scientists, because their lack of rigor would be quickly exposed by the scientific process. “Other experimenters will repeat your experiment and find out whether you were wrong or right,” Feynman said. “Nature’s phenomena will agree or they’ll disagree with your theory.”
But Lehrer is quick to puncture the obvious naivety associated with this claim:
Remember my concern about naïve realism?
Naïve realism is the conviction that one sees the world as it is, and that when people don’t see it in a similar way, it is they who do not see the world for what it is. Ross characterized naïve realism as “a dangerous but unavoidable conviction about perception and reality”. The danger of naïve realism is that while humans are good at recognizing that other people and their opinions have been shaped and influenced by their life experiences and particular dogmas, we are far less adept at recognizing the influence our own experiences and dogmas have on ourselves and our opinions. We fail to recognize in ourselves the bias that we are so good at picking out in others.
Of course, many people might be tempted to dismiss this as rather insignificant, given that science has provided a means to “see the world for what it is.” Not so fast. I encourage you to read Jonah Lehrer’s article, The Truth Wears Off: Is there something wrong with the scientific method?
Lehrer explains the Decline Effect, the phenomenon in which scientific findings, once reported, become harder and harder for others to replicate over time. The problem is widespread and there appear to be many factors that bring it about. For those who have heard me talk about confirmation bias in the past, you might enjoy this example:
Neuroscientist David Eagleman outlines a position that would be quite at home in the DM:
I have no doubt that we will continue to add to the pier of knowledge, appending several new slats in each generation. But we have no guarantee how far we’ll get. There may be some domains beyond the tools of science – perhaps temporarily, perhaps always. We also have to acknowledge that we won’t answer many of the big questions in our brief twinkling of a 21st-century lifetime: even if science can determine the correct answer, we won’t get to enjoy hearing it.
This situation calls for an openness in approaching the big questions of our existence. When there is a lack of meaningful data to weigh in on a problem, good scientists are comfortable holding many possibilities at once, rather than committing to a particular story over others. In light of this, I have found myself surprised by the amount of certainty out there.
This is why I call myself a “possibilian”. Possibilianism emphasises the active exploration of new, unconsidered notions. A possibilian is comfortable holding multiple ideas in mind and is not driven by the idea of fighting for a single, particular story. The key emphasis of possibilianism is to shine a flashlight around the possibility space. It is a plea not simply for open-mindedness, but for an active exploration of new ideas.
Throughout the years, I have noticed a pattern that occurs when arguing with various people on the internet – rather than focusing on and dealing with the actual argument I am making, they argue against a point that they anticipate I will make later down the line. Rather than address the question I am asking, they address the answer they think I am trying to elicit. To argue and respond like this, I assume they think they are playing mental chess, relying on their clever foresight to anticipate my next move. But more often than not, they are simply relying on their own stereotypes. So what would it be like to play real chess with someone like that?
Over at the BioLogos blog, Michael Ruse offers a short summary of his new book, Science and Spirituality: Making Room for Faith in the Age of Science. Go read it and come back for a few observations below the fold.
From John F. Kihlstrom:
Technically, the duck-rabbit figure is an ambiguous (or reversible, or bistable) figure, not an illusion (Peterson, Kihlstrom, Rose, & Glisky, 1992). The two classes of perceptual phenomena have quite different theoretical implications. From a constructivist point of view, many illusions illustrate the role of unconscious inferences in perception, while the ambiguous figures illustrate the role of expectations, world-knowledge, and the direction of attention (Long & Toppino, 2004). For example, children tested on Easter Sunday are more likely to see the figure as a rabbit; if tested on a Sunday in October, they tend to see it as a duck or similar bird (Brugger & Brugger, 1993).
But the more important point of this letter concerns attribution: the duck-rabbit was “originally noted” not by Wittgenstein, but rather by the American psychologist Joseph Jastrow in 1899 (Jastrow, 1899, 1900; see also Brugger, 1999), when the famous philosopher (b. 1889) was probably still in short pants. Along with such figures as the Necker cube and the Schroeder staircase, Jastrow used the duck-rabbit to make the point that perception is not just a product of the stimulus, but also of mental activity – that we see with the mind as well as the eye.