Tag Archives: perception

More Thoughts on the Decline Effect

Jonah Lehrer has some more thoughts on the Decline Effect:

The first letter, like many of the e-mails, tweets, and comments I’ve received directly, argues that the decline effect is ultimately a minor worry, since “in the long run, science prevails over human bias.”

Lehrer then quotes Feynman, who discusses the famous 1909 oil-drop experiment and explains why it took so long for scientists to zero in on the correct value for the charge of the electron:

Why didn’t they discover that the new number was higher right away? It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan’s value they didn’t look so hard.

As Lehrer notes, this is yet another example of “selective reporting in science.”  But Feynman was trying to make another point:

he warned the Caltech undergrads to be rigorous scientists, because their lack of rigor would be quickly exposed by the scientific process. “Other experimenters will repeat your experiment and find out whether you were wrong or right,” Feynman said. “Nature’s phenomena will agree or they’ll disagree with your theory.”

But Lehrer is quick to puncture the obvious naivety associated with this claim:

Continue reading

The Decline Effect

Remember my concern about naïve realism?

Naïve realism is the conviction that one sees the world as it is and that when people don’t see it in a similar way, it is they who fail to see the world for what it is. Ross characterized naïve realism as “a dangerous but unavoidable conviction about perception and reality”. The danger of naïve realism is that while humans are good at recognizing that other people and their opinions have been shaped and influenced by their life experiences and particular dogmas, we are far less adept at recognizing the influence our own experiences and dogmas have on ourselves and our opinions. We fail to recognize in ourselves the bias that we are so good at picking out in others.

Of course, many people might be tempted to dismiss this as rather insignificant, given that science has provided a means to “see the world for what it is.”  Not so fast.  I encourage you to read Jonah Lehrer’s article, The Truth Wears Off: Is there something wrong with the scientific method?

Lehrer explains the Decline Effect: a scientific finding is reported and, with time, it becomes harder and harder for others to replicate it.  The problem is widespread, and there appear to be many factors behind the phenomenon. For those who have heard me talk about confirmation bias in the past, you might enjoy the example below the fold:
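As an aside of my own (not something from Lehrer's piece), the selective-reporting side of this is easy to simulate. The toy Python sketch below assumes a true effect of zero, "publishes" an initial study only when it clears an arbitrary threshold, and then runs an unbiased replication of each published study; the threshold, noise level, and study counts are all made-up parameters chosen only for illustration.

```python
import random
import statistics

# Toy illustration of how selective reporting alone can manufacture an
# apparent "decline effect". All numbers here are arbitrary.
random.seed(1)

TRUE_EFFECT = 0.0      # assume there is actually nothing to find
NOISE_SD = 1.0         # measurement noise
THRESHOLD = 1.5        # only "impressive" initial results get written up

def run_study() -> float:
    """One noisy measurement of the (null) effect."""
    return random.gauss(TRUE_EFFECT, NOISE_SD)

published_initial, replications = [], []
for _ in range(100_000):
    first = run_study()
    if first > THRESHOLD:                 # file-drawer everything else
        published_initial.append(first)
        replications.append(run_study())  # replication is reported regardless

print(f"Mean published initial effect: {statistics.mean(published_initial):.2f}")
print(f"Mean replication effect:       {statistics.mean(replications):.2f}")
# The initial published effects sit well above the true value (0), while the
# unbiased replications regress back toward it: the apparent decline.
```

The point of the sketch is only that the "decline" needs no change in nature at all; a biased filter on what gets reported the first time around is enough.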

Continue reading

Possibilian and Beyond

Neuroscientist David Eagleman outlines a position that would be quite at home in the DM:

I have no doubt that we will continue to add to the pier of knowledge, appending several new slats in each generation. But we have no guarantee how far we’ll get. There may be some domains beyond the tools of science – perhaps temporarily, perhaps always. We also have to acknowledge that we won’t answer many of the big questions in our brief twinkling of a 21st-century lifetime: even if science can determine the correct answer, we won’t get to enjoy hearing it.

This situation calls for an openness in approaching the big questions of our existence. When there is a lack of meaningful data to weigh in on a problem, good scientists are comfortable holding many possibilities at once, rather than committing to a particular story over others. In light of this, I have found myself surprised by the amount of certainty out there.
[….]

This is why I call myself a “possibilian”. Possibilianism emphasises the active exploration of new, unconsidered notions. A possibilian is comfortable holding multiple ideas in mind and is not driven by the idea of fighting for a single, particular story. The key emphasis of possibilianism is to shine a flashlight around the possibility space. It is a plea not simply for open-mindedness, but for an active exploration of new ideas.

Continue reading

What If We Played Chess the Way People Argue?

Throughout the years, I have noticed a pattern when arguing with various people on the internet: rather than focus on and deal with the actual argument I am making, they argue against a point they anticipate I will make later down the line. Rather than address the question I am asking, they address the answer they think I am trying to elicit.  In arguing and responding like this, I assume they think they are playing mental chess, relying on clever foresight to anticipate my next move.  But more often than not, they are simply relying on their own stereotypes. So what would it be like to play real chess with someone like that?

Continue reading

Metaphors and Perception

Over at the BioLogos blog, Michael Ruse offers a short summary of his new book, Science and Spirituality: Making Room for Faith in the Age of Science.  Go read it and come back for a few observations below the fold.

Continue reading

Jastrow’s Bunny

From John F. Kihlstrom:

Technically, the duck-rabbit figure is an ambiguous (or reversible, or bistable) figure, not an illusion (Peterson, Kihlstrom, Rose, & Glisky, 1992). The two classes of perceptual phenomena have quite different theoretical implications. From a constructivist point of view, many illusions illustrate the role of unconscious inferences in perception, while the ambiguous figures illustrate the role of expectations, world-knowledge, and the direction of attention (Long & Toppino, 2004). For example, children tested on Easter Sunday are more likely to see the figure as a rabbit; if tested on a Sunday in October, they tend to see it as a duck or similar bird (Brugger & Brugger, 1993).

But the more important point of this letter concerns attribution: the duck-rabbit was “originally noted” not by Wittgenstein, but rather by the American psychologist Joseph Jastrow in 1899 (Jastrow, 1899, 1900; see also Brugger, 1999), when the famous philosopher (b. 1889) was probably still in short pants. Along with such figures as the Necker cube and the Schroeder staircase, Jastrow used the duck-rabbit to make the point that perception is not just a product of the stimulus, but also of mental activity – that we see with the mind as well as the eye.



Conformity

No commentary needed. 🙂

Incomplete Penetrance and the Complexity of Belief

In genetics, there is a concept known as penetrance. It is typically most relevant for dominant mutations that cause disease, and the idea is that not all genotypes elicit their expected phenotypes. For example, consider the phenomenon of polydactyly in humans, where an individual has extra fingers and/or toes. Since this trait is caused by a dominant mutation, you would expect anyone with the dominant allele to have the trait. Yet this is not always true. The concept of penetrance comes into play when we estimate how many individuals with a particular genotype express the trait. For example, if 90 out of 100 people who are heterozygous have the trait, we’d say the trait is 90% penetrant.

So why is it that many traits show less than 100% penetrance? Two factors come into play: the genetic background and the environment. Whether or not a particular allele at a specific locus is expressed can be a function of the expression of other alleles at other loci. Thus, without the right genetic context, a particular genotype may not be expressed. As for environment, it is well known that it can work in conjunction with a genotype to determine whether a particular phenotype is seen. This means that certain traits will be expressed only in the right environmental context.
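Before getting to the metaphor, here is a minimal sketch of how incomplete penetrance can be pictured computationally. Everything in it is hypothetical and illustrative: the base 90% penetrance echoes the example above, while the "penalties" for an unfavorable genetic background or environment are invented numbers, not measurements.

```python
import random

# Illustrative sketch: treat penetrance as the probability that a dominant
# genotype actually produces its phenotype, modulated by (hypothetical)
# genetic-background and environmental effects.
BASE_PENETRANCE = 0.90   # e.g., 90 of 100 heterozygotes show the trait

def expresses_trait(has_dominant_allele: bool,
                    permissive_background: bool = True,
                    permissive_environment: bool = True) -> bool:
    """Return True if this individual shows the phenotype."""
    if not has_dominant_allele:
        return False
    p = BASE_PENETRANCE
    if not permissive_background:
        p -= 0.30        # invented penalty: wrong genetic context
    if not permissive_environment:
        p -= 0.30        # invented penalty: wrong environmental context
    return random.random() < max(p, 0.0)

# Estimate penetrance the way it is estimated in practice:
# carriers who show the trait / all carriers.
carriers = 10_000
shown = sum(expresses_trait(True) for _ in range(carriers))
print(f"Observed penetrance: {shown / carriers:.2f}")   # roughly 0.90
```

None of this is meant as a genetics model; it just makes concrete the idea that the same genotype can express or not depending on the context it finds itself in.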

I mention all of this simply because it makes for a nice metaphor in understanding how humans believe.

Continue reading

Group Think

From Researcher Condemns Conformity Among His Peers:

Journalists, of course, are conformists too. So are most other professions. There’s a powerful human urge to belong inside the group, to think like the majority, to lick the boss’s shoes, and to win the group’s approval by trashing dissenters.

The strength of this urge to conform can silence even those who have good reason to think the majority is wrong. You’re an expert because all your peers recognize you as such. But if you start to get too far out of line with what your peers believe, they will look at you askance and start to withdraw the informal title of “expert” they have implicitly bestowed on you. Then you’ll bear the less comfortable label of “maverick,” which is only a few stops short of “scapegoat” or “pariah.”

and

Conformity and group-think are attitudes of particular danger in science, an endeavor that is inherently revolutionary because progress often depends on overturning established wisdom. It’s obvious that at least 100 genes must be needed to convert a human or animal cell back to its embryonic state. Or at least it was obvious to almost everyone until Shinya Yamanaka of Kyoto University showed it could be done with just 4.

The academic monocultures referred to by Dr. Bouchard are the kind of thing that sabotages scientific creativity.

Inattentional Blindness

Now that you have had a chance to experience inattentional blindness, let me propose that something very similar is at work when it comes to the hypothesis that life was designed such that subsequent evolution was shaped by this design. If this hypothesis is valid, I suspect that most people would not notice it, much as most people do not notice the gorilla pounding its chest in the middle of a group of basketball players.

Continue reading