Cargo Cult Science

Richard Feynman

From a Caltech commencement address given in 1974
Also in “Surely You’re Joking, Mr. Feynman!”: Adventures of a Curious Character

During the Middle Ages there were all kinds of crazy ideas, such as that a piece of rhinoceros horn would increase potency. Then a method was discovered for separating the ideas–which was to try one to see if it worked, and if it didn’t work, to eliminate it. This method became organized, of course, into science. And it developed very well, so that we are now in the scientific age. It is such a scientific age, in fact, that we have difficulty in understanding how witch doctors could ever have existed, when nothing that they proposed ever really worked–or very little of it did.

But even today I meet lots of people who sooner or later get me into a conversation about UFO’s, or astrology, or some form of mysticism, expanded consciousness, new types of awareness, ESP, and so forth. And I’ve concluded that it’s not a scientific world.

Most people believe so many wonderful things that I decided to investigate why they did. And what has been referred to as my curiosity for investigation has landed me in a difficulty where I found so much junk that I’m overwhelmed. First I started out by investigating various ideas of mysticism and mystic experiences. I went into isolation tanks and got many hours of hallucinations, so I know something about that. Then I went to Esalen, which is a hotbed of this kind of thought (it’s a wonderful place; you should go visit there). Then I became overwhelmed. I didn’t realize how MUCH there was.

At Esalen there are some large baths fed by hot springs situated on a ledge about thirty feet above the ocean. One of my most pleasurable experiences has been to sit in one of those baths and watch the waves crashing onto the rocky slope below, to gaze into the clear blue sky above, and to study a beautiful nude as she quietly appears and settles into the bath with me.

One time I sat down in a bath where there was a beautiful girl sitting with a guy who didn’t seem to know her. Right away I began thinking, “Gee! How am I gonna get started talking to this beautiful nude woman?”

I’m trying to figure out what to say, when the guy says to her, “I’m, uh, studying massage. Could I practice on you?” “Sure,” she says. They get out of the bath and she lies down on a massage table nearby. I think to myself, “What a nifty line! I can never think of anything like that!” He starts to rub her big toe. “I think I feel it,” he says. “I feel a kind of dent–is that the pituitary?” I blurt out, “You’re a helluva long way from the pituitary, man!” They looked at me, horrified–I had blown my cover–and said, “It’s reflexology!” I quickly closed my eyes and appeared to be meditating.

That’s just an example of the kind of things that overwhelm me. I also looked into extrasensory perception, and PSI phenomena, and the latest craze there was Uri Geller, a man who is supposed to be able to bend keys by rubbing them with his finger. So I went to his hotel room, on his invitation, to see a demonstration of both mindreading and bending keys. He didn’t do any mindreading that succeeded; nobody can read my mind, I guess. And my boy held a key and Geller rubbed it, and nothing happened. Then he told us it works better under water, and so you can picture all of us standing in the bathroom with the water turned on and the key under it, and him rubbing the key with his finger. Nothing happened. So I was unable to investigate that phenomenon.

But then I began to think, what else is there that we believe? (And I thought then about the witch doctors, and how easy it would have been to check on them by noticing that nothing really worked.) So I found things that even more people believe, such as that we have some knowledge of how to educate. There are big schools of reading methods and mathematics methods, and so forth, but if you notice, you’ll see the reading scores keep going down–or hardly going up–in spite of the fact that we continually use these same people to improve the methods. There’s a witch doctor remedy that doesn’t work. It ought to be looked into; how do they know that their method should work? Another example is how to treat criminals. We obviously have made no progress–lots of theory, but no progress–in decreasing the amount of crime by the method that we use to handle criminals.

Yet these things are said to be scientific. We study them. And I think ordinary people with commonsense ideas are intimidated by this pseudoscience. A teacher who has some good idea of how to teach her children to read is forced by the school system to do it some other way–or is even fooled by the school system into thinking that her method is not necessarily a good one. Or a parent of bad boys, after disciplining them in one way or another, feels guilty for the rest of her life because she didn’t do “the right thing,” according to the experts.

So we really ought to look into theories that don’t work, and science that isn’t science.

I think the educational and psychological studies I mentioned are examples of what I would like to call cargo cult science. In the South Seas there is a cargo cult of people. During the war they saw airplanes with lots of good materials, and they want the same thing to happen now. So they’ve arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas–he’s the controller–and they wait for the airplanes to land. They’re doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn’t work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they’re missing something essential, because the planes don’t land.

Now it behooves me, of course, to tell you what they’re missing. But it would be just about as difficult to explain to the South Sea islanders how they have to arrange things so that they get some wealth in their system. It is not something simple like telling them how to improve the shapes of the earphones. But there is one feature I notice that is generally missing in cargo cult science. That is the idea that we all hope you have learned in studying science in school–we never say explicitly what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty–a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid–not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked–to make sure the other fellow can tell they have been eliminated.

Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can–if you know anything at all wrong, or possibly wrong–to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.

In summary, the idea is to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgement in one particular direction or another.

The easiest way to explain this idea is to contrast it, for example, with advertising. Last night I heard that Wesson oil doesn’t soak through food. Well, that’s true. It’s not dishonest; but the thing I’m talking about is not just a matter of not being dishonest; it’s a matter of scientific integrity, which is another level. The fact that should be added to that advertising statement is that no oils soak through food, if operated at a certain temperature. If operated at another temperature, they all will–including Wesson oil. So it’s the implication which has been conveyed, not the fact, which is true, and the difference is what we have to deal with.

We’ve learned from experience that the truth will come out. Other experimenters will repeat your experiment and find out whether you were wrong or right. Nature’s phenomena will agree or they’ll disagree with your theory. And, although you may gain some temporary fame and excitement, you will not gain a good reputation as a scientist if you haven’t tried to be very careful in this kind of work. And it’s this type of integrity, this kind of care not to fool yourself, that is missing to a large extent in much of the research in cargo cult science.

A great deal of their difficulty is, of course, the difficulty of the subject and the inapplicability of the scientific method to the subject. Nevertheless, it should be remarked that this is not the only difficulty. That’s why the planes don’t land–but they don’t land.

We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It’s a little bit off because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.

Why didn’t they discover the new number was higher right away? It’s a thing that scientists are ashamed of–this history–because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong–and they would look for and find a reason why something might be wrong. When they got a number close to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that. We’ve learned those tricks nowadays, and now we don’t have that kind of a disease.

But this long history of learning how to not fool ourselves–of having utter scientific integrity–is, I’m sorry to say, something that we haven’t specifically included in any particular course that I know of. We just hope you’ve caught on by osmosis.

The first principle is that you must not fool yourself–and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.

I would like to add something that’s not essential to the science, but something I kind of believe, which is that you should not fool the layman when you’re talking as a scientist. I am not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you’re not trying to be a scientist, but just trying to be an ordinary human being. We’ll leave those problems up to you and your rabbi. I’m talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you’re maybe wrong, that you ought to have when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen.

For example, I was a little surprised when I was talking to a friend who was going to go on the radio. He does work on cosmology and astronomy, and he wondered how he would explain what the applications of his work were. “Well,” I said, “there aren’t any.” He said, “Yes, but then we won’t get support for more research of this kind.” I think that’s kind of dishonest. If you’re representing yourself as a scientist, then you should explain to the layman what you’re doing– and if they don’t support you under those circumstances, then that’s their decision.

One example of the principle is this: If you’ve made up your mind to test a theory, or you want to explain some idea, you should always decide to publish it whichever way it comes out. If we only publish results of a certain kind, we can make the argument look good. We must publish BOTH kinds of results.

I say that’s also important in giving certain types of government advice. Supposing a senator asked you for advice about whether drilling a hole should be done in his state; and you decide it would be better in some other state. If you don’t publish such a result, it seems to me you’re not giving scientific advice. You’re being used. If your answer happens to come out in the direction the government or the politicians like, they can use it as an argument in their favor; if it comes out the other way, they don’t publish at all. That’s not giving scientific advice.

Other kinds of errors are more characteristic of poor science. When I was at Cornell, I often talked to the people in the psychology department. One of the students told me she wanted to do an experiment that went something like this–it had been found by others that under certain circumstances, X, rats did something, A. She was curious as to whether, if she changed the circumstances to Y, they would still do A. So her proposal was to do the experiment under circumstances Y and see if they still did A.

I explained to her that it was necessary first to repeat in her laboratory the experiment of the other person–to do it under condition X to see if she could also get result A, and then change to Y and see if A changed. Then she would know that the real difference was the thing she thought she had under control.

She was very delighted with this new idea, and went to her professor. And his reply was, no, you cannot do that, because the experiment has already been done and you would be wasting time. This was in about 1947 or so, and it seems to have been the general policy then to not try to repeat psychological experiments, but only to change the conditions and see what happened.

Nowadays, there’s a certain danger of the same thing happening, even in the famous field of physics. I was shocked to hear of an experiment being done at the big accelerator at the National Accelerator Laboratory, where a person used deuterium. In order to compare his heavy hydrogen results to what might happen with light hydrogen, he had to use data from someone else’s experiment on light hydrogen, which was done on different apparatus. When asked why, he said it was because he couldn’t get time on the program (because there’s so little time and it’s such expensive apparatus) to do the experiment with light hydrogen on this apparatus because there wouldn’t be any new result. And so the men in charge of programs at NAL are so anxious for new results, in order to get more money to keep the thing going for public relations purposes, they are destroying–possibly–the value of the experiments themselves, which is the whole purpose of the thing. It is often hard for the experimenters there to complete their work as their scientific integrity demands.

All experiments in psychology are not of this type, however. For example, there have been many experiments running rats through all kinds of mazes, and so on–with little clear result. But in 1937 a man named Young did a very interesting one. He had a long corridor with doors all along one side where the rats came in, and doors along the other side where the food was. He wanted to see if he could train the rats to go in at the third door down from wherever he started them off. No. The rats went immediately to the door where the food had been the time before.

The question was, how did the rats know, because the corridor was so beautifully built and so uniform, that this was the same door as before? Obviously there was something about the door that was different from the other doors. So he painted the doors very carefully, arranging the textures on the faces of the doors exactly the same. Still the rats could tell. Then he thought maybe the rats were smelling the food, so he used chemicals to change the smell after each run. Still the rats could tell. Then he realized the rats might be able to tell by seeing the lights and the arrangement in the laboratory like any commonsense person. So he covered the corridor, and still the rats could tell.

He finally found that they could tell by the way the floor sounded when they ran over it. And he could only fix that by putting his corridor in sand. So he covered one after another of all possible clues and finally was able to fool the rats so that they had to learn to go in the third door. If he relaxed any of his conditions, the rats could tell.

Now, from a scientific standpoint, that is an A-number-one experiment. That is the experiment that makes rat-running experiments sensible, because it uncovers the clues that the rat is really using–not what you think it’s using. And that is the experiment that tells exactly what conditions you have to use in order to be careful and control everything in an experiment with rat-running.

I looked up the subsequent history of this research. The next experiment, and the one after that, never referred to Mr. Young. They never used any of his criteria of putting the corridor on sand, or being very careful. They just went right on running the rats in the same old way, and paid no attention to the great discoveries of Mr. Young, and his papers are not referred to, because he didn’t discover anything about the rats. In fact, he discovered all the things you have to do to discover something about rats. But not paying attention to experiments like that is a characteristic example of cargo cult science.

Another example is the ESP experiments of Mr. Rhine, and other people. As various people have made criticisms–and they themselves have made criticisms of their own experiments–they improve the techniques so that the effects are smaller, and smaller, and smaller until they gradually disappear. All the para-psychologists are looking for some experiment that can be repeated–that you can do again and get the same effect–statistically, even. They run a million rats–no, it’s people this time–they do a lot of things and get a certain statistical effect. Next time they try it they don’t get it any more. And now you find a man saying that it is an irrelevant demand to expect a repeatable experiment. This is science?

This man also speaks about a new institution, in a talk in which he was resigning as Director of the Institute of Parapsychology. And, in telling people what to do next, he says that one of the things they have to do is to be sure they only train students who have shown their ability to get PSI results to an acceptable extent–not to waste their time on those ambitious and interested students who get only chance results. It is very dangerous to have such a policy in teaching–to teach students only how to get certain results, rather than how to do an experiment with scientific integrity.

So I have just one wish for you–the good luck to be somewhere where you are free to maintain the kind of integrity I have described, and where you do not feel forced by a need to maintain your position in the organization, or financial support, or so on, to lose your integrity. May you have that freedom.



Monty Hall problem

Bertrand’s box paradox is a classic paradox of elementary probability theory. It was first posed by Joseph Bertrand in his Calcul des probabilités, published in 1889.

There are three boxes:

  1. a box containing two gold coins,
  2. a box containing two silver coins,
  3. a box containing one gold coin and one silver coin.

After choosing a box at random and withdrawing one coin at random, if that happens to be a gold coin, it may seem that the probability that the remaining coin is gold is 1/2; in fact, the probability is actually 2/3. Two problems that are very similar are the Monty Hall problem and the Three Prisoners problem.

These simple but slightly counterintuitive puzzles are used as a standard example in teaching probability theory. Their solution illustrates some basic principles, including the Kolmogorov axioms.
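The 2/3 answer can be checked by brute-force counting. The short Python sketch below (variable names are only illustrative) enumerates the six equally likely (box, drawn coin) outcomes from the setup above, conditions on the drawn coin being gold, and counts how often the remaining coin is also gold.

    # Enumerate Bertrand's box paradox directly. Each box holds two coins,
    # and drawing a coin means picking one of the two positions in the
    # chosen box, so there are 3 * 2 = 6 equally likely outcomes.
    boxes = [("gold", "gold"), ("silver", "silver"), ("gold", "silver")]

    gold_first = 0       # outcomes where the drawn coin is gold
    gold_then_gold = 0   # ...and the remaining coin is also gold

    for box in boxes:
        for i in (0, 1):                       # which of the two coins we draw
            drawn, remaining = box[i], box[1 - i]
            if drawn == "gold":
                gold_first += 1
                if remaining == "gold":
                    gold_then_gold += 1

    print(gold_then_gold, "/", gold_first)     # prints 2 / 3, not 1 / 2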


The Monty Hall problem is a brain teaser, in the form of a probability puzzle (Gruber, Krauss and others), loosely based on the American television game show Let’s Make a Deal and named after its original host, Monty Hall. The problem was originally posed in a letter by Steve Selvin to the American Statistician in 1975 (Selvin 1975a), (Selvin 1975b). It became famous as a question from a reader’s letter quoted in Marilyn vos Savant’s “Ask Marilyn” column in Parade magazine in 1990 (vos Savant 1990a):

Suppose you’re on a game show, and you’re given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what’s behind the doors, opens another door, say No. 3, which has a goat. He then says to you, “Do you want to pick door No. 2?” Is it to your advantage to switch your choice?

Vos Savant’s response was that the contestant should switch to the other door. (vos Savant 1990a)

The argument relies on assumptions, explicit in extended solution descriptions given by Selvin (1975b) and by vos Savant (1991a), that the host always opens a different door from the door chosen by the player and always reveals a goat by this action—because he knows where the car is hidden. Leonard Mlodinow stated: “The Monty Hall problem is hard to grasp, because unless you think about it carefully, the role of the host goes unappreciated.” (Mlodinow 2008) It is also assumed that the contestant prefers to win a car, rather than a goat.

Contestants who switch have a 2/3 chance of winning the car, while contestants who stick to their choice have only a 1/3 chance. One explanation notices that 2/3 of the time, the initial choice of the player is a door hiding a goat. The host is then forced to open the other goat door, and the remaining one must, therefore, hide the car. “Switching” only fails to give the car when the player picks the “right” door to begin with, which only has a 1/3 chance.

Many readers of vos Savant’s column refused to believe switching is beneficial despite her explanation. After the problem appeared in Parade, approximately 10,000 readers, including nearly 1,000 with PhDs, wrote to the magazine, most of them claiming vos Savant was wrong (Tierney 1991). Even when given explanations, simulations, and formal mathematical proofs, many people still do not accept that switching is the best strategy (vos Savant 1991a). Paul Erdős, one of the most prolific mathematicians in history, remained unconvinced until he was shown a computer simulation confirming the predicted result (Vazsonyi 1999).

The Monty Hall problem has attracted academic interest because of its surprising result and simple formulation. Variations of the Monty Hall problem are made by changing the implied assumptions, and the variations can have drastically different consequences. For one variation, if Monty only offers the contestant a chance to switch when the contestant initially chose the door hiding the car, then the contestant should never switch. For another variation, if Monty opens another door randomly and happens to reveal a goat, then it makes no difference (Rosenthal, 2005a), (Rosenthal, 2005b).

The problem is a paradox of the veridical type, because the correct result (you should switch doors) is so counterintuitive it can seem absurd, but is nevertheless demonstrably true. The Monty Hall problem is mathematically closely related to the earlier Three Prisoners problem and to the much older Bertrand’s box paradox.

The problem continues to attract the attention of cognitive psychologists. The typical behaviour of the majority, i.e., not switching, may be explained by phenomena known in the psychological literature as: 1) the endowment effect (Kahneman et al., 1991); people tend to overvalue the winning probability of the already chosen – already “owned” – door; 2) the status quo bias (Samuelson and Zeckhauser, 1988); people prefer to stick with the choice of door they have already made; 3) the errors of omission vs. errors of commission effect (Gilovich et al., 1995); all else being equal, people prefer any errors for which they are responsible to have occurred through ‘omission’ of taking action rather than through having taken an explicit action that later becomes known to have been erroneous. Experimental evidence confirms that these are plausible explanations which do not depend on probability intuition (Kaivanto et al., 2014; Morone and Fiore, 2007).

Criticism of the simple solutions

As already remarked, most sources in the field of probability, including many introductory probability textbooks, solve the problem by showing the conditional probabilities the car is behind door 1 and door 2 are 1/3 and 2/3 (not 1/2 and 1/2) given the contestant initially picks door 1 and the host opens door 3; various ways to derive and understand this result were given in the previous subsections. Among these sources are several that explicitly criticize the popularly presented “simple” solutions, saying these solutions are “correct but … shaky” (Rosenthal 2005a), or do not “address the problem posed” (Gillman 1992), or are “incomplete” (Lucas et al. 2009), or are “unconvincing and misleading” (Eisenhauer 2001) or are (most bluntly) “false” (Morgan et al. 1991). Some say that these solutions answer a slightly different question – one phrasing is “you have to announce before a door has been opened whether you plan to switch” (Gillman 1992, emphasis in the original).

The simple solutions show in various ways that a contestant who is determined to switch will win the car with probability 2/3, and hence that switching is the winning strategy, if the player has to choose in advance between “always switching” and “always staying”. However, the probability of winning by always switching is a logically distinct concept from the probability of winning by switching given the player has picked door 1 and the host has opened door 3. As one source says, “the distinction between [these questions] seems to confound many” (Morgan et al. 1991). That these are different can be shown by varying the problem so that these two probabilities have different numeric values. For example, assume the contestant knows that Monty does not pick the second door randomly among all legal alternatives but instead, when given an opportunity to pick between two losing doors, Monty will open the one on the right. In this situation the following two questions have different answers:

  1. What is the probability of winning the car by always switching?
  2. What is the probability of winning the car given the player has picked door 1 and the host has opened door 3?

The answer to the first question is 2/3, as is correctly shown by the “simple” solutions. But the answer to the second question is now different: the conditional probability the car is behind door 1 or door 2 given the host has opened door 3 (the door on the right) is 1/2. This is because Monty’s preference for rightmost doors means he opens door 3 if the car is behind door 1 (which it is originally with probability 1/3) or if the car is behind door 2 (also originally with probability 1/3). For this variation, the two questions yield different answers. However as long as the initial probability the car is behind each door is 1/3, it is never to the contestant’s disadvantage to switch, as the conditional probability of winning by switching is always at least 1/2. (Morgan et al. 1991)
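Both numbers in this variant can be verified by enumerating the three equally likely car positions. The Python sketch below assumes exactly the setup just described (the player picks door 1; the host opens the rightmost goat door available); the helper name host_opens is only illustrative.

    from fractions import Fraction

    def host_opens(car):
        """Rightmost goat door the host can legally open when the player holds door 1."""
        legal = [door for door in (2, 3) if door != car]
        return max(legal)                      # rightmost preference

    p_switch_wins = Fraction(0)                # unconditional P(win by always switching)
    p_opened_3 = Fraction(0)                   # P(host opens door 3)
    p_opened_3_and_win = Fraction(0)           # P(host opens door 3 and switching wins)

    for car in (1, 2, 3):                      # car is behind each door with probability 1/3
        p = Fraction(1, 3)
        opened = host_opens(car)
        switch_door = ({1, 2, 3} - {1, opened}).pop()
        if switch_door == car:
            p_switch_wins += p
        if opened == 3:
            p_opened_3 += p
            if switch_door == car:
                p_opened_3_and_win += p

    print("P(win by always switching)          =", p_switch_wins)                    # 2/3
    print("P(win by switching | door 3 opened) =", p_opened_3_and_win / p_opened_3)  # 1/2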


Players who stay have won 49,040 cars out of 145,987 games, a winning percentage of 34%; players who switch have won 68,356 cars out of 103,063 games, a winning percentage of 66%.
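A tally like the one above is easy to reproduce with a short Monte Carlo simulation. The sketch below assumes the standard rules stated earlier (the host always opens a goat door other than the player’s pick) and compares an always-stay player with an always-switch player; the function name is only illustrative.

    import random

    def play(switch):
        """Play one round of the standard Monty Hall game; return True on a win."""
        doors = [1, 2, 3]
        car = random.choice(doors)
        pick = random.choice(doors)
        # The host opens a goat door that is not the player's pick.
        opened = random.choice([d for d in doors if d != pick and d != car])
        if switch:
            pick = next(d for d in doors if d != pick and d != opened)
        return pick == car

    N = 100_000
    stay_wins = sum(play(switch=False) for _ in range(N))
    switch_wins = sum(play(switch=True) for _ in range(N))
    print(f"stay:   {stay_wins / N:.3f}")      # about 0.333
    print(f"switch: {switch_wins / N:.3f}")    # about 0.667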

Video Transcript:

You’re on a game show and there are three doors in front of you. The host, Monty Hall, says, “Behind one door is a brand new car. Behind the other two doors are goats. Pick a door!” You think, “Well, it doesn’t matter which door I choose, every door has a 1/3 chance of having the car behind it.” So, you choose door number 1. Now it gets interesting. Monty, the host, who knows where the car is, opens door number 2 and reveals a goat. The host always opens a door to reveal a goat. The host says, “If you want, you can switch to door number 3.” What should you do? Stay with your original choice or switch to the other door? All right, so what are you going to do? Stay or switch? Well, it’s a fifty-fifty chance of winning the car in either door. Right? [Wrong!] You actually double your chances of winning the car by switching doors. And that is why the Monty Hall Problem is so evasive!
Choose an explanation of the Monty Hall Problem:

1/3 vs 2/3 – Solution #1 to the Monty Hall Problem
There is a 1/3 chance of the car being behind door number 1 and a 2/3 chance that the car isn’t behind door number 1. After Monty Hall opens door number 2 to reveal a goat, there’s still a 1/3 chance that the car is behind door number 1 and a 2/3 chance that the car isn’t behind door number 1. A 2/3 chance that the car isn’t behind door number 1 is a 2/3 chance that the car is behind door number 3.
100 Doors! – Solution #2 to the Monty Hall Problem
Imagine that instead of 3 doors, there are 100. All of them have goats except one, which has the car. You choose a door, say, door number 23. At this point, Monty Hall opens all of the other doors except one and gives you the offer to switch to the other door. Would you switch? Now you may arrogantly think, “Well, maybe I actually picked the correct door on my first guess.” But what’s the probability that that happened? 1/100. There’s a 99% chance that the car isn’t behind the door that you picked. And if it’s not behind the door that you picked, it must be behind the last door that Monty left for you. In other words, Monty has helped you by leaving one door for you to switch to, that has a 99% chance of having the car behind it. So in this case, if you were to switch, you would have a 99% chance of winning the car.
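The 100-door intuition can be checked numerically as well. The sketch below (function name is only illustrative) generalizes the game to n doors: the host opens every door except the player’s pick and one other door, never revealing the car, so always switching should win with probability (n − 1)/n.

    import random

    def switch_wins(n):
        """One n-door game where the player always switches; return True on a win."""
        car = random.randrange(n)
        pick = random.randrange(n)
        # The one unopened door besides the pick: the car's door if the first
        # pick was wrong, otherwise an arbitrary other door.
        other = car if car != pick else (pick + 1) % n
        return other == car

    N = 100_000
    for n in (3, 100):
        wins = sum(switch_wins(n) for _ in range(N))
        print(f"{n} doors, always switch: {wins / N:.3f}")   # about 0.667 and 0.990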
Pick a Goat – Solution #3 to the Monty Hall Problem
To win using the stay strategy, you need to choose the car on your first pick because you’re planning to stay with your initial choice. The chance of picking the car on your first pick is clearly one out of three. But, in order to win using the switch strategy, you only need to pick a goat on your first pick because the host will reveal the other goat and you’ll end up switching to the car. So you want to use the strategy that lets you win if you choose a goat initially because you’re twice as likely to start by picking a goat.
Scenarios – Solution #4 to the Monty Hall Problem
To understand why it’s better to switch doors, let’s play out a few scenarios. Let’s see what will happen if you were to always stay with your original choice. We’ll play out three scenarios, one for each door that the car could be behind (door number 1, door number 2, or door number 3). And it doesn’t matter which door you start out with, so, to keep it simple, we’ll always start by choosing door number 1.
Stay strategy, scenario 1: the car is behind door number 1. You choose door number 1, then the host reveals a goat behind door number 2 and because you always stay, you stay with door number 1. You win the car! Stay strategy, scenario 2: the car is behind door number 2. You start by picking door number 1, the host reveals a goat behind door number 3, and you’re using the stay strategy so you stay with door number 1. You get a goat and don’t win the car. Stay strategy, scenario 3: the car is behind door number 3. You pick door number 1, the host opens door number 2 to reveal a goat, you stay with door number 1, and you get a goat. So, using the stay strategy, you won the car one out of three times. That means that in any one instance of playing the game, your chance of winning the car if you choose to stay is 1/3 or about 33%.

Now let’s try switching doors. Again, we’ll always start by picking door number 1. Switch strategy, scenario 1: the car is behind door number 1. You choose door number 1, the host opens door number 2 to reveal a goat, you are using the switch strategy so you switch to door number 3. You get a goat. Switch strategy, scenario 2: the car is behind door number 2. You start by picking door number 1, the host opens door number 3 to reveal a goat, you switch to door number 2 and win the car! Switch strategy, scenario 3: the car is behind door number 3. You pick door number 1, the host opens door number 2 to reveal a goat, you switch to door number 3 and win the car again! So, with the switch strategy you won the car 2 out of 3 times. That means, that in any one instance of the game, your chance of winning the car if you choose to switch doors is 2/3 or about 67%.

Therefore, if you play the game three times and stay, on average you’ll win the car once. But if you play the game three times and switch each time, on average you’ll win the car twice. That’s twice as many cars!
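The six scenarios narrated above can also be written out exhaustively. The short sketch below mirrors that narration (the player always starts with door 1 and the host reveals a goat); names are only illustrative.

    def outcome(car, switch):
        """What the player ends up with when the car is behind the given door."""
        pick = 1
        opened = next(d for d in (2, 3) if d != car)      # host reveals a goat
        if switch:
            pick = next(d for d in (1, 2, 3) if d not in (1, opened))
        return "car" if pick == car else "goat"

    for strategy, switch in (("stay", False), ("switch", True)):
        results = [outcome(car, switch) for car in (1, 2, 3)]
        print(strategy, results, "->", results.count("car"), "win(s) out of 3")
    # stay   ['car', 'goat', 'goat'] -> 1 win(s) out of 3
    # switch ['goat', 'car', 'car'] -> 2 win(s) out of 3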


The candle problem


http://www.ted.com Career analyst Dan Pink examines the puzzle of motivation, starting with a fact that social scientists know but most managers don’t: Traditional rewards aren’t always as effective as we think. Listen for illuminating stories — and maybe, a way forward.

The candle problem or candle task, also known as Duncker’s candle problem, is a cognitive performance test, measuring the influence of functional fixedness on a participant’s problem solving capabilities. The test was created [1] by Gestalt psychologist Karl Duncker and published posthumously in 1945. Duncker originally presented this test in his thesis on problem solving tasks at Clark University.

The test presents the participant with the following task: how to fix a lit candle on a wall (a cork board) in a way so the candle wax won’t drip onto the table below.[3] To do so, one may only use the following along with the candle:

  • a book of matches
  • a box of thumbtacks

The solution is to empty the box of thumbtacks, put the candle into the box, use the thumbtacks to nail the box (with the candle in it) to the wall, and light the candle with the match.[3] The concept of functional fixedness predicts that the participant will only see the box as a device to hold the thumbtacks and not immediately perceive it as a separate and functional component available to be used in solving the task.

Response

Many of the people who attempted the test explored other creative, but less efficient, methods to achieve the goal. For example, some tried to tack the candle to the wall without using the thumbtack box,[4] and others attempted to melt some of the candle’s wax and use it as an adhesive to stick the candle to the wall.[1] Neither method works.[1] However, if the task is presented with the tacks piled next to the box (rather than inside it), virtually all of the participants were shown to achieve the optimal solution, which is self-defined.[4]

The test has been given to numerous people, including M.B.A. students at the Kellogg School of Management in a study investigating whether living abroad and creativity are linked.[5]

Glucksberg

Glucksberg (1962)[6] used a 2 × 2 design manipulating whether the tacks and matches were inside or outside of their boxes and whether subjects were offered cash prizes for completing the task quickly. Subjects who were offered no prize, termed low-drive, were told “We are doing pilot work on various problems in order to decide which will be the best ones to use in an experiment we plan to do later. We would like to obtain norms on the time needed to solve.” The remaining subjects, termed high-drive, were told “Depending on how quickly you solve the problem you can win $5.00 or $20.00. The top 25% of the Ss [subjects] in your group will win $5.00 each; the best will receive $20.00. Time to solve will be the criterion used.” (As a note, adjusting for inflation since 1962, the year the study was published, the amounts in today’s dollars would be approximately $39 and $154, respectively.[7]) The empty-boxes condition was found to be easier than the filled-boxes condition: more subjects solved the problem, and those who did solve the problem solved it faster. Within the filled-boxes condition, high-drive subjects performed worse than low-drive subjects. Glucksberg interpreted this result in terms of “neobehavioristic drive theory”: “high drive prolongs extinction of the dominant habit and thus retards the correct habit from gaining ascendancy”. An explanation in terms of the overjustification effect is made difficult by the lack of a main effect for drive and by a nonsignificant trend in the opposite direction within the empty-boxes condition.

Another way to explain the higher levels of failure in the high-drive condition is that turning the task into a competition for limited resources can create mild stress in the subject, which can lead to the sympathetic nervous system’s fight-or-flight response taking over the brain and body. This stress response effectively shuts down the creative-thinking and problem-solving areas of the prefrontal cortex.

Linguistic implications

E. Tory Higgins and W. M. Chaires found that having subjects repeat the names of common pairs of objects in this test, but in a different and unaccustomed linguistic structure, such as “box and tacks” instead of “box of tacks”, facilitated performance on the candle problem.[3] This phrasing helps one to distinguish the two entities as different and more accessible.[3]

In a written version of the task given to people at Stanford University, Michael C. Frank and language acquisition researcher Michael Ramscar reported that simply underlining certain relevant materials (“on the table there is a candle, a box of tacks, and a book of matches…”) increases the number of candle-problem solvers from 25% to 50%.[4]

References

  1. “Dan Pink on the surprising science of motivation”. Retrieved 16 January 2010.
  2. Daniel Biella and Wolfram Luther. “A Synthesis Model for the Replication of Historical Experiments in Virtual Environments”. 5th European Conference on e-Learning. Academic Conferences Limited. p. 23. ISBN 978-1-905305-30-8.
  3. Richard E. Snow and Marshall J. Farr, ed. (1987). “Positive Affect and Organization”. Aptitude, Learning, and Instruction Volume 3: Conative and Affective Process Analysis. Routledge. ISBN 978-0-89859-721-9.
  4. Frank, Michael. “Against Informational Atomism”. Retrieved 15 January 2010.
  5. “Living Outside the Box: Living abroad boosts creativity”. April 2009. Retrieved 16 January 2010.
  6. Glucksberg, S. (1962). “The influence of strength of drive on functional fixedness and perceptual recognition”. Journal of Experimental Psychology 63: 36–41. doi:10.1037/h0044683. PMID 13899303.
  7. Inflated values automatically calculated.

Cognitive bias

The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people’s innumeracy, or inability to reason intuitively with the greater orders of magnitude. They and their colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. They explained these differences in terms of heuristics: rules which are simple for the brain to compute but introduce systematic errors. An example is the availability heuristic, in which the ease with which something comes to mind is used to indicate how often (or how recently) it has been encountered.

Cognitive bias is any of a wide range of observer effects identified in cognitive science, including very basic statistical and memory errors that are common to all human beings (many of which have been discussed by Amos Tversky and Daniel Kahneman) and drastically skew the reliability of anecdotal and legal evidence. They also significantly affect the scientific method, which is deliberately designed to minimize such bias from any one observer.

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, which may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.[1][2][3] Implicit in the concept of a “pattern of deviation” is a standard of comparison with what is normatively expected; this may be the judgment of people outside those particular situations, or may be a set of independently verifiable facts. A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics.

Some cognitive biases are presumably adaptive, for example, because they lead to more effective actions in a given context or enable faster decisions when timeliness is more valuable than accuracy (heuristics). Others presumably result from a lack of appropriate mental mechanisms (bounded rationality), or simply from a limited capacity for information processing.

Biases can be distinguished on a number of dimensions. For example, there are biases specific to groups (such as the risky shift) as well as biases at the individual level.

Some biases affect decision-making, where the desirability of options has to be considered (e.g., the sunk cost fallacy). Others, such as illusory correlation, affect judgment of how likely something is, or of whether one thing is the cause of another. A distinctive class of biases affect memory,[12] such as consistency bias (remembering one’s past attitudes and behavior as more similar to one’s present attitudes).

Some biases reflect a subject’s motivation,[13] for example, the desire for a positive self-image leading to egocentric bias[14] and the avoidance of unpleasant cognitive dissonance. Other biases are due to the particular way the brain perceives, forms memories and makes judgments. This distinction is sometimes described as “hot cognition” versus “cold cognition”, as motivated reasoning can involve a state of arousal.

Among the “cold” biases, some are due to ignoring relevant information (e.g., neglect of probability), whereas some involve a decision or judgement being affected by irrelevant information (for example, the framing effect, where the same problem receives different responses depending on how it is described) or giving excessive weight to an unimportant but salient feature of the problem (e.g., anchoring).

The fact that some biases reflect motivation, and in particular the motivation to have positive attitudes to oneself,[14] accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias, projection bias). There are also biases in how subjects evaluate in-groups or out-groups; evaluating in-groups as more diverse and “better” in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).

Some cognitive biases belong to the subgroup of attentional biases which refer to the paying of increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure those biases are the Stroop Task[15][16] and the Dot Probe Task.

The following is a list of the more commonly studied cognitive biases:

For other noted biases, see List of cognitive biases.
  • Framing by using a too-narrow approach and description of the situation or issue.
  • Hindsight bias, sometimes called the “I-knew-it-all-along” effect, is the inclination to see past events as being predictable.
  • Fundamental attribution error is the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.
  • Confirmation bias is the tendency to search for or interpret information in a way that confirms one’s preconceptions; this is related to the concept of cognitive dissonance.
  • Self-serving bias is the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.
  • Belief bias is when one’s evaluation of the logical strength of an argument is biased by their belief in the truth or falsity of the conclusion.

A 2012 Psychological Bulletin article suggests that at least 8 seemingly unrelated biases can be produced by the same information-theoretic generative mechanism.[17] It is shown that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce regressive conservatism, Bayesian conservatism, illusory correlations, the better-than-average and worse-than-average effects, the subadditivity effect, exaggerated expectation, overconfidence, and the hard–easy effect.

However, as recent research has demonstrated, even scientists who adhere to the scientific method can’t guarantee they will draw the best possible conclusions. “How could such highly educated and precisely trained professionals veer off the path of objectivity?” The answer is simple: being human.

As the fields of psychology and behavioral economics have demonstrated, homo sapiens is a seemingly irrational species that appears to, more often than not, think and behave in nonsensical rather than commonsensical ways. The reason is that we fall victim to a veritable laundry list of cognitive biases that cause us to engage in distorted, imprecise and incomplete thinking which, not surprisingly, results in “perceptual distortion, inaccurate judgment or illogical interpretation” (thanks Wikipedia), and, by extension, poor and sometimes catastrophic decisions.

Well-known examples of the results of cognitive biases include the Internet, the housing and financial crises of the past decade, truly stupid use of social media by politicians, celebrities and professional athletes, the existence of the $2.5 billion self-help industry, and, well, believing that a change in the controlling party in Washington will somehow change its toxic political culture.

What is interesting is that many of these cognitive biases must have had, at some point in our evolution, adaptive value. These distortions helped us to process information more quickly (e.g., stalking prey in the jungle), meet our most basic needs (e.g., help us find mates) and connect with others (e.g., be a part of a “tribe”).

The biases that helped us survive in primitive times, when life was much simpler (e.g., life goal: live through the day) and the speed of a decision rightfully trumped its absolute accuracy, don’t appear to be quite as adaptive in today’s much more complex world. Due to the complicated nature of life these days, correctness of information, thoroughness of processing, precision of interpretation and soundness of judgment are, in most situations today, far more important than the simplest and fastest route to a judgment.

Unfortunately, there is no magic pill that will inoculate us from these cognitive biases. But we can reduce their power over us by understanding these distortions, looking for them in our own thinking and making an effort to counter their influence over us as we draw conclusions, make choices and come to decisions. In other words, just knowing and considering these universal biases (in truth, what most people call common sense is actually common bias) will make us less likely to fall victim to them.

Here are some of the most widespread cognitive biases that contaminate our ability to use common sense:

  • The bandwagon effect (aka herd mentality) describes the tendency to think or act in certain ways because other people do. Examples include the popularity of Apple products, the use of “in-group” slang and clothing styles, and watching the “Real Housewives of … ” reality-TV franchise.
  • The confirmation bias involves the inclination to seek out information that supports our own preconceived notions. The reality is that most people don’t like to be wrong, so they surround themselves with people and information that confirm their beliefs. The most obvious example these days is the tendency to follow news outlets that reinforce our political beliefs.
  • Illusion of control is the propensity to believe that we have more control over a situation than we actually do. If we don’t actually have control, we fool ourselves into thinking we do. Examples include rally caps in sports and “lucky” items.
  • The Semmelweis reflex (just had to include this one because of its name) is the predisposition to deny new information that challenges our established views. Sort of the yang to the yin of the confirmation bias, it exemplifies the adage “if the facts don’t fit the theory, throw out the facts.” An example is the “Seinfeld” episode in which George Costanza’s girlfriend simply refuses to allow him to break up with her.
  • The causation bias describes the tendency to assume a cause-and-effect relationship in situations where none exists (or where there is merely a correlation or association). An example is believing someone is angry with you because they haven’t responded to your email when, more likely, they are busy and just haven’t gotten to it yet.
  • The overconfidence effect involves unwarranted confidence in one’s own knowledge. Examples include political and sports prognosticators.
  • The false consensus effect is the penchant to believe that others agree with you more than they actually do. Examples include guys who assume that all guys like sexist humor.
  • Finally, the granddaddy of all cognitive biases, the fundamental attribution error, which involves the tendency to attribute other people’s behavior to their personalities and to attribute our own behavior to the situation. An example is when someone treats you poorly, you probably assume they are a jerk, but when you’re not nice to someone, it’s because you are having a bad day.


Many of these biases are studied for how they affect belief formation and business decisions and scientific research.

  • Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, crowd psychology, herd behaviour, and manias.
  • Bias blind spot — the tendency not to compensate for one’s own cognitive biases.
  • Choice-supportive bias — the tendency to remember one’s choices as better than they actually were.
  • Confirmation bias — the tendency to search for or interpret information in a way that confirms one’s preconceptions.
  • Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
  • Déformation professionnelle — the tendency to look at things according to the conventions of one’s own profession, forgetting any broader point of view.
  • Endowment effect — “the fact that people often demand much more to give up an object than they would be willing to pay to acquire it”.[2]
  • Exposure-suspicion bias - a knowledge of a subject’s disease in a medical study may influence the search for causes.
  • Extreme aversion — most people will go to great lengths to avoid extremes. People are more likely to choose an option if it is the intermediate choice.
  • Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
  • Framing - drawing different conclusions from the same information, depending on how that information is presented.
  • Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are.
  • Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
  • Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias — the tendency to seek information even when it cannot affect action.
  • Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
  • Loss aversion — “the disutility of giving up an object is greater than the utility associated with acquiring it”.[3] (see also sunk cost effects and Endowment effect).
  • Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
  • Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
  • Obsequiousness bias - the tendency to systematically alter responses in the direction they perceive desired by the investigator.
  • Omission bias — the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Planning fallacy — the tendency to underestimate task-completion times. Also formulated as Hofstadter’s Law: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.”
  • Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance - the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Selective perception — the tendency for expectations to affect perception.
  • Status quo bias — the tendency for people to like things to stay relatively the same (see also Loss aversion and Endowment effect).[4]
  • Unacceptability bias - questions that may embarrass or invade privacy are refused or evaded.
  • Unit bias — the tendency to want to finish a given unit of a task or an item; this has strong effects on the consumption of food in particular.
  • Von Restorff effect — the tendency for an item that “stands out like a sore thumb” to be more likely to be remembered than other items.
  • Zero-risk bias — the preference for reducing a small risk to zero over a greater reduction in a larger risk. It is relevant e.g. to the allocation of public health resources and the debate about nuclear power.

Many of these biases are often studied for how they affect business and economic decisions and how they affect experimental research.

  • Ambiguity effect — the avoidance of options for which missing information makes the probability seem “unknown”.
  • Anchoring — the tendency to rely too heavily, or “anchor,” on a past reference or on one trait or piece of information when making decisions.
  • Anthropic bias — the tendency for one’s evidence to be biased by observation selection effects.
  • Attentional bias — neglect of relevant data when making judgments of a correlation or association.
  • Availability heuristic — a biased prediction, due to the tendency to focus on the most salient and emotionally-charged outcome.
  • Clustering illusion — the tendency to see patterns where actually none exist.
  • Conjunction fallacy — the tendency to assume that specific conditions are more probable than general ones.
  • Gambler’s fallacy — the tendency to assume that individual random events are influenced by previous random events. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”
  • Hindsight bias — sometimes called the “I-knew-it-all-along” effect: the inclination to see past events as being predictable, based on knowledge of later events.
  • Hostile media effect — the tendency to perceive news coverage as biased against your position on an issue.
  • Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
  • Ludic fallacy — the analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many things.
  • Neglect of prior base rates effect — the tendency to fail to incorporate prior known probabilities which are pertinent to the decision at hand.
  • Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions. It has been found to be linked to the left inferior frontal gyrus, and disrupting this section of the brain removes the bias.
  • Overconfidence effect — the tendency to overestimate one’s own abilities.
  • Positive outcome bias — the tendency, when making predictions, to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias and valence effect).
  • Primacy effect — the tendency to weigh initial events more than subsequent events.
  • Recency effect — the tendency to weigh recent events more than earlier events (see also ‘peak-end rule’).
  • Reminiscence bump — the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
  • Rosy retrospection — the tendency to rate past events more positively than one actually rated them when they occurred.
  • Subadditivity effect — the tendency to judge probability of the whole to be less than the probabilities of the parts.
  • Telescoping effect — the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
  • Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data are collected, making it impossible to test the hypothesis fairly.

Most of these biases are labeled as attributional biases.

  • Actor-observer bias — the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation. This is coupled with the opposite tendency for the self: one’s explanations of one’s own behavior overemphasize the situation and underemphasize the influence of personality. (see also fundamental attribution error).
  • Dunning-Kruger effect — “…when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, …they are left with the mistaken impression that they are doing just fine.”[5] (See also the Lake Wobegon effect, and overconfidence effect).
  • Egocentric bias — occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
  • Forer effect (aka Barnum Effect) — the tendency for people to give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
  • False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
  • Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
  • Halo effect — the tendency for a person’s positive or negative traits to “spill over” from one area of their personality to another in others’ perceptions of them (see also physical attractiveness stereotype).
  • Herd instinct – a common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
  • Illusion of asymmetric insight — people perceive their knowledge of their peers to surpass their peers’ knowledge of them.
  • Illusion of transparency — people overestimate others’ ability to know them, and they also overestimate their ability to know others.
  • Ingroup bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
  • Just-world phenomenon — the tendency for people to believe that the world is “just” and therefore people “get what they deserve.”
  • Lake Wobegon effect — the human tendency to report flattering beliefs about oneself and believe that one is above average (see also worse-than-average effect, and overconfidence effect).
  • Notational bias — a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
  • Outgroup homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
  • Projection bias — the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
  • Self-serving bias — the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
  • Self-fulfilling prophecy — the tendency to engage in behaviors that elicit results which will (consciously or subconsciously) confirm our beliefs.
  • System justification — the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest.
  • Trait ascription bias — the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
  • Beneffectance - perceiving oneself as responsible for desirable outcomes but not responsible for undesirable ones. (Term coined by Greenwald (1980))
  • Consistency bias - incorrectly remembering one’s past attitudes and behaviour as resembling present attitudes and behaviour.
  • Cryptomnesia - a form of misattribution where a memory is mistaken for imagination.
  • Egocentric bias - recalling the past in a self-serving manner, e.g. remembering one’s exam grades as being better than they were, or remembering a caught fish as being bigger than it was.
  • Confabulation or false memory - remembering something that never actually happened.
  • Hindsight bias - filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the ‘I-knew-it-all-along effect’.
  • Selective Memory and selective reporting
  • Suggestibility - a form of misattribution where ideas suggested by a questioner are mistaken for memory. Often a key aspect of hypnotherapy.

 


Unconscious Decision Making


Published on Jul 26, 2012 Instinct is the driving force behind human decision making. Irrationality must be recognized if we’re going to get beyond the risks of not being built as thinking machines, says David Ropeik. David P. Ropeik is an international consultant, author, teacher, and speaker on risk perception and risk communication.[1] He is also creator and director of Improving Media Coverage of Risk, a training program for journalists. He is a regular contributor to Big Think,[2] Psychology Today,[3] Cognoscenti,[4] and the Huffington Post.[5] http://bigthink.com


Published on Nov 26, 2012 Animation describing the Universal Principles of Persuasion based on the research of Dr. Robert Cialdini, Professor Emeritus of Psychology and Marketing, Arizona State University. Dr. Robert Cialdini & Steve Martin are co-authors (together with Dr. Noah Goldstein) of the New York Times, Wall Street Journal and Business Week International Bestseller Yes! 50 Scientifically Proven Ways to be Persuasive. US Amazon http://tinyurl.com/afbam9g UK Amazon http://tinyurl.com/adxrp6c IAW USA: http://www.influenceatwork.com IAW UK: http://www.influenceatwork.co.uk/


Nobel Prize winning neuropsychiatrist Eric Kandel describes new research which hints at the possibility of a biological basis to the unconscious mind. Directed / Produced by Elizabeth Rodd and Jonathan Fowler

Eric Richard Kandel (born November 7, 1929) is an American neuropsychiatrist. He was a recipient of the 2000 Nobel Prize in Physiology or Medicine for his research on the physiological basis of memory storage in neurons. He shared the prize with Arvid Carlsson and Paul Greengard.

Kandel, who had studied psychoanalysis, wanted to understand how memory works. His mentor, Harry Grundfest, said, “If you want to understand the brain you’re going to have to take a reductionist approach, one cell at a time.” So Kandel studied the neural system of the sea slug Aplysia californica, which has large nerve cells amenable to experimental manipulation and is a member of the simplest group of animals known to be capable of learning.[1]

Starting in 1966 James Schwartz collaborated with Kandel on a biochemical analysis of changes in neurons associated with learning and memory storage. By this time it was known that long-term memory, unlike short-term memory, involved the synthesis of new proteins. By 1972 they had evidence that the second messenger molecule cyclic AMP (cAMP) was produced in Aplysia ganglia under conditions that cause short-term memory formation (sensitization). In 1974 Kandel moved his lab to Columbia University and became founding director of the Center for Neurobiology and Behavior. It was soon found that the neurotransmitter serotonin, acting to produce the second messenger cAMP, is involved in the molecular basis of sensitization of the gill-withdrawal reflex. By 1980, collaboration with Paul Greengard had demonstrated that cAMP-dependent protein kinase, also known as protein kinase A (PKA), acted in this biochemical pathway in response to elevated levels of cAMP. Steven Siegelbaum identified a potassium channel that could be regulated by PKA, coupling serotonin’s effects to altered synaptic electrophysiology. In 1983 Kandel helped form the Howard Hughes Medical Institute at Columbia devoted to molecular neural science.

The Kandel lab then sought to identify proteins that had to be synthesized to convert short-term memories into long-lasting memories. One of the nuclear targets for PKA is the transcriptional control protein CREB (cAMP response element binding protein). In collaboration with David Glanzman and Craig Bailey, Kandel identified CREB as a protein involved in long-term memory storage. One result of CREB activation is an increase in the number of synaptic connections. Thus, short-term memory had been linked to functional changes in existing synapses, while long-term memory was associated with a change in the number of synaptic connections. Some of the synaptic changes observed by Kandel’s laboratory provide examples of Hebbian learning; one article describes the role of Hebbian learning in the Aplysia siphon-withdrawal reflex.[4] The Kandel lab has also performed important experiments using transgenic mice as a system for investigating the molecular basis of memory storage in the vertebrate hippocampus.[5][6][7] Kandel’s original idea that learning mechanisms would be conserved across all animals has been confirmed: neurotransmitters, second messenger systems, protein kinases, ion channels, and transcription factors like CREB have been shown to function in both vertebrate and invertebrate learning and memory storage.[8][9]

Kandel is a professor of biochemistry and biophysics at the College of Physicians and Surgeons at Columbia University. He is a Senior Investigator in the Howard Hughes Medical Institute. He was also the founding director of the Center for Neurobiology and Behavior, which is now the Department of Neuroscience at Columbia University. Kandel’s popularized account chronicling his life and research, In Search of Memory: The Emergence of a New Science of Mind,[2] was awarded the 2006 Los Angeles Times Book Award for Science and Technology.


Learn every gesture and body language cue in one video. Eye, hand, leg, arm, and mouth gestures are completely covered. Gestures and Body Language Series Be an expert in body language. Applies to his and her body language. Article is here http://bit.ly/apSipQ


The Campaign to Mislead the Public on Climate Change

Wyoming just became the first state in the nation to reject world-class science standards that teach our kids about climate change. Interest groups that pushed for this move would like to bar climate science from being taught in other states too.
Wyoming legislators voted to support a last-minute budget amendment that prohibits the Wyoming State Board of Education from spending funds to even consider the Next Generation Science Standards (NGSS) because students would learn about climate change in this fossil fuel-dependent state.
Wyoming State Rep. Matt Teeters, who authored the anti-science budget amendment, told the Casper Star-Tribune that teaching climate change as fact would “wreck Wyoming’s economy…and cause other unwanted political ramifications.” Governor Mead signed the anti-science provision into law.
The Board of Education could push back against political meddling and assert its authority over setting science standards, but it will take an outcry from parents and science supporters from Wyoming and throughout the country.
It is a dangerous precedent to allow those who deny climate science for ideological or economic reasons to censor the science education kids need to be ready for college, career and a changing climate. Science education standards should be written by scientists and educators, not by legislators whose concerns may be more political than educational.
Ten states have adopted the NGSS, a set of science standards for K-12 students developed by an arm of the National Academy of Sciences. But Wyoming could be the beginning of an assault on climate education in the states still considering these 21st century science standards.
Wyoming students deserve access to high quality, world-class science standards just as much as students anywhere else in the nation. Given the stakes for our children’s future, it’s imperative that kids everywhere learn the facts of climate change so they can all become part of the solution.
Thank you for taking action for our kids and grandkids!
Sincerely,
Marguerite Herman, Advocate for Science Education, Climate Parents Member
Cheyenne, Wyoming


Climate Denialists Spending Billions in untraceable Dark Money to fool the Public

By Lauren McCauley | Dec. 24, 2013

The expansive misinformation campaign behind climate change denial is increasingly being funded in the dark, reveals a new report published Friday in the journal Climatic Change.

According to the study titled “Institutionalizing Delay: foundation funding and the creation of U.S. climate change counter-movement organizations,” while the largest and most consistent funders of climate change denial are a number of well-known conservative foundations and industry groups, the majority of donations come from “dark money,” or concealed funding.

Delving into what he calls the climate change counter-movement, or CCCM, report author and Drexel University environmental sociologist Robert Brulle uncovers the various players behind the powerful global warming misinformation campaign.

“It is not just a couple of rogue individuals doing this,” Brulle told the Guardian. “This is a large-scale political effort.”
“Powerful funders are supporting the campaign to deny scientific findings about global warming and raise public doubts about the roots and remedies of this massive global threat,” Brulle adds. “At the very least, American voters deserve to know who is behind these efforts.”
In an earlier interview with PBS’s Frontline, Brulle compares the tactical spending of the climate change counter movement to that of pro-environment groups:

What’s interesting is that in comparison to the environmental movement, it actually doesn’t have as much money. The environmental movement actually has more funding, but it’s the nature of the spending that makes the difference.

When you look at what the environmental movement spends its money on, it actually tries to spend its money on developing solutions to climate change, such as developing a solar panel industry in China, making sure everybody in India has an appropriate solar oven to reduce CO2 emissions, things like that. And they spend hardly anything on political or cultural processes. The climate change countermovement spends all of its money there.

So you end up with this great difference between the two movements. As one movement is actually out there trying to develop technological solutions on the ground, the other is engaged in political action to delay any kind of action…(Source: Institutionalizing Delay)

_____________________
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 License.

Mirrored from Commondreams.org


Sheldon Whitehouse
US senator for Rhode Island

Time to Wake Up: The Campaign to Mislead the Public on Climate Change
As delivered on the Senate floor
Tuesday, April 16, 2013

Thank you, Madam President. We are gathered here in the Senate in the somber shadow of the events in Boston at the marathon. And I guess I will start by conveying my sympathies to the individuals and their families who were killed or hurt in that terrible act. I share the determination of so many people that our law enforcement folks will indeed get to the bottom of this, and will get the resources they need, and we will have answers and justice for the families that are affected.

I rise again though, on the subject I come to the floor every week we are in session to discuss, which is the need for this body to wake up to the reality of the clear scientific consensus that human activity is driving serious changes in our climate and oceans.

For more than two decades, the fossil fuel companies and certain right-wing extremists have cooked up a well-organized campaign to call into question the scientific evidence of climate change. The paid-for deniers then manufacture an interesting product, they manufacture uncertainty so the polluters who are doing the paying can also keep polluting, because a sufficient atmosphere of uncertainty has been created to inhibit progress.

This is not a new strategy. We have seen this played before. Industries eager to drown out scientific evidence to maximize profit is not a new story. They questioned the merits of requiring seatbelts in automobiles. They questioned the toxic effects of lead exposure. And they questioned whether tobacco was really bad for people.

Well, they were wrong then and they’re wrong now about climate. Interestingly, they do not actually care. It is not their purpose to be accurate. They just want to create doubt; to sow enough of a question to stop progress. So these sophisticated campaigns are launched to give the public the false impression that there actually is a real scientific debate over climate change. And here in the Senate, regrettably, some of my colleagues even promote this view.

But, let’s be practical here. Which is the more likely case?

[Show Chart 1]

Are a handful of nonprofit environmental groups using their limited funding to pay off literally hundreds and hundreds of climate scientists in an internationally coordinated hoax to falsify complicated climate research? Really?

[Show Chart 2: What Is More Likely?]

Or is it more likely that fossil-fuel corporations are using a slice of their immense profits to float front groups to protect their immense profits?

Well, the answer to that question is obvious I think, just from the logic. But we don’t have to apply logic. We can follow the money and look at evidence.

According to an analysis by the Checks and Balances Project, a self-described pro-clean-energy government and industry watchdog group, from 2006 to 2010, four sources of fossil-fuel money, just 4 of them, contributed more than $16 million to a group of conservative think tanks that go about the business of being publicly critical of climate science and clean energy. Those four sources are the Charles G. Koch Foundation, the Claude R. Lambe Charitable Foundation, the Earhart Foundation, and oil giant ExxonMobil. On the receiving end is a lengthy roster of well-known and often-cited rightward-leaning outfits: The top 10 (we’ll just talk about the top 10 in this set of remarks):

American Enterprise Institute
Cato Institute
Competitive Enterprise Institute
Heartland Institute
Heritage Foundation
Hudson Institute
Institute for Energy Research
George C. Marshall Institute
Manhattan Institute
Mercatus Center

Who’s giving? Well, Charles Koch is chairman and CEO of Koch Industries and he is the 6th-richest person on the planet. Koch Industries is the second-largest privately held company in the United States. Koch companies include the Koch Pipeline Company, and Flint Hills Resources, which operates refineries with a combined crude oil processing capacity of more than 292 million barrels per year. That much oil accounts for 126 million metric tons of carbon pollution each year—as much as 35 coal-fired power plants produce or 26 million cars. So to put it mildly, this fellow has got some skin in the game.

Between 2006 and 2010, the Charles G. Koch Foundation gave almost $8 million to think tanks and institutes, including $7.6 million to the Mercatus Center, and $100,000 to the American Enterprise Institute.

Charles Koch, along with his brother David, also established the Claude R. Lambe Charitable Foundation – those two have the same source – and they direct that foundation’s giving as well. This foundation provided almost $5 million to climate denying think tanks and institutes, including over $1 million to the Cato Institute, and more than $2 million to the Heritage Foundation.

The Earhart Foundation was started by Henry Boyd Earhart using funds from his oil business, White Star Refining Company—now a part of, you guessed it, ExxonMobil. The Earhart Foundation has donated almost $1.5 million to climate denier groups: $370,000 to the American Enterprise Institute, $330,000 to the Cato Institute, and another $195,000 for the George C. Marshall Institute.

That leaves us, of course, ExxonMobil itself, which is the second largest corporation in the world, and among the most profitable. Ranked number one among Fortune 500 companies, its total revenues reached nearly half a trillion dollars in 2012, and their profits were nearly $45 billion. ExxonMobil produces over 6 million barrels of oil per day at its 36 refineries in 20 countries, so it’s the world’s largest oil producer. From 2006 to 2010, the petroleum giant gave institutes more than $2.3 million: $1.2 million for the American Enterprise Institute, $220,000 for the Heritage Foundation, $160,000 for the Institute for Energy Research, and $115,000 for the Heartland Institute.

So what did the Charles G. Koch Foundation, the Claude R. Lambe Charitable Foundation, the Earhart Foundation, and ExxonMobil get for all of that so-called charitable giving? Well, the Checks and Balances Project found that from 2007 to 2011, these ten organizations that I cited, the top 10, were quoted, cited, or had articles published over 1000 times, over 1000 times in 60 mainstream newspapers and print publications, and invariably they were promoting fossil fuels, undermining renewable energy, or attacking environmental policies.

That’s good investing. Spend millions of dollars on a handful of think tanks to protect billions of dollars in profits; really a thousand-to-one return. But here’s the problem: the public is unaware of the connection, usually. Only a handful of these attacks were accompanied by any explanation by the media that the fossil-fuel industry was involved in them.

Here’s one prime example. Last summer, when the Navy displayed its Great Green Fleet, a carrier strike group that runs on a 50-50 blend of biodiesel and petroleum, Institute for Energy Research president Thomas Pyle wrote a column for U.S. News and World Report, calling that initiative “ridiculous,” and “a costly and pointless exercise.”

Never mind that our defense and intelligence communities have repeatedly warned of the threats posed by climate change to national security and international stability, and of their own need to secure a reliable and secure fuel supply. What’s misleading here is that U.S. News and World Report in publishing that article attributed the column simply thus: “Thomas Pyle is the president of the Institute for Energy Research,” with no mention that the Institute for Energy Research is a front for big donors like the Claude R. Lambe Charitable Foundation and ExxonMobil.

This is one example of this misleading practice in the media.

[Show Media Description Chart]

More than half of the time, media outlets do nothing more than state the name of the publishing organization, like Thomas Pyle and the Institute for Energy Research, or they may add a functional description, like “think-tank” or “non-partisan group.”

The instances when the publication described the basic ideology of the group, for example as a “free-market” or “conservative” think tank, amount to less than a third.

In all of the media outlets reviewed between 2007 and 2011, the financial ties between the authors and the fossil-fuel industry were mentioned a mere 6 percent of the time. 94% of the time, the fossil fuel industry funders got away with it. This chart shows some examples:

[Show Anonymity Chart]

The Washington Post ignored the financial connection 88 percent of the time;
POLITICO – ignored the financial connection 95 percent of the time;
Christian Science Monitor – ignored it every time;
USA Today – ignored it 98 percent of the time;
The New York Times – ignored it 90 percent.

So the scam of laundering money through independent-sounding organizations works. The media lets it work. The vast majority of scientists agree that global warming is occurring, but a recent Gallup poll revealed that only 62 percent of Americans believe the vast majority of scientists agree global warming is occurring.

Well over 90 percent of climate scientists agree that climate change is happening and that humans are the main cause. The only uncertainty is about how bad it is going to be. And the leading research predicts warmer air and seas, rising sea levels, stronger storms, and more acidic oceans.

Most major players in the private sector actually get it. While the big fossil fuel polluters try to confuse the public to boost their bottom line and prolong their pollution, hundreds of leading corporations understand that climate change ultimately undermines our entire economy. Let me think of, let me mention some of the examples: Ford; Coca-Cola; GE; Walmart; the insurance giant Munich Re; Alcoa, the great aluminum maker; Maersk; Procter and Gamble; FedEx; and the so-called BICEP Group which includes eBay, Intel, Starbucks, Adidas, and Nike. So this notion that this is a hoax, that there is doubt, is belied by some of the most respected names in the private sector. And those companies join the National Academies, NASA, the U.S. Department of Defense, the Government Accountability Office, the American Public Health Association, and yes, the United States Conference of Catholic Bishops, as well as the majority of Americans, in understanding that it is time to wake up, to end the faux controversy that has been cooked up by the fossil fuel industry, and to do the work here in Congress that needs to be done to protect Americans from the harms of carbon pollution.

I thank the presiding officer and I yield the floor.


Science and Faith


Uploaded on Jun 14, 2011

Some scientists see religion as a threat to the scientific method that should be resisted. But faith “is really asking a different set of questions,” says Collins.

Question: Why is it so difficult for scientists to believe in a higher power?

Francis Collins: Science is about trying to get rigorous answers to questions about how nature works. And it’s a very important process that’s actually quite reliable if carried out correctly, with generation of hypotheses and testing of those by accumulation of data and then drawing conclusions that are continually revisited to be sure they are right. So if you want to answer questions about how nature works, how biology works, for instance, science is the way to get there. Scientists believe in that, and they are very troubled by a suggestion that other kinds of approaches can be taken to derive truth about nature. And some, I think, have seen faith as therefore a threat to the scientific method and therefore something to be resisted.

But faith, in its perspective, is really asking a different set of questions. And that’s why I don’t think there needs to be a conflict here. The kinds of questions that faith can help one address are more in the philosophical realm. Why are we all here? Why is there something instead of nothing? Is there a God? Isn’t it clear that those aren’t scientific questions and that science doesn’t have much to say about them? But you either have to say, well, those are inappropriate questions and we can’t discuss them, or you have to say, we need something besides science to pursue some of the things that humans are curious about. For me, that makes perfect sense. But I think for many scientists, particularly for those who have seen the shrill pronouncements from extreme views that threaten what they’re doing scientifically and feel therefore they can’t really include those thoughts into their own worldview, faith can be seen as an enemy. And similarly, on the other side, some of my scientific colleagues who are of an atheist persuasion are sometimes using science as a club over the head of believers, basically suggesting that anything that can’t be reduced to a scientific question isn’t important and just represents superstition that should be gotten rid of. Part of the problem is, I think, that the extremists have occupied the stage. Those voices are the ones we hear. I think most people are actually kind of comfortable with the idea that science is a reliable way to learn about nature, but it’s not the whole story and there’s a place also for religion, for faith, for theology, for philosophy. But that harmony perspective does not get as much attention; nobody’s as interested in harmony as they are in conflict, I’m afraid.

Question: How has your study of genetics influenced your faith?

Francis Collins: My study of genetics certainly tells me, incontrovertibly, that Darwin was right about the nature of how living things have arrived on the scene, by descent from a common ancestor under the influence of natural selection over very long periods of time. Darwin was amazingly insightful given how limited the molecular information he had was; essentially it didn’t exist. And now with the digital code of the DNA, we have the best possible proof of Darwin’s theory that he could have imagined. So that certainly tells me something about the nature of living things. But it actually adds to my sense that this is an answer to a “how?” question, and it leaves the “why?” question still hanging in the air. Other aspects of our universe I think also, for me as for Einstein, raise questions about the possibility of intelligence behind all of this.

Why is it, for instance, that the constants that determine the behavior of matter and energy, like the gravitational constant, have precisely the values they need in order for there to be any complexity at all in the Universe? That is fairly breathtaking in its lack of probability of ever having happened. And it does make you think that a mind might have been involved in setting the stage. At the same time, that does not imply necessarily that that mind is controlling the specific manipulations of things that are going on in the natural world. In fact, I would very much resist that idea. I think the laws of nature potentially could be the product of a mind. I think that’s a defensible perspective. But once those laws are in place, then I think nature goes on and science has the chance to be able to perceive how that works and what its consequences are.
Recorded September 13, 2010
Interviewed by David Hirschman


with luck, balls are better than brains


Christopher Columbus (c. 31 October 1451 – 20 May 1506) was an explorer, colonizer, and navigator, born in the Republic of Genoa, in what is today northwestern Italy.[2][3][4][5] Under the auspices of the Catholic Monarchs of Spain, he completed four voyages across the Atlantic Ocean that led to general European awareness of the American continents in the Western Hemisphere. Those voyages, and his efforts to establish permanent settlements in the island of Hispaniola, initiated the process of Spanish colonization, which foreshadowed the general European colonization of the “New World“.

In the context of emerging western imperialism and economic competition between European kingdoms seeking wealth through the establishment of trade routes and colonies, Columbus’ far-fetched proposal to reach the East Indies by sailing westward received the support of the Spanish crown, which saw in it a promise, however remote, of gaining the upper hand over rival powers in the contest for the lucrative spice trade with Asia. During his first voyage in 1492, instead of reaching Japan as he had intended, Columbus landed in the Bahamas archipelago, at a locale he named San Salvador. Over the course of three more voyages, Columbus visited the Greater and Lesser Antilles, as well as the Caribbean coast of Colombia, Venezuela and Central America, claiming them for the Spanish Empire.

Never admitting that he had reached a continent previously unknown to Europeans, rather than the East Indies he had set out for, Columbus called the inhabitants of the lands he visited indios (Spanish for “Indians“).[7][8][9] Columbus’ strained relationship with the Spanish crown and its appointed colonial administrators in America led to his arrest and dismissal as governor of the settlements in Hispaniola in 1500, and later to protracted litigation over the benefits which Columbus and his heirs claimed were owed to them by the crown.

Washington Irving‘s 1828 biography of Columbus popularized the idea that Columbus had difficulty obtaining support for his plan because many Catholic theologians insisted that the Earth was flat.[25] In fact, most educated Westerners had understood that the Earth was spherical at least since the time of Aristotle, who lived in the 4th century BC and whose works were widely studied and revered in Medieval Europe.[26] The sphericity of the Earth is also accounted for in the work of Ptolemy, on which ancient astronomy was largely based. Christian writers whose works clearly reflect the conviction that the Earth is spherical include Saint Bede the Venerable in his Reckoning of Time, written around AD 723. In Columbus’ time, the techniques of celestial navigation, which use the position of the Sun and the Stars in the sky, together with the understanding that the Earth is a sphere, were widely used by mariners.

Where Columbus did differ from the view accepted by scholars in his day was in his estimate of the westward distance from Europe to Asia. Columbus’ ideas in this regard were based on three factors: his low estimate of the size of the Earth, his high estimate of the size of the Eurasian landmass, and his belief that Japan and other inhabited islands lay far to the east of the coast of China. In all three of these issues Columbus was both wrong and at odds with the scholarly consensus of his day.

As far back as the 3rd century BC, Eratosthenes had correctly computed the circumference of the Earth by using simple geometry and studying the shadows cast by objects at two different locations: Alexandria and Syene (modern-day Aswan).[27] Eratosthenes’s results were confirmed by a comparison of stellar observations at Alexandria and Rhodes, carried out by Posidonius in the 1st century BC. These measurements were widely known among scholars, but confusion about the old-fashioned units of distance in which they were expressed had led, in Columbus’s day, to some debate about the exact size of the Earth.
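
As a rough illustration of the geometry (the numbers below are the commonly cited approximate figures for Eratosthenes’ measurement, not values taken from this article), the whole calculation is a single proportion: the Sun’s shadow angle at Alexandria, taken as a fraction of a full circle, scales up the distance between the two cities.

    # Minimal sketch of Eratosthenes' method, with commonly cited approximate figures
    # (illustrative values; the exact length of the ancient stadion is still debated).
    shadow_angle_deg = 7.2       # Sun's angle from vertical at Alexandria at noon on the solstice
    distance_stadia = 5000       # assumed Alexandria-to-Syene distance along the meridian
    circumference_stadia = distance_stadia * (360 / shadow_angle_deg)
    print(circumference_stadia)  # 250000.0, i.e. the distance scaled by 360/7.2 = 50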



Sources (taken from The Oatmeal via Anacephalaeosis):

A People’s History of the United States, by Howard Zinn, and Lies My Teacher Told Me, by James W. Loewen.


The Drake Equation


The Flake Equation

ESTIMATING THE NUMBER OF PEOPLE WHO HAVE
EXPERIENCED THE PARANORMAL OR SUPERNATURAL

Michael Shermer


The Drake Equation is the famous formula developed by the astronomer Frank Drake for estimating the number of extraterrestrial civilizations:

N = R × fp × ne × fl × fi × fc × L where…

  • N = the number of communicative civilizations,
  • R = the rate of formation of suitable stars,
  • fp = the fraction of those stars with planets,
  • ne = the number of earth-like planets per solar system,
  • fl = the fraction of planets with life,
  • fi = the fraction of planets with intelligent life,
  • fc = the fraction of planets with communicating technology, and
  • L = the lifetime of communicating civilizations.

The equation is so ubiquitous that it has even been employed in the popular television series The Big Bang Theory for computing the number of available sex partners within a 40-mile radius of Los Angeles (5,812). My favorite parody of it is by the cartoonist Randall Munroe as one in a series of his clever science send-ups, entitled “The Flake Equation” (on xkcd.com) for calculating the number of people who will mistakenly think they had an ET encounter.

Such multiplicative equations for calculating the product of an increasingly restrictive series of fractional values are effective tools for making back-of-the-envelope calculations to solve problems for which we do not have precise data. To that end I thought it a useful addition to the Skeptic toolbox to create a Flake Equation for all paranormal and supernatural experiences (and in the Flake Equation I’m interested not in beliefs but in actual experiences that people report and that we hear about, because this becomes the foundation of paranormal and supernatural beliefs):

N = Pw × fp × fm × ft × nt × no × fm where…

  • N = Number of people we hear about who report having experienced a paranormal or supernatural phenomenon,
  • Pw = Population of the United States (January 1, 2012: 312,938,813),
  • fp = Fraction of people who report having had an anomalous psychological experience or witnessed an unusual physical phenomenon (1/5),
  • fm = Fraction of people who interpret such experiences and phenomena as paranormal or supernatural (1/5),
  • ft = Fraction of people who tell someone about their experience (1/10),
  • nt = Number of people they tell (15),
  • no = Number of other people told the story by original hearers (15), and
  • fm = Fraction of such stories reported in the media or on Internet blogs, tweets, and forums (1/10).

N = 28,164,493, or about 9 percent of the U.S. population.
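
As a sanity check on the arithmetic, the sketch below multiplies the factors out with the values listed above. Note that the printed formula reuses the symbol fm for two different fractions (interpretation and media reporting), so the code gives them distinct names.

```python
# Flake Equation, evaluated with the values given in the article.
# The printed formula uses "fm" twice for two different factors; distinct
# variable names are used here to keep them apart.

Pw           = 312_938_813  # U.S. population, January 1, 2012
fp           = 1 / 5        # had an anomalous experience or saw an unusual phenomenon
fm_interpret = 1 / 5        # interpreted it as paranormal or supernatural
ft           = 1 / 10       # told someone about it
nt           = 15           # number of people they tell
no           = 15           # number of people each hearer tells in turn
fm_reported  = 1 / 10       # fraction of such stories reaching media, blogs, etc.

N = Pw * fp * fm_interpret * ft * nt * no * fm_reported
print(round(N))                  # 28164493
print(round(100 * N / Pw, 1))    # 9.0 -- about 9 percent of the population
```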

To compute this figure I used the 2005/2007 Baylor Religion Survey, which reports that

  • 23.2% say that they have “witnessed a miraculous, physical healing,”
  • 16.3% “received a miraculous, physical healing,”
  • 27.5% “witnessed people speaking in tongues at a place of worship,”
  • 7.7% “spoke or prayed in tongues,”
  • 54.5% experienced being “protected from harm by a guardian angel,”
  • 5.9% “personally had a vision of a religious figure while awake,”
  • 19.1% “heard the voice of God speaking to me,”
  • 26.1% “had a dream of religious significance,”
  • 52% “had an experience where you felt that you were filled with the spirit,”
  • 22.1% “felt at one with the universe,”
  • 25.7% “had a religious conversion experience,”
  • 13.8% “had an experience where you felt that you were in a state of religious ecstasy,”
  • 14.2% “had an experience where you felt that you left your body for a period of time,”
  • 40.4% “had a dream that later came true,” and
  • 16.7% “witnessed an object in the sky that you could not identify (UFO).”

This works out to an average of 24.4 percent, thereby justifying my conservative 20 percent figure for fp and fm. The other numbers I gleaned from research on gossip and social networks, conservatively estimating that 10 percent of people will tell someone about their unusual experience, and that within their average social network of 150 people they will tell at least 10 percent of them (15) who in turn will pass on the story to 10 percent of their social network of 150 (15). Finally, I estimate that 10 percent of such stories will be reported in the media or recounted in blogs, tweets, forums, and the like.
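
The quoted average is simply the unweighted mean of the fifteen survey percentages listed above; the sketch below checks that arithmetic.

```python
# Unweighted mean of the fifteen Baylor Religion Survey percentages listed above.
percentages = [23.2, 16.3, 27.5, 7.7, 54.5, 5.9, 19.1, 26.1,
               52.0, 22.1, 25.7, 13.8, 14.2, 40.4, 16.7]

mean = sum(percentages) / len(percentages)
print(round(mean, 1))  # ~24.3 with the figures as listed here, close to the 24.4
                       # reported above and well over the 20 percent used for fp and fm
```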

Of course the final figure for N will vary considerably depending on what numbers are plugged into the equation, but the result will almost always be a number in the tens of millions, which goes a long way toward explaining why belief in the paranormal and supernatural is so ubiquitous. Experiencing is believing!


History of the Universe



Uploaded on Apr 11, 2011

Backed by stunning illustrations, David Christian narrates a complete history of the universe, from the Big Bang to the Internet, in a riveting 18 minutes. This is “Big History”: an enlightening, wide-angle look at complexity, life and humanity, set against our slim share of the cosmic timeline.