Dopamine

Dopamine is an organic chemical of the catecholamine and phenethylamine families that plays several important roles in the brain and body. Its name is derived from its chemical structure: it is an amine synthesized by removing a carboxyl group from a molecule of its precursor chemical L-DOPA, which is synthesized in the brain and kidneys. Dopamine is also synthesized in plants and most multicellular animals.

In the brain, dopamine functions as a neurotransmitter—a chemical released by neurons (nerve cells) to send signals to other nerve cells. The brain includes several distinct dopamine pathways, one of which plays a major role in reward-motivated behavior. Most types of reward increase the level of dopamine in the brain, and most addictive drugs increase dopamine neuronal activity. Other brain dopamine pathways are involved in motor control and in controlling the release of various hormones. These pathways and cell groups form a dopamine system which is neuromodulatory.

Outside the central nervous system, dopamine functions in several parts of the peripheral nervous system as a local chemical messenger. In blood vessels, it inhibits norepinephrine release and acts as a vasodilator (at normal concentrations); in the kidneys, it increases sodium excretion and urine output; in the pancreas, it reduces insulin production; in the digestive system, it reduces gastrointestinal motility and protects intestinal mucosa; and in the immune system, it reduces the activity of lymphocytes. With the exception of the blood vessels, dopamine in each of these peripheral systems is synthesized locally and exerts its effects near the cells that release it.

Several important diseases of the nervous system are associated with dysfunctions of the dopamine system, and some of the key medications used to treat them work by altering the effects of dopamine. Parkinson’s disease, a degenerative condition causing tremor and motor impairment, is caused by a loss of dopamine-secreting neurons in an area of the midbrain called the substantia nigra. Its metabolic precursor L-DOPA can be manufactured; in its pure form, marketed as Levodopa, it is the most widely used treatment for the condition. There is evidence that schizophrenia involves altered levels of dopamine activity, and most antipsychotic drugs used to treat it are dopamine antagonists which reduce dopamine activity.[2] Similar dopamine antagonist drugs are also some of the most effective anti-nausea agents. Restless legs syndrome and attention deficit hyperactivity disorder (ADHD) are associated with decreased dopamine activity.[3] Dopaminergic stimulants can be addictive in high doses, but some are used at lower doses to treat ADHD. Dopamine itself is available as a manufactured medication for intravenous injection: although it cannot reach the brain from the bloodstream, its peripheral effects make it useful in the treatment of heart failure or shock, especially in newborn babies.

Dopamine exerts its effects by binding to and activating cell surface receptors.[8] In mammals, five subtypes of dopamine receptors have been identified, labeled from D1 to D5.[8] All of them function as metabotropic, G protein-coupled receptors, meaning that they exert their effects via a complex second messenger system.[17] These receptors can be divided into two families, known as D1-like and D2-like.[8] For receptors located on neurons in the nervous system, the ultimate effect of D1-like activation (D1 and D5) can be excitation (via opening of sodium channels) or inhibition (via opening of potassium channels); the ultimate effect of D2-like activation (D2, D3, and D4) is usually inhibition of the target neuron.[17] Consequently, it is incorrect to describe dopamine itself as either excitatory or inhibitory: its effect on a target neuron depends on which types of receptors are present on the membrane of that neuron and on the internal responses of that neuron to the second messenger cAMP.[17] D1 receptors are the most numerous dopamine receptors in the human nervous system; D2 receptors are next; D3, D4, and D5 receptors are present at significantly lower levels.[17]
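
As a rough illustration of the receptor taxonomy above, the sketch below (in Python, purely for exposition) encodes the five mammalian subtypes, their D1-like or D2-like family membership, and the usual direction of their effect on the cAMP second messenger; the cAMP column reflects the standard generalization that D1-like receptors stimulate and D2-like receptors inhibit adenylyl cyclase, which the paragraph above only implies. The names and structure are an illustrative assumption, not a pharmacological model.

```python
# Minimal sketch: the five mammalian dopamine receptor subtypes grouped into the
# D1-like and D2-like families described in the text. The cAMP direction reflects
# the usual generalization (D1-like raise cAMP, D2-like lower it); the actual
# effect on a neuron depends on which receptors it expresses and how it responds
# to the second messenger.

RECEPTOR_FAMILIES = {
    "D1": {"family": "D1-like", "camp_effect": "increase"},
    "D5": {"family": "D1-like", "camp_effect": "increase"},
    "D2": {"family": "D2-like", "camp_effect": "decrease"},
    "D3": {"family": "D2-like", "camp_effect": "decrease"},
    "D4": {"family": "D2-like", "camp_effect": "decrease"},
}

def likely_postsynaptic_effect(subtype: str) -> str:
    """Return the typical net effect named in the text for a receptor family."""
    family = RECEPTOR_FAMILIES[subtype]["family"]
    if family == "D1-like":
        return "excitation or inhibition, depending on which ion channels open"
    return "usually inhibition of the target neuron"

if __name__ == "__main__":
    for subtype in ("D1", "D2", "D3", "D4", "D5"):
        info = RECEPTOR_FAMILIES[subtype]
        print(subtype, info["family"], "cAMP:", info["camp_effect"],
              "->", likely_postsynaptic_effect(subtype))
```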

Inside the brain, dopamine functions as a neurotransmitter and neuromodulator, and is controlled by a set of mechanisms common to all monoamine neurotransmitters.[8] After synthesis, dopamine is transported from the cytosol into synaptic vesicles by a solute carrier—a vesicular monoamine transporter, VMAT2.[18] Dopamine is stored in these vesicles until it is ejected into the synaptic cleft through a process called exocytosis. In most cases exocytosis is caused by action potentials, but it can also be caused by the activity of an intracellular trace amine-associated receptor, TAAR1.[16] TAAR1 is a high-affinity receptor for dopamine, trace amines, and certain substituted amphetamines that is located along membranes in the intracellular milieu of the presynaptic cell;[16] activation of the receptor can regulate dopamine signaling by producing reuptake inhibition and neurotransmitter efflux and inhibiting neuronal firing through a diverse set of mechanisms.[16][19]

Once in the synapse, dopamine binds to and activates dopamine receptors. These can be the D2Lh type, located on the postsynaptic target cells, or the D2Sh autoreceptor type located on the membrane of the presynaptic cell.[8] After an action potential, the dopamine molecules quickly become unbound from their receptors. They are then absorbed back into the presynaptic cell, via reuptake mediated either by the dopamine transporter or by the plasma membrane monoamine transporter.[20] Once back in the cytosol, dopamine can either be broken down by a monoamine oxidase or repackaged into vesicles by VMAT2, making it available for future release.[18]
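
The two preceding paragraphs describe a release and reuptake cycle that can be summarized as a toy state machine: synthesis in the cytosol, packaging by VMAT2, exocytotic release, receptor binding, reuptake via the dopamine transporter or the plasma membrane monoamine transporter, and then either breakdown by monoamine oxidase or repackaging. The sketch below simply restates that cycle; the state names are paraphrases of the prose, not an established API or kinetic model.

```python
# Toy state machine restating the presynaptic dopamine life cycle from the text:
# cytosol -> vesicle (VMAT2) -> synaptic cleft (exocytosis) -> bound to receptor ->
# back to cytosol (reuptake via DAT or PMAT) -> degraded (MAO) or repackaged.
# Purely illustrative; transition names paraphrase the prose.
from __future__ import annotations

TRANSITIONS = {
    "cytosol": ["vesicle (VMAT2 transport)", "degraded (monoamine oxidase)"],
    "vesicle (VMAT2 transport)": ["synaptic cleft (exocytosis)"],
    "synaptic cleft (exocytosis)": ["bound to receptor"],
    "bound to receptor": ["cytosol"],        # unbinding followed by reuptake (DAT/PMAT)
    "degraded (monoamine oxidase)": [],      # terminal state
}

def walk(start: str, steps: int) -> list[str]:
    """Follow the first listed transition from each state for a fixed number of steps."""
    path = [start]
    state = start
    for _ in range(steps):
        options = TRANSITIONS.get(state, [])
        if not options:
            break
        state = options[0]
        path.append(state)
    return path

if __name__ == "__main__":
    print(" -> ".join(walk("cytosol", 6)))
```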

In the brain the level of extracellular dopamine is modulated by two mechanisms: phasic and tonic transmission.[21] Phasic dopamine release, like most neurotransmitter release in the nervous system, is driven directly by action potentials in the dopamine-containing cells.[21] Tonic dopamine transmission occurs when small amounts of dopamine are released without being preceded by presynaptic action potentials.[21] Tonic transmission is regulated by a variety of factors, including the activity of other neurons and neurotransmitter reuptake.[21]
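
A very small simulation can make the phasic/tonic distinction concrete: a constant low tonic component plus brief phasic bursts triggered by action potentials, with the phasic component decaying between spikes. The baseline, burst size, and decay constant below are arbitrary illustrative values with no physiological units.

```python
# Illustrative-only simulation of extracellular dopamine: a steady tonic component
# plus short, decaying phasic bursts tied to action potentials. All constants are
# invented for illustration.
from __future__ import annotations

DECAY = 0.7          # fraction of phasic dopamine remaining after each time step
TONIC_LEVEL = 1.0    # constant background (tonic) component
BURST_SIZE = 5.0     # amount added by one action-potential-driven (phasic) burst

def simulate(action_potentials: list[int], steps: int) -> list[float]:
    """Return extracellular dopamine per time step: tonic baseline plus decaying bursts."""
    phasic = 0.0
    trace = []
    for t in range(steps):
        phasic *= DECAY
        if t in action_potentials:
            phasic += BURST_SIZE
        trace.append(TONIC_LEVEL + phasic)
    return trace

if __name__ == "__main__":
    # Spikes at t = 3 and t = 4 produce a transient rise above the tonic baseline.
    for t, level in enumerate(simulate(action_potentials=[3, 4], steps=10)):
        print(f"t={t:2d}  dopamine={level:5.2f}")
```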

Inside the brain, dopamine plays important roles in executive functions, motor control, motivation, arousal, reinforcement, and reward, as well as lower-level functions including lactation, sexual gratification, and nausea. The dopaminergic cell groups and pathways make up the dopamine system, which is neuromodulatory.

Dopaminergic neurons (dopamine-producing nerve cells) are comparatively few in number—a total of around 400,000 in the human brain[22]—and their cell bodies are confined in groups to a few relatively small brain areas.[23] However, their axons project to many other brain areas, and they exert powerful effects on their targets.[23] These dopaminergic cell groups were first mapped in 1964 by Annica Dahlström and Kjell Fuxe, who assigned them labels starting with the letter “A” (for “aminergic”).[24] In their scheme, areas A1 through A7 contain the neurotransmitter norepinephrine, whereas A8 through A14 contain dopamine. The dopaminergic areas they identified are the substantia nigra (groups 8 and 9); the ventral tegmental area (group 10); the posterior hypothalamus (group 11); the arcuate nucleus (group 12); the zona incerta (group 13); and the periventricular nucleus (group 14).[24]

The substantia nigra is a small midbrain area that forms a component of the basal ganglia. It has two parts—an input area called the pars compacta and an output area called the pars reticulata. The dopaminergic neurons are found mainly in the pars compacta (cell group A8) and nearby (group A9).[23] In humans, the projection of dopaminergic neurons from the substantia nigra pars compacta to the dorsal striatum, termed the nigrostriatal pathway, plays a significant role in the control of motor function and in learning new motor skills.[25] These neurons are especially vulnerable to damage, and when a large number of them die, the result is a parkinsonian syndrome.[26]

The ventral tegmental area (VTA) is another midbrain area. The most prominent group of VTA dopaminergic neurons projects to the prefrontal cortex via the mesocortical pathway, and another, smaller group projects to the nucleus accumbens via the mesolimbic pathway. These two pathways are collectively termed the mesocorticolimbic projection.[23][25] The VTA also sends dopaminergic projections to the amygdala, cingulate gyrus, hippocampus, and olfactory bulb.[23][25] Mesocorticolimbic neurons play a central role in reward and other aspects of motivation.[25]

The posterior hypothalamus has dopamine neurons that project to the spinal cord, but their function is not well established.[27] There is some evidence that pathology in this area plays a role in restless legs syndrome, a condition in which people have difficulty sleeping due to an overwhelming compulsion to constantly move parts of the body, especially the legs.[27]

The arcuate nucleus and the periventricular nucleus of the hypothalamus have dopamine neurons that form an important projection, the tuberoinfundibular pathway, which goes to the pituitary gland, where it influences the secretion of the hormone prolactin.[28] Dopamine is the primary neuroendocrine inhibitor of the secretion of prolactin from the anterior pituitary gland.[28] Dopamine produced by neurons in the arcuate nucleus is secreted into the hypophyseal portal system of the median eminence, which supplies the pituitary gland.[28] In the absence of dopamine, the prolactin-producing cells secrete prolactin continuously; dopamine inhibits this secretion.[28] In the context of regulating prolactin secretion, dopamine is occasionally called prolactin-inhibiting factor, prolactin-inhibiting hormone, or prolactostatin.[28]

The zona incerta, grouped between the arcuate and periventricular nuclei, projects to several areas of the hypothalamus and participates in the control of gonadotropin-releasing hormone, which is necessary to activate the development of the male and female reproductive systems following puberty.[28]

An additional group of dopamine-secreting neurons is found in the retina of the eye.[29] These neurons are amacrine cells, meaning that they have no axons.[29] They release dopamine into the extracellular medium, and are specifically active during daylight hours, becoming silent at night.[29] This retinal dopamine acts to enhance the activity of cone cells in the retina while suppressing rod cells—the result is to increase sensitivity to color and contrast during bright light conditions, at the cost of reduced sensitivity when the light is dim.[29]

Basal ganglia

Main circuits of the basal ganglia. The dopaminergic pathway from the substantia nigra pars compacta to the striatum is shown in light blue.

The largest and most important sources of dopamine in the vertebrate brain are the substantia nigra and ventral tegmental area.[23] These structures are closely related to each other and functionally similar in many respects.[23] Both are components of the basal ganglia, a complex network of structures located mainly at the base of the forebrain.[23] The largest component of the basal ganglia is the striatum.[30] The substantia nigra sends a dopaminergic projection to the dorsal striatum, while the ventral tegmental area sends a similar type of dopaminergic projection to the ventral striatum.[23]

Progress in understanding the functions of the basal ganglia has been slow.[30] The most popular hypotheses, broadly stated, propose that the basal ganglia play a central role in action selection.[31] The action selection theory in its simplest form proposes that when a person or animal is in a situation where several behaviors are possible, activity in the basal ganglia determines which of them is executed, by releasing that response from inhibition while continuing to inhibit other motor systems that if activated would generate competing behaviors.[32] Thus the basal ganglia, in this concept, are responsible for initiating behaviors, but not for determining the details of how they are carried out. In other words, they essentially form a decision-making system.[32]

The basal ganglia can be divided into several sectors, and each is involved in controlling particular types of actions.[33] The ventral sector of the basal ganglia (containing the ventral striatum and ventral tegmental area) operates at the highest level of the hierarchy, selecting actions at the whole-organism level.[32] The dorsal sectors (containing the dorsal striatum and substantia nigra) operate at lower levels, selecting the specific muscles and movements that are used to implement a given behavior pattern.[33]

Dopamine contributes to the action selection process in at least two important ways. First, it sets the “threshold” for initiating actions.[31] The higher the level of dopamine activity, the lower the impetus required to evoke a given behavior.[31] As a consequence, high levels of dopamine lead to high levels of motor activity and impulsive behavior; low levels of dopamine lead to torpor and slowed reactions.[31] Parkinson’s disease, in which dopamine levels in the substantia nigra circuit are greatly reduced, is characterized by stiffness and difficulty initiating movement—however, when people with the disease are confronted with strong stimuli such as a serious threat, their reactions can be as vigorous as those of a healthy person.[34] In the opposite direction, drugs that increase dopamine release, such as cocaine or amphetamine, can produce heightened levels of activity, including at the extreme, psychomotor agitation and stereotyped movements.[35]
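
The “threshold” idea in the preceding paragraph can be caricatured as a toy action-selection rule: each candidate behavior has an impetus value, the dopamine level scales down the initiation threshold, and at most one behavior is released from inhibition. This is only a sketch of the verbal theory, with invented names and numbers, not a model of basal ganglia circuitry.

```python
# Cartoon of the action-selection account in the text: candidate behaviors compete,
# dopamine lowers the initiation threshold, and only the strongest candidate that
# clears the threshold is released from inhibition. All numbers are invented.
from __future__ import annotations

BASE_THRESHOLD = 0.5

def select_action(impetus: dict[str, float], dopamine: float) -> str | None:
    """Return the behavior released from inhibition, or None if nothing clears the threshold."""
    threshold = BASE_THRESHOLD / max(dopamine, 1e-6)   # more dopamine -> lower threshold
    best = max(impetus, key=impetus.get)
    return best if impetus[best] >= threshold else None

if __name__ == "__main__":
    candidates = {"reach": 0.6, "walk": 0.4, "rest": 0.2}
    for da in (0.3, 1.0, 3.0):   # low, normal, and high dopamine activity
        print(f"dopamine={da:3.1f} -> selected: {select_action(candidates, da)}")
```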

The second important effect of dopamine is as a “teaching” signal.[31] When an action is followed by an increase in dopamine activity, the basal ganglia circuit is altered in a way that makes the same response easier to evoke when similar situations arise in the future.[31] This is a form of operant conditioning, in which dopamine plays the role of a reward signal.[32]

Reward

Illustration of dopaminergic reward structures

In the reward system, reward is the attractive and motivational property of a stimulus that induces appetitive behavior (also known as approach behavior) and consummatory behavior.[36] A rewarding stimulus is one that has the potential to cause the organism to approach it and choose to consume it.[36] Pleasure, learning (e.g., classical and operant conditioning), and approach behavior are the three main functions of reward.[36] As an aspect of reward, pleasure provides a definition of reward;[36] however, while all pleasurable stimuli are rewarding, not all rewarding stimuli are pleasurable (e.g., extrinsic rewards like money).[36][37] The motivational or desirable aspect of rewarding stimuli is reflected by the approach behavior that they induce, whereas the pleasurable component of intrinsic rewards is derived from the consummatory behavior that ensues upon acquiring them.[36] A neuropsychological model which distinguishes these two components of an intrinsically rewarding stimulus is the incentive salience model, where “wanting” or desire (less commonly, “seeking”[38]) corresponds to appetitive or approach behavior while “liking” or pleasure corresponds to consummatory behavior.[36][39][40] In human drug addicts, “wanting” becomes dissociated from “liking” as the desire to use an addictive drug increases, while the pleasure obtained from consuming it decreases due to drug tolerance.[39]

Within the brain, dopamine functions partly as a “global reward signal”, where an initial phasic dopamine response to a rewarding stimulus encodes information about the salience, value, and context of a reward.[36] In the context of reward-related learning, dopamine also functions as a reward prediction error signal, that is, the degree to which the value of a reward is unexpected.[36] According to this hypothesis of Wolfram Schultz, rewards that are expected do not produce a second phasic dopamine response in certain dopaminergic cells, but rewards that are unexpected, or greater than expected, produce a short-lasting increase in synaptic dopamine, whereas the omission of an expected reward actually causes dopamine release to drop below its background level.[36] The “prediction error” hypothesis has drawn particular interest from computational neuroscientists, because an influential computational-learning method known as temporal difference learning makes heavy use of a signal that encodes prediction error.[36] This confluence of theory and data has led to a fertile interaction between neuroscientists and computer scientists interested in machine learning.[36]
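
Since the passage above ties phasic dopamine to a reward prediction error, a minimal temporal difference (TD) learning sketch may help: the TD error delta = r + gamma * V(s') - V(s) is positive for an unexpected reward, shrinks toward zero as the reward becomes predicted, and goes negative when a predicted reward is omitted, mirroring the pattern described by Schultz. The state names, learning rate, and reward values are illustrative assumptions.

```python
# Minimal temporal-difference (TD) value learning sketch. The TD error
# delta = r + gamma * V(s') - V(s) plays the role of the reward prediction error
# discussed in the text: positive for unexpected reward, shrinking as the reward
# becomes predicted, negative when a predicted reward is omitted.
# States, rewards, and constants are illustrative assumptions.
from __future__ import annotations

ALPHA = 0.5   # learning rate
GAMMA = 0.9   # discount factor

def td_update(values: dict[str, float], s: str, r: float, s_next: str) -> float:
    """Apply one TD(0) update and return the prediction error (the 'dopamine-like' signal)."""
    delta = r + GAMMA * values.get(s_next, 0.0) - values.get(s, 0.0)
    values[s] = values.get(s, 0.0) + ALPHA * delta
    return delta

if __name__ == "__main__":
    V = {}
    # A cue ("light") is repeatedly followed by a reward: as the reward becomes
    # predicted by the cue, the error at reward delivery shrinks trial by trial.
    for trial in range(5):
        d_cue = td_update(V, "light", r=0.0, s_next="reward_state")
        d_rew = td_update(V, "reward_state", r=1.0, s_next="end")
        print(f"trial {trial}: delta(cue)={d_cue:+.2f}  delta(reward)={d_rew:+.2f}")
    # Omitting the now-expected reward produces a negative prediction error.
    print("omission:", f"{td_update(V, 'reward_state', r=0.0, s_next='end'):+.2f}")
```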

Evidence from microelectrode recordings from the brains of animals shows that dopamine neurons in the ventral tegmental area (VTA) and substantia nigra are strongly activated by a wide variety of rewarding events.[36] These reward-responsive dopamine neurons in the VTA and substantia nigra are crucial for reward-related cognition and serve as the central component of the reward system.[39][41][42] The function of dopamine varies in each axonal projection from the VTA and substantia nigra;[39] for example, the VTA–nucleus accumbens shell projection assigns incentive salience (“want”) to rewarding stimuli and its associated cues, the VTA–orbitofrontal cortex projection updates the value of different goals in accordance with their incentive salience, the VTA–amygdala and VTA–hippocampus projections mediate the consolidation of reward-related memories, and both the VTA–nucleus accumbens core and substantia nigra–dorsal striatum pathways are involved in learning motor responses that facilitate the acquisition of rewarding stimuli.[39][43] Some activity within the VTA dopaminergic projections appears to be associated with reward prediction as well.[39][43]

While dopamine has a central role in mediating “wanting” (the appetitive or approach behavioral response to rewarding stimuli), detailed studies have shown that dopamine cannot simply be equated with “liking” or pleasure, as reflected in the consummatory behavioral response.[37] Dopamine neurotransmission is involved in some but not all aspects of pleasure, since pleasure centers have been identified both within and outside the dopamine system (i.e., compartments within the nucleus accumbens shell and ventral pallidum, respectively).[37][40] For example, direct electrical stimulation of dopamine pathways, using electrodes implanted in the brain, is experienced as pleasurable, and many types of animals are willing to work to obtain it.[44] Antipsychotic drugs used to treat psychosis reduce dopamine levels and tend to cause anhedonia, a diminished ability to experience pleasure.[45] Many types of pleasurable experiences—such as sex, enjoying food, or playing video games—increase dopamine release.[46] All addictive drugs directly or indirectly affect dopamine neurotransmission in the nucleus accumbens;[39][44] when repeatedly taken in high doses, these drugs increase drug “wanting”, leading to compulsive drug use, presumably through the sensitization of incentive salience.[40] Drugs that increase dopamine release include stimulants such as methamphetamine or cocaine. These produce increases in “wanting” behaviors, but do not greatly alter expressions of pleasure or change levels of satiation.[40][44] However, opiate drugs such as heroin or morphine produce increases in expressions of both “liking” and “wanting” behaviors.[40] Moreover, animals in which the ventral tegmental dopamine system has been rendered inactive do not seek food, and will starve to death if left to themselves, but if food is placed in their mouths they will consume it and show expressions indicative of pleasure.[47]

 

Affective neuroscience

People drew maps of body locations where they feel basic emotions (top row) and more complex ones (bottom row). Hot colors show regions that people say are stimulated during the emotion. Cool colors indicate deactivated areas.
Image courtesy of Lauri Nummenmaa, Enrico Glerean, Riitta Hari, and Jari Hietanen.

Mapping Emotions On The Body: Love Makes Us Warm All Over

December 30, 2013, 4:04 PM ET
MICHAELEEN DOUCLEFF

Affect is the experience of feeling or emotion.[1] Affect is a key part of the process of an organism’s interaction with stimuli. The word also sometimes refers to affect display, which is “a facial, vocal, or gestural behavior that serves as an indicator of affect” (APA 2006).

The affective domain represents one of the three divisions described in modern psychology: the cognitive, the conative, and the affective. Classically, these divisions have also been referred to as the “ABC of psychology”, in that case using the terms “affect”, “behavior”, and “cognition”. In certain views, the cognitive may be considered as a part of the affective, or the affective as a part of the cognitive.[2]

Affective states are psycho-physiological constructs. According to most current views, they vary along three principal dimensions: valence, arousal, and motivational intensity.[3] Valence is the subjective positive-to-negative evaluation of an experienced state. Emotional valence refers to the emotion’s consequences, emotion-eliciting circumstances, or subjective feelings or attitudes.[4] Arousal is objectively measurable as activation of the sympathetic nervous system, but can also be assessed subjectively via self-report. Arousal is a construct that is closely related to motivational intensity, but they differ in that motivation necessarily implies action while arousal does not.[5] Motivational intensity refers to the impulsion to act.[6] It is the strength of an urge to move toward or away from a stimulus. Simply moving is not considered approach motivation without a motivational urge present.[7]

All three of these categories can be related to cognition when considering the construct of cognitive scope. Initially, it was thought that positive affects broadened cognitive scope whereas negative affects narrowed cognitive scope.[3] However, evidence now suggests that affects high in motivational intensity narrow cognitive scope whereas affects low in motivational intensity broaden cognitive scope. The cognitive scope has indeed proven to be a valuable construct in cognitive psychology.[3]

Affective neuroscience is the study of the neural mechanisms of emotion. This interdisciplinary field combines neuroscience with the psychological study of personality, emotion, and mood.

Emotions are thought to be related to activity in brain areas that direct our attention, motivate our behavior, and determine the significance of what is going on around us. Pioneering work by Paul Broca (1878),[2] James Papez (1937),[3] and Paul D. MacLean (1952)[4] suggested that emotion is related to a group of structures in the center of the brain called the limbic system, which includes the hypothalamus, cingulate cortex, hippocampi, and other structures. Research has shown that limbic structures are directly related to emotion, but non-limbic structures have been found to be of greater emotional relevance than was previously thought. The following brain structures are currently thought to be involved in emotion:[5]

In its broadest sense, cognition refers to all mental processes. However, the study of cognition has historically excluded emotion and focused on non-emotional processes (e.g., memory, attention, perception, action, problem solving and mental imagery).[40] As a result, the study of the neural basis of non-emotional and emotional processes emerged as two separate fields: cognitive neuroscience and affective neuroscience. The distinction between non-emotional and emotional processes is now thought to be largely artificial, as the two types of processes often involve overlapping neural and mental mechanisms.[41] Thus, when cognition is taken at its broadest definition, affective neuroscience could also be called the cognitive neuroscience of emotion.

Main structures of the limbic system

  • Amygdala — The amygdalae are two small, round structures located anterior to the hippocampi near the temporal poles. The amygdalae are involved in detecting and learning what parts of our surroundings are important and have emotional significance. They are critical for the production of emotion, and may be particularly so for negative emotions, especially fear.[6] Multiple studies have shown amygdala activation when perceiving a potential threat; various circuits allow the amygdala to use related past memories to better judge the possible threat.[7]
  • Thalamus – The thalamus is involved in relaying sensory and motor signals to the cerebral cortex,[8] especially visual stimuli. The thalamus also plays an important role in regulating states of sleep and wakefulness.[9]
  • Hypothalamus – The hypothalamus is located below the thalamus. It plays a role in emotional responses by synthesizing and releasing neurotransmitters which can affect mood, reward and arousal.[10]
  • Hippocampus – The hippocampus is a structure of the medial temporal lobes that is mainly involved in memory. It works to form new memories and also connects different senses, such as visual input, smell, or sound, to memories. The hippocampus allows memories to be stored long term and also retrieves them when necessary. It is this retrieval that is used within the amygdala to help evaluate current affective stimuli.[11]
  • Fornix – The fornix is the main output pathway from the hippocampus to the mammillary bodies. It has been identified as a main region in controlling spatial memory functions, episodic memory and executive functions.[12]
  • Mammillary body – Mammillary bodies are important for recollective memory.[13]
  • Olfactory bulb – The olfactory bulbs are the first cranial nerves, located on the ventral side of the frontal lobe. They are involved in olfaction, the perception of odors.[14]
  • Cingulate gyrus – The cingulate gyrus is located above the corpus callosum and is usually considered to be part of the limbic system. The different parts of the cingulate gyrus have different functions, and are involved with affect, visceromotor control, response selection, skeletomotor control, visuospatial processing, and memory access.[15] A part of the cingulate gyrus is the anterior cingulate cortex, which is thought to play a central role in attention[16] and behaviorally demanding cognitive tasks.[17] It may be particularly important with regard to conscious, subjective emotional awareness. This region of the brain may also play an important role in the initiation of motivated behavior.[17]

Other brain structures related to emotion

  • Basal ganglia – Basal ganglia are groups of nuclei found on either side of the thalamus. Basal ganglia play an important role in motivation.[18]
  • Orbitofrontal cortex – A major structure involved in decision making and in how emotions influence those decisions.[19]
  • Prefrontal cortex — The term prefrontal cortex refers to the very front of the brain, behind the forehead and above the eyes. It appears to play a critical role in the regulation of emotion and behavior by anticipating the consequences of our actions. The prefrontal cortex may play an important role in delayed gratification by maintaining emotions over time and organizing behavior toward specific goals.[20]
  • Ventral striatum — The ventral striatum is a group of subcortical structures thought to play an important role in emotion and behavior. One part of the ventral striatum called the nucleus accumbens is thought to be involved in the experience of goal-directed positive emotion. Individuals with addictions experience increased activity in this area when they encounter the object of their addiction.
  • Insula — The insular cortex is thought to play a critical role in the bodily experience of emotion, as it is connected to other brain structures that regulate the body’s autonomic functions (heart rate, breathing, digestion, etc.). This region also processes taste information and is thought to play an important role in experiencing the emotion of disgust.
  • Cerebellum – Recently, there has been a considerable amount of work describing the role of the cerebellum in emotion as well as cognition, and a “Cerebellar Cognitive Affective Syndrome” has been described.[21] Both neuroimaging studies and studies of pathological lesions in the cerebellum (such as a stroke) demonstrate that the cerebellum has a significant role in emotional regulation. Lesion studies[22] have shown that cerebellar dysfunction can attenuate the experience of positive emotions. While these same studies do not show an attenuated response to frightening stimuli, the stimuli did not recruit structures that normally would be activated (such as the amygdala). Rather, alternative limbic structures were activated, such as the ventromedial prefrontal cortex, the anterior cingulate gyrus, and the insula. This may indicate that evolutionary pressure resulted in the development of the cerebellum as a redundant fear-mediating circuit to enhance survival. It may also indicate a regulatory role for the cerebellum in the neural response to rewarding stimuli, such as money,[23] drugs of abuse,[24] and orgasm.[25]

 

need for cognition

The need for cognition (NFC), in psychology, is a personality variable reflecting the extent to which individuals are inclined towards effortful cognitive activities.[1][2]

Need for cognition has been variously defined as “a need to structure relevant situations in meaningful, integrated ways” and “a need to understand and make reasonable the experiential world”.[3] Higher NFC is associated with increased appreciation of debate, idea evaluation, and problem solving. Those with a high need for cognition may be inclined towards high elaboration. Those with a lower need for cognition may display opposite tendencies, and may process information more heuristically, often through low elaboration.[4]

Need for cognition is closely related to the five factor model domain openness to ideas, typical intellectual engagement, and epistemic curiosity (see below). Need for cognition has also been found to correlate with higher self-esteem, masculine sex-role orientation, and psychological absorption, while being inversely related to social anxiety.

The 18 statements from the revised Need for Cognition Scale (Cacioppo et al., 1984) used in the Wabash National Study of Liberal Arts Education are shown below. Asterisks designate the items that are reverse scored; a scoring sketch follows the list.

  1. I would prefer complex to simple problems.
  2. I like to have the responsibility of handling a situation that requires a lot of thinking.
  3. Thinking is not my idea of fun.*
  4. I would rather do something that requires little thought than something that is sure to challenge my thinking abilities.*
  5. I try to anticipate and avoid situations where there is likely a chance I will have to think in depth about something.*
  6. I find satisfaction in deliberating hard and for long hours.
  7. I only think as hard as I have to.*
  8. I prefer to think about small, daily projects to long-term ones.*
  9. I like tasks that require little thought once I’ve learned them.*
  10. The idea of relying on thought to make my way to the top appeals to me.
  11. I really enjoy a task that involves coming up with new solutions to problems.
  12. Learning new ways to think doesn’t excite me very much.*
  13. I prefer my life to be filled with puzzles that I must solve.
  14. The notion of thinking abstractly is appealing to me.
  15. I would prefer a task that is intellectual, difficult, and important to one that is somewhat important but does not require much thought.
  16. I feel relief rather than satisfaction after completing a task that required a lot of mental effort.*
  17. It’s enough for me that something gets the job done; I don’t care how or why it works.*
  18. I usually end up deliberating about issues even when they do not affect me personally.
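
Scoring the scale is mechanical once the reverse-keyed items are handled. The sketch below assumes each statement is rated on a 1 to 5 agreement scale (the response format used in the Wabash study is not given here) and flips the starred items before summing; the item numbers refer to the list above.

```python
# Sketch of scoring the 18-item scale above. Assumption: each item is rated on a
# 1-5 agreement scale (the actual response format is not stated here). Starred
# (reverse-keyed) items are flipped before summing.
from __future__ import annotations

REVERSE_KEYED = {3, 4, 5, 7, 8, 9, 12, 16, 17}   # asterisked items in the list above
SCALE_MIN, SCALE_MAX = 1, 5                       # assumed response range

def nfc_score(responses: dict[int, int]) -> int:
    """Sum the 18 responses, reverse-scoring the reverse-keyed items."""
    if set(responses) != set(range(1, 19)):
        raise ValueError("expected ratings for items 1 through 18")
    total = 0
    for item, rating in responses.items():
        if not SCALE_MIN <= rating <= SCALE_MAX:
            raise ValueError(f"item {item}: rating {rating} outside {SCALE_MIN}-{SCALE_MAX}")
        total += (SCALE_MIN + SCALE_MAX - rating) if item in REVERSE_KEYED else rating
    return total

if __name__ == "__main__":
    # A respondent who strongly agrees (5) with every statement: agreement with the
    # nine reverse-keyed items counts against the total, so the score is 54, not 90.
    print(nfc_score({item: 5 for item in range(1, 19)}))
```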

Noam Chomsky at UC Santa Barbara

Published on Apr 7, 2014
March 01, 2014 at UC Santa Barbara

Topics Discussed Include: Syrian Civil War, Israel Lobby, East Asian Miracle, Austerity, Mysteries and Perplexing Questions, Alan Greenspan, Class Warfare, Latin America, Neo-liberalism, Free Will, Business Party, etc.

experiencing less as we record more

Photographing More, Experiencing Less

The idea that we are experiencing less as we record more got psychologist Linda Henkel thinking. Her father was a photographer, and she wanted to explore how photographs shape our memories.

Henkel, who researches human memory at Fairfield University in Connecticut, began an experiment by sending groups of students to the university’s art museum. The students observed some objects and photographed others. Then, back at the laboratory, they were given a memory test.

Henkel found what she called a “photo-taking impairment effect.”

“The objects that they had taken photos of — they actually remembered fewer of them, and remembered fewer details about those objects. Like, how was this statue’s hands positioned, or what was this statue wearing on its head. They remembered fewer of the details if they took photos of them, rather than if they had just looked at them,” she says.

Henkel says her students’ memories were impaired because relying on an external memory aid means you subconsciously count on the camera to remember the details for you.

The Act of Creation

The Act of Creation is a 1964 book by Arthur Koestler. It is a study of the processes of discovery, invention, imagination and creativity in humour, science, and the arts. It lays out Koestler’s attempt to develop an elaborate general theory of human creativity.

From describing and comparing many different examples of invention and discovery, Koestler concludes that they all share a common pattern which he terms “bisociation” – a blending of elements drawn from two previously unrelated matrices of thought into a new matrix of meaning by way of a process involving comparison, abstraction and categorisation, analogies and metaphors. He regards many different mental phenomena based on comparison (such as analogies, metaphors, parables, allegories, jokes, identification, role-playing, acting, personification, anthropomorphism etc.), as special cases of “bisociation”.

The concept of bisociation has been adopted, generalised and formalised by cognitive linguists Gilles Fauconnier and Mark Turner, who developed it into conceptual blending theory.

Conceptual blending, also called conceptual integration or view application, is a theory of cognition developed by Gilles Fauconnier and Mark Turner. According to this theory, elements and vital relations from diverse scenarios are “blended” in a subconscious process, which is assumed to be ubiquitous to everyday thought and language.

The development of this theory began in 1993 and a representative early formulation is found in the online article Conceptual Integration and Formal Expression. Turner and Fauconnier cite Arthur Koestler’s 1964 book The Act of Creation as an early forerunner of conceptual blending: Koestler had identified a common pattern in creative achievements in the arts, sciences and humor that he had termed “bisociation of matrices.”[1] A newer version of blending theory, with somewhat different terminology, was presented in their book The Way We Think.

 

the Universal Principles of Persuasion

Published on Nov 26, 2012

For more visit our blog at http://www.insideinfluence.com

Animation describing the Universal Principles of Persuasion based on the research of Dr. Robert Cialdini, Professor Emeritus of Psychology and Marketing, Arizona State University.

Dr. Robert Cialdini & Steve Martin are co-authors (together with Dr. Noah Goldstein) of the New York Times, Wall Street Journal and Business Week International Bestseller Yes! 50 Scientifically Proven Ways to be Persuasive.

US Amazon http://tinyurl.com/afbam9g

The candle problem

Uploaded on Aug 25, 2009

http://www.ted.com Career analyst Dan Pink examines the puzzle of motivation, starting with a fact that social scientists know but most managers don’t: Traditional rewards aren’t always as effective as we think. Listen for illuminating stories — and maybe, a way forward.

The candle problem or candle task, also known as Duncker’s candle problem, is a cognitive performance test, measuring the influence of functional fixedness on a participant’s problem solving capabilities. The test was created [1] by Gestalt psychologist Karl Duncker and published posthumously in 1945. Duncker originally presented this test in his thesis on problem solving tasks at Clark University.

The test presents the participant with the following task: how to fix a lit candle on a wall (a cork board) so that the candle wax won’t drip onto the table below.[3] To do so, one may use only the following items along with the candle:

  • a book of matches
  • a box of thumbtacks

The solution is to empty the box of thumbtacks, put the candle into the box, use the thumbtacks to nail the box (with the candle in it) to the wall, and light the candle with the match.[3] The concept of functional fixedness predicts that the participant will only see the box as a device to hold the thumbtacks and not immediately perceive it as a separate and functional component available to be used in solving the task.

Response

Many of the people who attempted the test explored other creative, but less efficient, methods to achieve the goal. For example, some tried to tack the candle to the wall without using the thumbtack box,[4] and others attempted to melt some of the candle’s wax and use it as an adhesive to stick the candle to the wall.[1] Neither method works.[1] However, if the task is presented with the tacks piled next to the box (rather than inside it), virtually all of the participants were shown to achieve the optimal solution, which is self defined.[4]

The test has been given to numerous people, including M.B.A. students at the Kellogg School of Management in a study investigating whether living abroad and creativity are linked.[5]

Glucksberg

Glucksberg (1962)[6] used a 2 × 2 design manipulating whether the tacks and matches were inside or outside of their boxes and whether subjects were offered cash prizes for completing the task quickly. Subjects who were offered no prize, termed low-drive, were told “We are doing pilot work on various problems in order to decide which will be the best ones to use in an experiment we plan to do later. We would like to obtain norms on the time needed to solve.” The remaining subjects, termed high-drive, were told “Depending on how quickly you solve the problem you can win $5.00 or $20.00. The top 25% of the Ss [subjects] in your group will win $5.00 each; the best will receive $20.00. Time to solve will be the criterion used.” (As a note, adjusting for inflation since 1962, the study’s publication year, the amounts in today’s dollars would be approximately $39 and $154, respectively.[7]) The empty-boxes condition was found to be easier than the filled-boxes condition: more subjects solved the problem, and those who did solve the problem solved it faster. Within the filled-boxes condition, high-drive subjects performed worse than low-drive subjects. Glucksberg interpreted this result in terms of “neobehavioristic drive theory”: “high drive prolongs extinction of the dominant habit and thus retards the correct habit from gaining ascendancy”. An explanation in terms of the overjustification effect is made difficult by the lack of a main effect for drive and by a nonsignificant trend in the opposite direction within the empty-boxes condition.
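
The parenthetical inflation adjustment can be reproduced with a simple CPI ratio. The index values in the sketch below are approximate annual-average CPI-U figures (about 30.2 for 1962 and about 233 for 2013, roughly the vintage of the quoted amounts) and are assumptions; a different index or reference year shifts the result slightly.

```python
# Reproducing the inflation adjustment quoted above with a simple CPI ratio.
# The index values are approximate annual-average CPI-U figures and are an
# assumption; a different index or reference year shifts the result slightly.

CPI_1962 = 30.2     # approximate U.S. CPI-U annual average, 1962
CPI_2013 = 233.0    # approximate U.S. CPI-U annual average, 2013

def adjust(amount_1962: float) -> float:
    """Convert 1962 dollars to roughly 2013 dollars via the CPI ratio."""
    return amount_1962 * CPI_2013 / CPI_1962

if __name__ == "__main__":
    for prize in (5.00, 20.00):
        # Prints roughly $39 and $154, matching the figures quoted in the text.
        print(f"${prize:.2f} in 1962  ->  about ${adjust(prize):.0f} today")
```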

Another way to explain the higher levels of failure during the high-drive condition is that turning the task into a competition for limited resources can create mild levels of stress in the subject, which can trigger the sympathetic nervous system’s fight-or-flight response. This stress response effectively shuts down the creative thinking and problem-solving areas of the brain in the prefrontal cortex.

Linguistic implications

E. Tory Higgins and W. M. Chaires found that having subjects repeat the names of common pairs of objects in this test, but in a different and unaccustomed linguistic structure, such as “box and tacks” instead of “box of tacks”, facilitated performance on the candle problem.[3] This phrasing helps one to distinguish the two entities as different and more accessible.[3]

In a written version of the task given to people at Stanford University, Michael C. Frank and language acquisition researcher Michael Ramscar reported that simply underlining certain relevant materials (“on the table there is a candle, a box of tacks, and a book of matches…”) increases the number of candle-problem solvers from 25% to 50%.[4]

References

  1. “Dan Pink on the surprising science of motivation”. Retrieved 16 January 2010.
  2. Daniel Biella and Wolfram Luther. “A Synthesis Model for the Replication of Historical Experiments in Virtual Environments”. 5th European Conference on e-Learning. Academic Conferences Limited. p. 23. ISBN 978-1-905305-30-8.
  3. Richard E. Snow and Marshall J. Farr, eds. (1987). “Positive Affect and Organization”. Aptitude, Learning, and Instruction Volume 3: Conative and Affective Process Analysis. Routledge. ISBN 978-0-89859-721-9.
  4. Frank, Michael. “Against Informational Atomism”. Retrieved 15 January 2010.
  5. “Living Outside the Box: Living abroad boosts creativity”. April 2009. Retrieved 16 January 2010.
  6. Glucksberg, S. (1962). “The influence of strength of drive on functional fixedness and perceptual recognition”. Journal of Experimental Psychology 63: 36–41. doi:10.1037/h0044683. PMID 13899303.
  7. Inflated values automatically calculated.