Winners Are Grinners: Even If There’s Nothing to Win!
Whether it’s for money, marbles or chalk, the brains of reward-driven people keep their game faces on, helping them win at every step of the way. Surprisingly, they win most often when there is no reward.
That’s the finding of neuroscientists at Washington University in St. Louis, who tested 31 randomly selected subjects with word games, some of which had monetary rewards of either 25 or 75 cents per correct answer, others of which had no money attached.
Subjects were given a short list of five words to memorize in a matter of seconds, followed by a 3.5-second pause, then a few seconds to respond to a solitary word that either had or had not been on the list. Test performance had no consequence in some trials, but in others a computer graded the responses, providing an opportunity to win either 25 or 75 cents for quick and accurate answers. Even during these periods, subjects were sometimes alerted that their performance would not be rewarded on that trial.
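To make the paradigm concrete, here is a minimal sketch of a single trial as described above: a five-word memory set, a 3.5-second retention interval, a recognition probe, and a 25- or 75-cent payout for fast, accurate answers on rewarded trials. The timings, word pool, response deadline and function names are my own illustrative assumptions, not the authors’ actual task code.

```python
import random
import time

WORD_POOL = ["apple", "chair", "river", "cloud", "piano",
             "tiger", "lemon", "stone", "mirror", "candle"]

def run_trial(reward_cents=None, delay_s=3.5):
    """Simulate one recognition-memory trial as described in the article.

    reward_cents: 25, 75, or None (no-reward trial).
    Returns the payout earned on this trial (illustrative rule only).
    """
    memory_set = random.sample(WORD_POOL, 5)          # five words to memorize
    print("Memorize:", " ".join(memory_set))
    time.sleep(delay_s)                               # 3.5-second retention interval

    # The probe is either an old word (from the set) or a new one.
    probe_is_old = random.random() < 0.5
    probe = random.choice(memory_set) if probe_is_old else random.choice(
        [w for w in WORD_POOL if w not in memory_set])

    start = time.time()
    answer = input(f"Was '{probe}' on the list? (y/n): ").strip().lower() == "y"
    rt = time.time() - start

    correct = (answer == probe_is_old)
    fast_enough = rt < 2.0                            # assumed response deadline
    payout = reward_cents if (reward_cents and correct and fast_enough) else 0
    print(f"Correct: {correct}, RT: {rt:.2f}s, earned: {payout} cents")
    return payout

if __name__ == "__main__":
    # Mix rewarded (25 or 75 cents) and unrewarded trials, as in the study.
    total = sum(run_trial(reward_cents=random.choice([None, 25, 75]))
                for _ in range(3))
    print("Total winnings:", total, "cents")
```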
Prior to testing, subjects completed a battery of personality tests that rated their degree of competitiveness and their sensitivity to monetary rewards.
The study was designed to test the hypothesis that excitement in the brains of the most monetary-reward-sensitive subjects would slacken during trials that did not pay. It was co-authored by Koji Jimura, PhD, a postdoctoral researcher, and Todd Braver, PhD, a professor, both based in psychology in Arts & Sciences. Braver is also a member of the neuroscience program and the radiology department in the university’s School of Medicine.
But the researchers found a paradoxical result: the performance of the most reward-driven individuals was actually most improved – relative to the less reward-driven – in the trials that paid nothing, not the ones in which there was money at stake.
Even more striking was that the brain scans taken using functional Magnetic Resonance Imaging (fMRI) showed a change in the pattern of activity during the non-rewarded trials within the lateral prefrontal cortex (PFC), located right behind the outer corner of the eyebrow, an area that is strongly linked to intelligence, goal-driven behavior and cognitive strategies. The change in lateral PFC activity was statistically linked to the extra behavioral benefits observed in the reward-driven individuals.
The researchers suggest that this change in lateral PFC activity patterns represents a flexible shift in response to the motivational importance of the task, translating this into a superior task strategy that the researchers term “proactive cognitive control.” In other words, once the rewarding motivational context is established in the brain indicating there is a goal-driven contest at hand, the brain actually rallies its neuronal troops and readies itself for the next trial, whether it’s for money or not.
“It sounds reasonable now, but when I happened upon this result, I couldn’t believe it because we expected the opposite results,” says Jimura, first author of the paper. “I had to analyze the data thoroughly to persuade myself. The important finding of our study is that the brains of these reward-sensitive individuals do not respond to the reward information on individual trials. Instead, it shows that they have persistent motivation, even in the absence of a reward. You’d think you’d have to reward them on every trial to do well. But it seems that their brains recognized the rewarding motivational context that carried over across all the trials.”
The finding sheds more light on the workings of the lateral PFC and provides potential behavioral clues about personality, motivation, goals and cognitive strategies. The research has important implications for understanding the nature of persistent motivation, how the brain creates such states, and why some people seem able to use motivation more effectively than others. By understanding the brain circuitry involved, it might be possible to create motivational situations that are more effective for all individuals, not just the most reward-driven ones, or to develop drug therapies for individuals who suffer from chronic motivational problems. The results are published April 26 in the early online edition of the Proceedings of the National Academy of Sciences.
Everyone knows of competitive people who have to win, whether in a game of HORSE, golf or the office NCAA basketball tournament pool. The findings might tell researchers something about the competitive drive.
The researchers are interested in the signaling chain that ignites the prefrontal cortex when it acts on reward-driven impulses, and they speculate that the brain chemical dopamine could be involved, a potential direction for future studies. Dopamine neurons, once thought to be involved in a host of pleasurable situations but now considered more of a learning or predictive signal, might respond to cues that let the lateral PFC know that it’s in for something good. This signal might help to keep information about the goals, rules or best strategies for the task active in mind to increase the chances of obtaining the desired outcome.
In the context of this study, when a 75-cent reward is available for a trial, the dopamine-releasing neurons could be sending signals to the lateral PFC that “jump start” it to do the right procedures to get a reward.
“It would be like the dopamine neurons recognize a cup of Ben and Jerry’s ice cream, and tell the lateral PFC the right action strategy to get the reward – to grab a spoon and bring the ice cream to your mouth,” says Braver. “We think that the dopamine neurons fire to the cue rather than the reward itself, especially after the brain learns the relationship between the two. We’d like to explore that some more.”
They also are interested in the “reward carryover state,” or the proactive cognitive strategy that keeps the brain excited even in gaps, such as pauses between trials or trials without rewards. They might consider a study in which rewards are far fewer.
“It’s possible we’d see more slackers with less rewards,” Braver says. “That might have an effect on the reward carryover state. There are a host of interesting further questions that this work brings up which we plan to pursue.”
Source: Washington University in St. Louis
Related articles by Zemanta
- Multitasking Splits the Brain (news.sciencemag.org)
- Why Humans Believe That Better Things Come To Those Who Wait (lockergnome.com)
Remember What Happened? Only In Your Dreams!
It is by now well established that sleep can be an important tool when it comes to enhancing memory and learning skills. And now, a new study sheds light on the role that dreams play in this important process.
Led by scientists at Beth Israel Deaconess Medical Center (BIDMC), the new findings suggest that dreams may be the sleeping brain’s way of telling us that it is hard at work on the process of memory consolidation, integrating our recent experiences to help us with performance-related tasks in the short run and, in the long run, translating this material into information that will have widespread application to our lives. The study is reported in the April 22 online issue of Current Biology.
“What’s got us really excited is that after nearly 100 years of debate about the function of dreams, this study tells us that dreams are the brain’s way of processing, integrating and really understanding new information,” explains senior author Robert Stickgold, PhD, Director of the Center for Sleep and Cognition at BIDMC and Associate Professor of Psychiatry at Harvard Medical School. “Dreams are a clear indication that the sleeping brain is working on memories at multiple levels, including ways that will directly improve performance.”
At the outset, the authors hypothesized that dreaming about a learning experience during nonrapid eye movement (NREM) sleep would lead to improved performance on a hippocampus-dependent spatial memory task. (The hippocampus is a region of the brain responsible for storing spatial memory.)
To test this hypothesis, the investigators had 99 subjects spend an hour training on a “virtual maze task,” a computer exercise in which they were asked to navigate through and learn the layout of a complex 3D maze with the goal of reaching an endpoint as quickly as possible. Following this initial training, participants were assigned to either take a 90-minute nap or to engage in quiet activities but remain awake. At various times, subjects were also asked to describe what was going through their minds, or in the case of the nappers, what they had been dreaming about. Five hours after the initial exercise, the subjects were retested on the maze task.
The results were striking.
The non-nappers showed no signs of improvement on the second test – even if they had reported thinking about the maze during their rest period. Similarly, the subjects who napped, but who did not report experiencing any maze-related dreams or thoughts during their sleep period, showed little, if any, improvement. But, the nappers who described dreaming about the task showed dramatic improvement, 10 times more than that shown by those nappers who reported having no maze-related dreams.
“These dreamers described various scenarios – seeing people at checkpoints in a maze, being lost in a bat cave, or even just hearing the background music from the computer game,” explains first author Erin Wamsley, PhD, a postdoctoral fellow at BIDMC and Harvard Medical School. These dream reports suggest that not only was sleep necessary to “consolidate” the information, but that the dreams were an outward reflection that the brain had been busy at work on this very task.
Of particular note, say the authors, the subjects who performed better were not more interested or motivated than the other subjects. But, they say, there was one distinct difference that was noted.
“The subjects who dreamed about the maze had done relatively poorly during training,” explains Wamsley. “Our findings suggest that if something is difficult for you, it’s more meaningful to you and the sleeping brain therefore focuses on that subject – it ‘knows’ you need to work on it to get better, and this seems to be where dreaming can be of most benefit.”
Furthermore, this memory processing was dependent on being in a sleeping state. Even when a waking subject “rehearsed and reviewed” the path of the maze in his mind, if he did not sleep, then he did not see any improvement, suggesting that there is something unique about the brain’s physiology during sleep that permits this memory processing.
“In fact,” says Stickgold, “this may be one of the main goals that led to the evolution of sleep. If you remain awake [following the test] you perform worse on the subsequent task. Your memory actually decays, no matter how much you might think about the maze.
“We’re not saying that when you learn something it is dreaming that causes you to remember it,” he adds. “Rather, it appears that when you have a new experience it sets in motion a series of parallel events that allow the brain to consolidate and process memories.”
Ultimately, say the authors, the sleeping brain seems to be accomplishing two separate functions: While the hippocampus is processing information that is readily understandable (i.e. navigating the maze), at the same time, the brain’s higher cortical areas are applying this information to an issue that is more complex and less concrete (i.e. how to navigate through a maze of job application forms).
“Our [nonconscious] brain works on the things that it deems are most important,” says Wamsley. “Every day, we are gathering and encountering tremendous amounts of information and new experiences. It would seem that our dreams are asking the question, ‘How do I use this information to inform my life?’”
Study coauthors include BIDMC investigators Matthew Tucker, Joseph Benavides and Jessica Payne (currently of the University of Notre Dame).
This study was supported by grants from the National Institutes of Health.
BIDMC is a patient care, teaching and research affiliate of Harvard Medical School, and consistently ranks in the top four in National Institutes of Health funding among independent hospitals nationwide. BIDMC is a clinical partner of the Joslin Diabetes Center and a research partner of the Dana-Farber/Harvard Cancer Center. BIDMC is the official hospital of the Boston Red Sox.
Source: BIDMC
Related articles by Zemanta
- Study: Naps Greatly Boost Memory If You Dream (shoppingblog.com)
- Sticks & Stones AND Words Can Hurt You: How Words Can Cause Physical Pain (peterhbrown.wordpress.com)
- Brain Training Or Just Brain Straining?: The Benefits Of Brain Exercise Software Are Unclear (peterhbrown.wordpress.com)
Brain Training Or Just Brain Straining?: The Benefits Of Brain Exercise Software Are Unclear
You’ve probably heard it before: the brain is a muscle that can be strengthened. It’s an assumption that has spawned a multimillion-dollar computer game industry of electronic brain-teasers and memory games. But in the largest study of such brain games to date, a team of British researchers has found that healthy adults who undertake computer-based “brain-training” do not improve their mental fitness in any significant way.
Read The Original Research Paper (Draft PDF)
The study, published online Tuesday by the journal Nature, tracked 11,430 participants through a six-week online study. The participants were divided into three groups: the first group undertook basic reasoning, planning and problem-solving activities (such as choosing the “odd one out” of a group of four objects); the second completed more complex exercises of memory, attention, math and visual-spatial processing, which were designed to mimic popular “brain-training” computer games and programs; and the control group was asked to use the Internet to research answers to trivia questions.
All participants were given a battery of unrelated “benchmark” cognitive-assessment tests before and after the six-week program. These tests, designed to measure overall mental fitness, were adapted from reasoning and memory tests that are commonly used to gauge brain function in patients with brain injury or dementia. All three study groups showed marginal — and identical — improvement on these benchmark exams.
But the improvement had nothing to do with the interim brain-training, says study co-author Jessica Grahn of the Cognition and Brain Sciences Unit in Cambridge. Grahn says the results confirm what she and other neuroscientists have long suspected: people who practice a certain mental task — for instance, remembering a series of numbers in sequence, a popular brain-teaser used by many video games — improve dramatically on that task, but the improvement does not carry over to cognitive function in general. (Indeed, all the study participants improved in the tasks they were given; even the control group got better at looking up answers to obscure questions.) The “practice makes perfect” phenomenon probably explains why the study participants improved on the benchmark exams, says Grahn — they had all taken them once before. “People who practiced a certain test improved at that test, but improvement does not translate beyond anything other than that specific test,” she says.
The authors believe the study, which was run in conjunction with a BBC television program called “Bang Goes the Theory,” undermines the sometimes outlandish claims of many brain-boosting websites and digital games. According to a past TIME.com article by Anita Hamilton, HAPPYneuron, an example not cited by Grahn, is a $100 Web-based brain-training site that invites visitors to “give the gift of brain fitness” and claims its users saw “16%+ improvement” through exercises such as learning to associate a bird’s song with its species and shooting basketballs through virtual hoops. Hamilton also notes Nintendo’s best-selling Brain Age game, which promises to “give your brain the workout it needs” through exercises like solving math problems and playing rock, paper, scissors on the handheld DS. “The widely held belief that commercially available computerized brain-training programs improve general cognitive function in the wider population lacks empirical support,” the paper concludes.
Not all neuroscientists agree with that conclusion, however. In 2005, Torkel Klingberg, a professor of cognitive neuroscience at the Karolinska Institute in Sweden, used brain imaging to show that brain-training can alter the number of dopamine receptors in the brain — dopamine is a neurotransmitter involved in learning and other important cognitive functions. Other studies have suggested that brain-training can help improve cognitive function in elderly patients and those in the early stages of Alzheimer’s disease, but the literature is contradictory.
Klingberg has developed a brain-training program called Cogmed Working Memory Training, and owns shares in the company that distributes it. He tells TIME that the Nature study “draws a large conclusion from a single negative finding” and that it is “incorrect to generalize from one specific training study to cognitive training in general.” He also criticizes the design of the study and points to two factors that may have skewed the results.
On average the study volunteers completed 24 training sessions, each about 10 minutes long — for a total of three hours spent on different tasks over six weeks. “The amount of training was low,” says Klingberg. “Ours and others’ research suggests that 8 to 12 hours of training on one specific test is needed to get a [general improvement in cognition].”
Second, he notes that the participants were asked to complete their training by logging onto the BBC Lab UK website from home. “There was no quality control. Asking subjects to sit at home and do tests online, perhaps with the TV on or other distractions around, is likely to result in bad quality of the training and unreliable outcome measures. Noisy data often gives negative findings,” Klingberg says.
Brain-training research has received generous funding in recent years — and not just from computer game companies — as a result of the proven effect of neuroplasticity, the brain’s ability to remodel its nerve connections after experience. The stakes are high. If humans could control that process and bolster cognition, it could have a transformative effect on society, says Nick Bostrom of Oxford University’s Future of Humanity Institute. “Even a small enhancement in human cognition could have a profound effect,” he says. “There are approximately 10 million scientists in the world. If you could improve their cognition by 1%, the gain would hardly be noticeable in a single individual. But it could be equivalent to instantly creating 100,000 new scientists.”
For now, there is no nifty computer game that will turn you into Einstein, Grahn says. But there are other proven ways to improve cognition, albeit only by small margins. Consistently getting a good night’s sleep, exercising vigorously, eating right and maintaining healthy social activity have all been shown to help maximize a brain’s potential over the long term.
What’s more, says Grahn, neuroscientists and psychologists have yet to even agree on what constitutes high mental aptitude. Some experts argue that physical skill, which stems from neural pathways, should be considered a form of intelligence — so, masterful ballet dancers and basketball players would be considered geniuses.
Jason Allaire, co-director of the Games through Gaming lab at North Carolina State University, says the Nature study makes sense; rather than finding a silver bullet for brain enhancement, he says, “it’s really time for researchers to think about a broad or holistic approach that exercises or trains the mind in general in order to start to improve cognition more broadly.”
Or, as Grahn puts it, when it comes to mental fitness, “there are no shortcuts.”
Credit: Time.com
Related articles by Zemanta
- Brain games don’t make you smarter, study finds (healthzone.ca)
- Brain training ‘boost’ questioned (news.bbc.co.uk)
ADHD Treatment: Behavior Therapy & Medication Seem To Positively Affect The Brain In The Same Way
(Information provided by The Wellcome Trust 1 April 2010)
Read the original research paper HERE (PDF)
Medication and behavioural interventions help children with attention deficit hyperactivity disorder (ADHD) better maintain attention and self-control by normalising activity in the same brain systems, according to research funded by the Wellcome Trust.
In a study published today in the journal ‘Biological Psychiatry’, researchers from the University of Nottingham show that medication has the most significant effect on brain function in children with ADHD, but this effect can be boosted by complementary use of rewards and incentives, which appear to mimic the effects of medication on brain systems.
ADHD is the most common mental health disorder in childhood, affecting around one in 20 children in the UK. Children with ADHD are excessively restless, impulsive and distractible, and experience difficulties at home and in school. Although no cure exists for the condition, symptoms can be reduced by a combination of medication and behaviour therapy.
Methylphenidate, a drug commonly used to treat ADHD, is believed to increase levels of dopamine in the brain. Dopamine is a chemical messenger associated with attention, learning and the brain’s reward and pleasure systems. This increase amplifies certain brain signals and can be measured using an electroencephalogram (EEG). Until now it has been unclear how rewards and incentives affect the brain, either with or without the additional use of medication.
To answer these questions, researchers at Nottingham’s Motivation, Inhibition and Development in ADHD Study (MIDAS) used EEG to measure brain activity while children played a simple game. They compared two particular markers of brain activity that relate to attention and impulsivity, and looked at how these were affected by medication and motivational incentives.
The team worked with two groups of children aged nine to 15: one group of 28 children with ADHD and a control group of 28. The children played a computer game in which green aliens were randomly interspersed with less frequent black aliens, each appearing for a short interval. Their task was to ‘catch’ as many green aliens as possible, while avoiding catching black aliens. For each slow or missed response, they would lose one point; they would gain one point for each timely response.
In a test designed to study the effect of incentives, the reward for avoiding catching the black alien was increased to five points; a follow-up test replaced this reward with a five-point penalty for catching the wrong alien.
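As a rough sketch of the scoring rules described above, the snippet below separates the baseline, reward and penalty conditions. The one-point and five-point values come from the article; the exact handling of black-alien trials in the baseline condition, and all names, are illustrative assumptions rather than the study’s actual implementation.

```python
def score_trial(alien, responded, timely, condition="baseline"):
    """Score one trial of the alien game described in the article.

    alien:     "green" (target) or "black" (to be avoided)
    responded: True if the child pressed the catch button
    timely:    True if the response came within the allowed interval
    condition: "baseline", "reward" (5 points for withholding to a black alien),
               or "penalty" (5-point loss for catching a black alien)
    """
    if alien == "green":
        # Article: +1 for each timely response, -1 for slow or missed ones.
        return 1 if (responded and timely) else -1

    # Black alien: the child should withhold the response.
    if condition == "reward":
        return 5 if not responded else 0      # bigger bonus for self-control
    if condition == "penalty":
        return -5 if responded else 0         # bigger cost for impulsivity
    # Baseline: assumed +1 for withholding, -1 for catching (not stated exactly).
    return 1 if not responded else -1


# Quick check of the three incentive conditions on a black-alien trial.
for cond in ("baseline", "reward", "penalty"):
    print(cond, "caught:", score_trial("black", True, True, cond),
          "withheld:", score_trial("black", False, False, cond))
```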
The researchers found that when given their usual dose of methylphenidate, children with ADHD performed significantly better at the tasks than when given no medication, with better attention and reduced impulsivity. Their brain activity appeared to normalise, becoming similar to that of the control group.
Similarly, motivational incentives also helped to normalise brain activity on the two EEG markers, improving attention and reducing impulsivity, though their effect was much smaller than that of medication.
“When the children were given rewards or penalties, their attention and self-control was much improved,” says Dr Maddie Groom, first author of the study. “We suspect that both medication and motivational incentives work by making a task more appealing, capturing the child’s attention and engaging his or her brain response control systems.”
Professor Chris Hollis, who led the study, believes the findings may help to reconcile the often-polarised debate between those who advocate medication and those who favour psychological or behavioural therapy.
“Although medication and behaviour therapy appear to be two very different approaches to treating ADHD, our study suggests that both types of intervention may have much in common in terms of their effect on the brain,” he says. “Both help normalise similar components of brain function and improve performance. What’s more, their effect is additive, meaning they can be more effective when used together.”
The researchers believe that the results lend support from neuroscience to current treatment guidelines for ADHD as set out by the National Institute for Health and Clinical Excellence (NICE). These recommend that behavioural interventions, which have a smaller effect size, are appropriate for moderate ADHD, while medication, with its larger effect size, is added for severe ADHD.
Although the findings suggest that a combination of incentives and medication might work most effectively, and potentially enable children to take lower doses of medication, Professor Hollis believes more work is needed before the results can be applied to everyday clinical practice or classroom situations.
“The incentives and rewards in our study were immediate and consistent, but we know that children with ADHD respond disproportionately less well to delayed rewards,” he says. “This could mean that in the ‘real world’ of the classroom or home, the neural effects of behavioural approaches using reinforcement and rewards may be less effective.”
Read the original research paper HERE (PDF)
Related articles by Zemanta
- Mount Sinai finds meta-cognitive therapy more effective for adult ADHD patients (scienceblog.com)
- What Do Girls with ADHD Look Like As Adults? (psychcentral.com)
Exercise DOES Help Improve Mood! And Just 25 Minutes’ Worth Will Decrease Stress & Increase Energy
Having an Honors degree in Human Movement Studies and having worked in gyms in a former life while studying for my Clinical Masters degree, I have seen this to be true. Of course it seems self-evident, but these researchers have applied great science to produce an excellent, now research-proven written program and workbook. These, along with their recent meta-analytic research review, show just how effective exercise can be in improving mood.
(Credit: PhysOrg.com) — Exercise is a magic drug for many people with depression and anxiety disorders, according to researchers who analyzed numerous studies, and it should be more widely prescribed by mental health care providers.
“Exercise has been shown to have tremendous benefits for mental health,” says Jasper Smits, director of the Anxiety Research and Treatment Program at Southern Methodist University in Dallas. “The more therapists who are trained in exercise therapy, the better off patients will be.”
The traditional treatments of cognitive behavioral therapy and pharmacotherapy don’t reach everyone who needs them, says Smits, an associate professor of psychology.
“Exercise can fill the gap for people who can’t receive traditional therapies because of cost or lack of access, or who don’t want to because of the perceived social stigma associated with these treatments,” he says. “Exercise also can supplement traditional treatments, helping patients become more focused and engaged.”
Smits and Michael Otto, psychology professor at Boston University, presented their findings to researchers and mental health care providers March 6 at the Anxiety Disorder Association of America’s annual conference in Baltimore.
Their workshop was based on their therapist guide “Exercise for Mood and Anxiety Disorders,” with accompanying patient workbook (Oxford University Press, September 2009).
The guide draws on dozens of population-based studies, clinical studies and meta-analytic reviews that demonstrate the efficacy of exercise programs, including the authors’ meta-analysis of exercise interventions for mental health and study on reducing anxiety sensitivity with exercise.
“Individuals who exercise report fewer symptoms of anxiety and depression, and lower levels of stress and anger,” Smits says. “Exercise appears to affect, like an antidepressant, particular neurotransmitter systems in the brain, and it helps patients with depression re-establish positive behaviors. For patients with anxiety disorders, exercise reduces their fears of fear and related bodily sensations such as a racing heart and rapid breathing.”
After patients have passed a health assessment, Smits says, they should work up to the public health dose, which is 150 minutes a week of moderate-intensity activity or 75 minutes a week of vigorous-intensity activity.
At a time when 40 percent of Americans are sedentary, he says, mental health care providers can serve as their patients’ exercise guides and motivators.
“Rather than emphasize the long-term health benefits of an exercise program — which can be difficult to sustain — we urge providers to focus with their patients on the immediate benefits,” he says. “After just 25 minutes, your mood improves, you are less stressed, you have more energy — and you’ll be motivated to exercise again tomorrow. A bad mood is no longer a barrier to exercise; it is the very reason to exercise.”
Smits says health care providers who prescribe exercise also must give their patients the tools they need to succeed, such as the daily schedules, problem-solving strategies and goal-setting featured in his guide for therapists.
“Therapists can help their patients take specific, achievable steps,” he says. “This isn’t about working out five times a week for the next year. It’s about exercising for 20 or 30 minutes and feeling better today.”
The Real “Rain Man”: A Fascinating Look At Kim Peek
Kim Peek was the inspiration for the movie Rain Man, starring Dustin Hoffman and Tom Cruise. Peek, who passed away last year at the age of 58, lived with his father Fran. Peek suffered from a brain development disorder known as agenesis of the corpus callosum. Malformation and absence of the corpus callosum are rare developmental disorders that result in a wide spectrum of symptoms, ranging from severe cerebral palsy, epilepsy and autism to relatively mild learning problems.
While Kim was able to perform extraordinary mental feats, particularly related to memory of historical facts, he struggled with many of the day-to-day tasks of life. This is a fascinating short video of Kim’s visit to London and his explanation of his condition. Enjoy! (The Vodpod video is no longer available.)
Related articles by Zemanta
- Man Who Inspired ‘Rain Man’ Dies At 58 (npr.org)
- “I have so many things in me that you can’t even guess them all.” – Kim Peek. (althouse.blogspot.com)
You Can Trust Me More Than You Can Trust Them: Cynicism & The Trust Gap
Read the original research paper HERE (PDF)
Credit: Jeremy Dean from Psyblog
How do people come to believe that others are so much less trustworthy than themselves?
Much as we might prefer otherwise, there’s solid evidence that, on average, people are quite cynical. When thinking about strangers, studies have shown that people think others are more selfishly motivated than they really are and that others are less helpful than they really are.
Similarly, in financial games psychologists have run in the lab, people are remarkably cynical about the trustworthiness of others. In one experiment people honored the trust placed in them between 80 and 90 percent of the time, but estimated that others would honor their trust only about 50 percent of the time.
Our cynicism towards strangers may develop as early as 7 years old (Mills & Keil, 2005). Surprisingly people are even overly cynical about their loved ones, assuming they will behave more selfishly than they really do (Kruger & Gilovich, 1999).
What could create such a huge gap between how people behave themselves and how they think others behave?
Trust me
People often say that it’s experience that breeds this cynicism rather than a failing in human nature. This is true, but only in a special way.
Think about it like this: the first time you trust a stranger and are betrayed, it makes sense to avoid trusting other strangers in the future. The problem is that when we don’t ever trust strangers, we never find out how trustworthy people in general really are. As a result our estimation of them is governed by fear.
If this argument is correct, it is lack of experience that leads to people’s cynicism, specifically not enough positive experiences of trusting strangers. This idea is tested in a new study published in Psychological Science. Fetchenhauer and Dunning (2010) set up a kind of ideal world in the lab where people were given accurate information about the trustworthiness of strangers to see if that would reduce their cynicism.
They recruited 120 participants to take part in an economic trust game. Each person was given €7.50 and asked if they’d like to hand it over to another person. If the other person made the same decision, the pot would increase to €30. Participants were then asked to estimate whether the other person would opt to give them their half of the total winnings.
The participants watched 56 short videos of the people they were playing against. The researchers set up two experimental conditions, one to mimic what happens in the real world and one to test an ideal world scenario:
1. Real life condition: in this group participants were only told about the other person’s decision when they decided to trust them. The idea is that this condition simulates real life. You only find out if others are trustworthy when you decide to trust them. If you don’t trust someone you never find out whether or not they are trustworthy.
2. Ideal world condition: here participants were given feedback about the trustworthiness of other people whether or not they decided to trust them. This simulates an ideal-world condition where we all know from experience just how trustworthy people are (i.e. much more trustworthy than we think!)
Breaking down cynicism
Once again this study showed that people are remarkably cynical about strangers. Participants in this study thought that only 52 percent of the people they saw in the videos could be trusted to share their winnings. But the actual level of trustworthiness was a solid 80 percent. There’s the cynicism.
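To see why that gap matters, here is a back-of-the-envelope sketch using the figures above. It assumes, purely for illustration, that a truster who hands over the €7.50 receives €15 when the partner shares the €30 pot and nothing when the partner keeps it; the study’s actual payoff rules may differ in detail.

```python
KEEP = 7.50          # payoff for not trusting (keep your own €7.50)
SHARED = 30 / 2      # assumed payoff if you trust and the partner splits the €30 pot
BETRAYED = 0.0       # assumed payoff if you trust and the partner keeps everything

def expected_value_of_trusting(p_share):
    """Expected payoff of handing over the money, given a belief p_share
    that the partner will share the pot."""
    return p_share * SHARED + (1 - p_share) * BETRAYED

# Believed trustworthiness (52%) vs. the actual rate observed in the study (80%).
for label, p in [("cynical belief (52%)", 0.52), ("actual rate (80%)", 0.80)]:
    ev = expected_value_of_trusting(p)
    print(f"{label}: expected value of trusting = €{ev:.2f} "
          f"(vs. €{KEEP:.2f} for keeping the money)")
```

Under the cynical belief, trusting looks barely better than keeping the money; at the true rate it clearly pays, which fits with participants becoming more trusting once they received accurate feedback.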
That cynicism was quickly broken down, though, by giving participants accurate feedback about others’ trustworthiness. People in the ideal world condition noticed that others could be trusted (they upped their estimate to 71 percent) and were also more trusting themselves, handing over the money 70.1 percent of the time.
People in the ideal world condition could even be seen shedding their cynicism as the study went on, becoming more trusting as they noticed that others were trustworthy. This suggests people aren’t inherently cynical; it’s just that we don’t get enough practice at trusting.
Self-fulfilling prophecy
Unfortunately, we don’t live in the ideal world condition and have to put up with only receiving feedback when we decide to trust others. This leaves us relying on psychology studies like this one to tell us that other people are more trustworthy than we imagine (or at least people who take part in psychology studies are!).
Trusting others is also a kind of self-fulfilling prophecy, just as we find in interpersonal attraction. If you try trusting others you’ll find they frequently repay that trust, leading you to be more trusting. On the other hand if you never trust anyone, except those nearest and dearest, then you’ll end up more cynical about strangers.
Visit Psyblog
Multitasking: New Study Challenges Previous Cognitive Theory But Shows That Only A Few “Supertaskers” Can Drive And Phone
Read The Original Research Paper HERE (PDF – internal link)
A new study from University of Utah psychologists found a small group of people with an extraordinary ability to multitask: Unlike 97.5 percent of those studied, they can safely drive while chatting on a cell phone.
These individuals – described by the researchers as “supertaskers” – constitute only 2.5 percent of the population. They are so named for their ability to successfully do two things at once: in this case, talk on a cell phone while operating a driving simulator without noticeable impairment.

Jason Watson, a University of Utah psychologist, negotiates cybertraffic in a driving simulator used to study driver distractions such as cell phones and texting. While many people think they can safely drive and talk on a cell phone at the same time, Watson's new study shows only one in 40 is a "supertasker" who can perform both tasks at once without impairment of the abilities measured in the study. Credit: Valoree Dowell, University of Utah
The study, conducted by psychologists Jason Watson and David Strayer, is now in press for publication later this year in the journal Psychonomic Bulletin and Review.
This finding is important not because it shows people can drive well while on the phone – the study confirms that the vast majority cannot – but because it challenges current theories of multitasking. Further research may lead eventually to new understanding of regions of the brain that are responsible for supertaskers’ extraordinary performance.
“According to cognitive theory, these individuals ought not to exist,” says Watson. “Yet, clearly they do, so we use the supertasker term as a convenient way to describe their exceptional multitasking ability. Given the number of individuals who routinely talk on the phone while driving, one would have hoped that there would be a greater percentage of supertaskers. And while we’d probably all like to think we are the exception to the rule, the odds are overwhelmingly against it. In fact, the odds of being a supertasker are about as good as your chances of flipping a coin and getting five heads in a row.”
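Watson’s coin-flip analogy checks out arithmetically: the chance of five heads in a row is (1/2)^5, or about 3.1 percent, in the same ballpark as the 2.5 percent of participants who qualified as supertaskers. A quick check:

```python
# Probability of flipping five heads in a row vs. the observed supertasker rate.
p_five_heads = 0.5 ** 5
print(f"Five heads in a row: {p_five_heads:.4f} (~{p_five_heads:.1%})")   # 0.0312, ~3.1%
print("Supertasker rate reported in the study: 2.5%")
```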
The researchers assessed the performance of 200 participants on a single task (simulated freeway driving), and again with a second demanding activity added (a cell phone conversation that involved memorizing words and solving math problems). Performance was then measured in four areas: braking reaction time, following distance, memory, and math execution.
As expected, results showed that for the group, performance suffered across the board while driving and talking on a hands-free cell phone.
For those who were not supertaskers and who talked on a cell phone while driving the simulators, it took 20 percent longer to hit the brakes when needed, and following distances increased 30 percent as the drivers failed to keep pace with simulated traffic. Memory performance declined 11 percent, and the ability to do math problems fell 3 percent.
However, when supertaskers talked while driving, they displayed no change in their normal braking times, following distances or math ability, and their memory abilities actually improved 3 percent.
The results are in line with Strayer’s prior studies showing that driving performance routinely declines under “dual-task conditions” – namely talking on a cell phone while driving – and is comparable to the impairment seen in drunken drivers.
Yet contrary to current understanding in this area, the small number of supertaskers showed no impairment on measures of either driving or the cell phone conversation when the two were combined. Further, the researchers found that these individuals’ performance even on the single tasks was markedly better than that of the control group.
“There is clearly something special about the supertaskers,” says Strayer. “Why can they do something that most of us cannot? Psychologists may need to rethink what they know about multitasking in light of this new evidence. We may learn from these very rare individuals that the multitasking regions of the brain are different and that there may be a genetic basis for this difference. That is very exciting. Stay tuned.”
Watson and Strayer are now studying expert fighter pilots under the assumption that those who can pilot a jet aircraft are also likely to have extraordinary multitasking ability.
The current value society puts on multitasking is relatively new, note the authors. As technology expands throughout our environment and daily lives, it may be that everyone – perhaps even supertaskers – eventually will reach the limits of their ability to divide attention across several tasks.
“As technology spreads, it will be very useful to better understand the brain’s processing capabilities, and perhaps to isolate potential markers that predict extraordinary ability, especially for high-performance professions,” Watson concludes.
Information from University of Utah
Sticking To The Status Quo: Why Habits Are So Tough To Break
Read the original Research Paper HERE (PDF internal link)
Kelly McGonigal, Ph.D. @ Psychology Today (excerpted)
A new study from the Proceedings of the National Academy of Sciences confirms what many confused shoppers, dieters, and investors know first-hand: when a decision is difficult, we go with the status quo or choose to do nothing. [...]
Researchers from the Wellcome Trust Centre for Neuroimaging at University College London created a computerized decision-making task. Participants viewed a series of visual tests that asked them to play a referee making a sports call (e.g., whether a tennis ball bounced in or out of bounds).
Before each test, participants were told that one of the responses (in or out) was the “default” for this round. They were asked to hold down a key while they watched. If they continued to hold down the key, they were choosing the default. If they lifted their finger, they were choosing the non-default. Importantly, the default response (in or out) switched randomly between rounds, so that a participant’s response bias (to make a call in or out) would not be confused with their tendency to stick with the status quo.
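Here is a minimal sketch of that hold-or-release logic, with the default side randomized on every round so that a simple bias toward calling “in” or “out” cannot masquerade as a bias toward the status quo. The names and bookkeeping are illustrative assumptions, not the study’s actual task code.

```python
import random

def run_round(true_call, key_still_held):
    """One round of the referee task as described in the article.

    true_call:      the correct answer, "in" or "out"
    key_still_held: True if the participant keeps holding the key (accepts the default),
                    False if they lift their finger (overrides the default)
    Returns (chosen_call, chose_default, correct).
    """
    default = random.choice(["in", "out"])        # default side is randomized each round
    non_default = "out" if default == "in" else "in"
    chosen = default if key_still_held else non_default
    return chosen, key_still_held, chosen == true_call

# Example: a participant who always sticks with the default.
random.seed(1)
results = [run_round(true_call=random.choice(["in", "out"]), key_still_held=True)
           for _ in range(1000)]
accuracy = sum(correct for _, _, correct in results) / len(results)
print(f"Always accepting the default is correct ~{accuracy:.0%} of the time (chance level),")
print("because the randomized default carries no information about the correct call.")
```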
The researchers were interested in two questions:
1) Does the difficulty of the decision influence the participants’ likelihood of choosing the default?
2) Is there a neural signature for choosing the default vs. overriding the status quo? [...]
As the researchers predicted, participants were more likely to stick with the default when the decision was difficult. It didn’t matter whether the default was in or out. If they couldn’t make a confident choice, they essentially chose to do nothing. And as the researchers point out, this tendency led to more errors.
What was happening in the participants’ brains as they chose? The researchers observed an interesting pattern when participants went against the default in a difficult decision. There was increased activity in, and increased connectivity between, two regions: the prefrontal cortex (PFC) and an area of the brain called the subthalamic nucleus (STN). The PFC is well known to be involved in decision-making and self-control. The STN is thought to be important for motivating action.
The researchers’ analyses couldn’t determine for sure what the relationship between the PFC and STN was, but the observations were consistent with the idea that the PFC was driving, or boosting, activity in the STN.
These brain analyses suggest that going against the default in difficult decisions requires some kind of extra motivation or confidence. Otherwise, the decider in our mind is puzzled, and the doer in our mind is paralyzed.
Knowing this can help explain why changing habits can be so difficult. If you aren’t sure why you’re changing, don’t fully believe you’re making the right choice, or question whether what you’re doing will work, you’re likely to settle back on your automatic behaviors. That’s why self-efficacy – the belief that you can make a change and overcome obstacles – is one of the best predictors of successful change. The decider and the doer need a boost of confidence.
It also helps explain why we love formulaic diets, investment strategies, and other decision aids. Formulas feel scientific, tested, and promising. They also give us a new default. We can rely on the rules (no eating after 7 PM, automatically invest X% of your income in mutual funds twice a month) when we’re feeling overwhelmed. A new automatic makes change much easier.
So next time you’re trying to make a change, figure out what your current default is, and remind yourself exactly why it isn’t working. Then look for ways to change your default (clean out your fridge, set up direct deposit) so you don’t have to fight the old default as often. And feel free to be your own cheerleader when the going gets rough. Look for the first evidence (a pound lost here, a dwindling credit card statement there) that what you’re doing is paying off. The status quo is seductive, and we all need a little encouragement to lift our fingers off the keyboard.
Study cited:
Fleming, S.M., Thomas, C.L., & Dolan, R.J. Overcoming status quo bias in the human brain. PNAS. Published online before print March 15, 2010. doi:10.1073/pnas.0910380107
