Peter H Brown Clinical Psychologist

Psychology News & Resources

“Willy Wonky & The Chocolate Fetish”: Why Do Depressed People Eat More Chocolate?

People who are depressed appear to eat more chocolate than those who aren’t

Researchers at UC San Diego and UC Davis examined chocolate consumption and other dietary intake patterns among 931 men and women who were not using antidepressants. The participants were also given a depression screening test. Those who screened positive for possible depression consumed an average of 8.4 servings of chocolate — defined as one ounce of chocolate candy — per month. That compared with 5.4 servings per month among people who were not depressed.

Read Abstract Here

Those who scored highest on the mood tests, indicating possible major depression, consumed an average of 11.8 servings per month. The findings were similar among women and men.

When the researchers controlled for other dietary factors that could be linked to mood — such as caffeine, fat and carbohydrate intake — they found only chocolate consumption correlated with mood.

It’s not clear how the two are linked, the authors wrote. It could be that depression stimulates chocolate cravings as a form of self-treatment. Chocolate prompts the release of certain chemicals in the brain, such as dopamine, that produce feelings of pleasure.

There is no evidence, however, that chocolate has any sustained benefit for mood. Like alcohol, chocolate may provide a short-term boost in mood followed by a return to depression or a worsened mood. A study published in 2007 in the journal Appetite found that eating chocolate improved mood, but only for about three minutes.

It’s also possible that depressed people seek chocolate to improve mood but that the trans fats in some chocolate products interfere with the body’s production of omega-3 fatty acids, the authors said in the paper. Omega-3 fatty acids are thought to improve mental health.


Another theory is that chocolate consumption contributes to depression or that some physiological mechanism, such as stress, drives both depression and chocolate cravings.

“It’s unlikely that chocolate makes people depressed,” said Marcia Levin Pelchat, a psychologist who studies food cravings at the Monell Chemical Senses Center in Philadelphia. She was not involved in the new study. “Most people believe the beneficial effects of chocolate are on mood and that they are learned. You eat chocolate; it makes you feel good, and sometime when you’re feeling badly it occurs to you, ‘Gee, if I eat some chocolate I might feel better.’ ”

Chocolate is popular in North America and Britain, she said. But in other cultures, different foods are considered pleasure-inducing pick-me-ups.

“In the United States, people consider chocolate really tasty,” Pelchat said. “It has a high cultural value. It’s an appropriate gift for Valentine’s Day. But in China, you might give stuffed snails to someone you really like.”

Source: LA Times



April 30, 2010 | Books, Cognition, depression, Health Psychology

Winners Are Grinners: Even If There’s Nothing to Win!

Whether it’s for money, marbles or chalk, the brains of reward-driven people keep their game faces on, helping them win at every step of the way. Surprisingly, they win most often when there is no reward.

Read Abstract Here

That’s the finding of neuroscientists at Washington University in St. Louis, who tested 31 randomly selected subjects with word games, some of which had monetary rewards of either 25 or 75 cents per correct answer, others of which had no money attached.

Subjects were given a short list of five words to memorize in a matter of seconds, then a 3.5-second pause, then a few seconds to respond to a solitary word that either had been on the list or had not. Test performance had no consequence in some trials, but in others a computer graded the responses, providing an opportunity to win either 25 or 75 cents for quick and accurate answers. Even during these periods, subjects were sometimes alerted that their performance would not be rewarded on that trial.
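
For readers who like to see a protocol concretely, here is a minimal sketch of that trial structure in Python. The five-word lists, the 3.5-second pause and the 25- or 75-cent rewards come from the description above; the word pool, function names and the 50/50 trial mix are hypothetical illustration, not details from the study.

```python
import random

# Hypothetical word pool; the article doesn't list the actual stimuli.
WORD_POOL = ["apple", "river", "candle", "tiger", "marble",
             "violin", "garden", "rocket", "pencil", "anchor"]
REWARDS = [0.25, 0.75]  # dollars on offer for a quick, accurate answer

def run_trial(reward_cued: bool) -> dict:
    """One trial as described: memorize five words, wait 3.5 s, judge a probe."""
    study_list = random.sample(WORD_POOL, 5)   # shown for a matter of seconds
    # ... 3.5-second retention interval elapses here ...
    probe_is_old = random.random() < 0.5       # probe either was on the list or not
    probe = (random.choice(study_list) if probe_is_old
             else random.choice([w for w in WORD_POOL if w not in study_list]))
    reward = random.choice(REWARDS) if reward_cued else 0.0
    return {"study_list": study_list, "probe": probe,
            "probe_was_on_list": probe_is_old, "reward_if_correct": reward}

# A block mixing rewarded and explicitly unrewarded trials, as in the study.
block = [run_trial(reward_cued=random.random() < 0.5) for _ in range(20)]
```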

Prior to testing, subjects completed a battery of personality tests that rated their degree of competitiveness and their sensitivity to monetary rewards.

The study, designed to test the hypothesis that excitement in the brains of the most monetary-reward-sensitive subjects would slacken during trials that did not pay, is co-authored by Koji Jimura, PhD, a postdoctoral researcher, and Todd Braver, PhD, a professor, both in the psychology department in Arts & Sciences. Braver is also a member of the neuroscience program and radiology department in the university’s School of Medicine.

But the researchers found a paradoxical result: the performance of the most reward-driven individuals was actually most improved – relative to the less reward-driven – in the trials that paid nothing, not the ones in which there was money at stake.

Even more striking, brain scans taken using functional magnetic resonance imaging (fMRI) showed a change in the pattern of activity during the non-rewarded trials within the lateral prefrontal cortex (PFC), located right behind the outer corner of the eyebrow, an area that is strongly linked to intelligence, goal-driven behavior and cognitive strategies. The change in lateral PFC activity was statistically linked to the extra behavioral benefits observed in the reward-driven individuals.

The researchers suggest that this change in lateral PFC activity patterns represents a flexible shift in response to the motivational importance of the task, translating this into a superior task strategy that the researchers term “proactive cognitive control.” In other words, once the rewarding motivational context is established in the brain indicating there is a goal-driven contest at hand, the brain actually rallies its neuronal troops and readies itself for the next trial, whether it’s for money or not.

“It sounds reasonable now, but when I happened upon this result, I couldn’t believe it because we expected the opposite results,” says Jimura, first author of the paper. “I had to analyze the data thoroughly to persuade myself. The important finding of our study is that the brains of these reward-sensitive individuals do not respond to the reward information on individual trials. Instead, they show persistent motivation, even in the absence of a reward. You’d think you’d have to reward them on every trial to do well. But it seems that their brains recognized the rewarding motivational context that carried over across all the trials.”


The finding sheds more light on the workings of the lateral PFC and provides potential behavioral clues about personality, motivation, goals and cognitive strategies. The research has important implications for understanding the nature of persistent motivation, how the brain creates such states, and why some people seem able to use motivation more effectively than others. By understanding the brain circuitry involved, it might be possible to create motivational situations that are more effective for all individuals, not just the most reward-driven ones, or to develop drug therapies for individuals who suffer from chronic motivational problems. The results were published April 26 in the early online edition of the Proceedings of the National Academy of Sciences.

Everyone knows of competitive people who have to win, whether in a game of HORSE, golf or the office NCAA basketball tournament pool. The findings might tell researchers something about the competitive drive.

The researchers are interested in the signaling chain that ignites the prefrontal cortex when it acts on reward-driven impulses, and they speculate that the brain chemical dopamine could be involved, a potential direction for future studies. Dopamine neurons, once thought to respond to a host of pleasurable situations but now considered more of a learning or predictive signal, might respond to cues that let the lateral PFC know that it’s in for something good. This signal might help keep information about the goals, rules or best strategies for the task active in mind, increasing the chances of obtaining the desired outcome.

In the context of this study, when a 75-cent reward is available for a trial, the dopamine-releasing neurons could be sending signals to the lateral PFC that “jump start” it to do the right procedures to get a reward.

“It would be like the dopamine neurons recognize a cup of Ben and Jerry’s ice cream, and tell the lateral PFC the right action strategy to get the reward – to grab a spoon and bring the ice cream to your mouth,” says Braver. “We think that the dopamine neurons fire to the cue rather than the reward itself, especially after the brain learns the relationship between the two. We’d like to explore that some more.”

They also are interested in the “reward carryover state,” or the proactive cognitive strategy that keeps the brain excited even in gaps, such as pauses between trials or trials without rewards. They might consider a study in which rewards are far fewer.

“It’s possible we’d see more slackers with less rewards,” Braver says. “That might have an effect on the reward carryover state. There are a host of interesting further questions that this work brings up which we plan to pursue.”

Source: Washington University in St. Louis


April 28, 2010 | Addiction, brain, Cognition, research, Social Psychology

Remember What Happened? Only In Your Dreams!

It is by now well established that sleep can be an important tool when it comes to enhancing memory and learning skills. And now, a new study sheds light on the role that dreams play in this important process.

Read Abstract Here

Led by scientists at Beth Israel Deaconess Medical Center (BIDMC), the new findings suggest that dreams may be the sleeping brain’s way of telling us that it is hard at work on the process of memory consolidation, integrating our recent experiences to help us with performance-related tasks in the short run and, in the long run, translating this material into information that will have widespread application to our lives. The study is reported in the April 22 online issue of Current Biology.

“What’s got us really excited is that after nearly 100 years of debate about the function of dreams, this study tells us that dreams are the brain’s way of processing, integrating and really understanding new information,” explains senior author Robert Stickgold, PhD, Director of the Center for Sleep and Cognition at BIDMC and Associate Professor of Psychiatry at Harvard Medical School. “Dreams are a clear indication that the sleeping brain is working on memories at multiple levels, including ways that will directly improve performance.”

At the outset, the authors hypothesized that dreaming about a learning experience during nonrapid eye movement (NREM) sleep would lead to improved performance on a hippocampus-dependent spatial memory task. (The hippocampus is a region of the brain responsible for storing spatial memory.)

To test this hypothesis, the investigators had 99 subjects spend an hour training on a “virtual maze task,” a computer exercise in which they were asked to navigate through and learn the layout of a complex 3D maze with the goal of reaching an endpoint as quickly as possible. Following this initial training, participants were assigned to either take a 90-minute nap or to engage in quiet activities but remain awake. At various times, subjects were also asked to describe what was going through their minds, or in the case of the nappers, what they had been dreaming about. Five hours after the initial exercise, the subjects were retested on the maze task.

The results were striking.

The non-nappers showed no signs of improvement on the second test – even if they had reported thinking about the maze during their rest period. Similarly, the subjects who napped, but who did not report experiencing any maze-related dreams or thoughts during their sleep period, showed little, if any, improvement. But, the nappers who described dreaming about the task showed dramatic improvement, 10 times more than that shown by those nappers who reported having no maze-related dreams.

“These dreamers described various scenarios – seeing people at checkpoints in a maze, being lost in a bat cave, or even just hearing the background music from the computer game,” explains first author Erin Wamsley, PhD, a postdoctoral fellow at BIDMC and Harvard Medical School. These interpretations suggest that not only was sleep necessary to “consolidate” the information, but that the dreams were an outward reflection that the brain had been busy at work on this very task.

Of particular note, say the authors, the subjects who performed better were not more interested or motivated than the other subjects. But, they say, one distinct difference was noted.

“The subjects who dreamed about the maze had done relatively poorly during training,” explains Wamsley. “Our findings suggest that if something is difficult for you, it’s more meaningful to you and the sleeping brain therefore focuses on that subject – it ‘knows’ you need to work on it to get better, and this seems to be where dreaming can be of most benefit.”

Furthermore, this memory processing was dependent on being in a sleeping state. Even when a waking subject “rehearsed and reviewed” the path of the maze in his mind, if he did not sleep, then he did not see any improvement, suggesting that there is something unique about the brain’s physiology during sleep that permits this memory processing.

“In fact,” says Stickgold, “this may be one of the main goals that led to the evolution of sleep. If you remain awake [following the test] you perform worse on the subsequent task. Your memory actually decays, no matter how much you might think about the maze.

“We’re not saying that when you learn something it is dreaming that causes you to remember it,” he adds. “Rather, it appears that when you have a new experience it sets in motion a series of parallel events that allow the brain to consolidate and process memories.”

Ultimately, say the authors, the sleeping brain seems to be accomplishing two separate functions: While the hippocampus is processing information that is readily understandable (i.e. navigating the maze), at the same time, the brain’s higher cortical areas are applying this information to an issue that is more complex and less concrete (i.e. how to navigate through a maze of job application forms).


“Our [nonconscious] brain works on the things that it deems are most important,” adds Wamsley. “Every day, we are gathering and encountering tremendous amounts of information and new experiences,” she adds. “It would seem that our dreams are asking the question, ‘How do I use this information to inform my life?’”

Study coauthors include BIDMC investigators Matthew Tucker, Joseph Benavides and Jessica Payne (currently of the University of Notre Dame).

This study was supported by grants from the National Institutes of Health.

BIDMC is a patient care, teaching and research affiliate of Harvard Medical School, and consistently ranks in the top four in National Institutes of Health funding among independent hospitals nationwide. BIDMC is a clinical partner of the Joslin Diabetes Center and a research partner of the Dana-Farber/Harvard Cancer Center. BIDMC is the official hospital of the Boston Red Sox.

Source: BIDMC



April 27, 2010 | Books, brain, Cognition, Health Psychology, research

Brain Training Or Just Brain Straining?: The Benefits Of Brain Exercise Software Are Unclear

You’ve probably heard it before: the brain is a muscle that can be strengthened. It’s an assumption that has spawned a multimillion-dollar computer game industry of electronic brain-teasers and memory games. But in the largest study of such brain games to date, a team of British researchers has found that healthy adults who undertake computer-based “brain-training” do not improve their mental fitness in any significant way.

Read The Original Research Paper (Draft PDF)

The study, published online Tuesday by the journal Nature, tracked 11,430 participants through a six-week online study. The participants were divided into three groups: the first group undertook basic reasoning, planning and problem-solving activities (such as choosing the “odd one out” of a group of four objects); the second completed more complex exercises of memory, attention, math and visual-spatial processing, which were designed to mimic popular “brain-training” computer games and programs; and the control group was asked to use the Internet to research answers to trivia questions.

All participants were given a battery of unrelated “benchmark” cognitive-assessment tests before and after the six-week program. These tests, designed to measure overall mental fitness, were adapted from reasoning and memory tests that are commonly used to gauge brain function in patients with brain injury or dementia. All three study groups showed marginal — and identical — improvement on these benchmark exams.

But the improvement had nothing to do with the interim brain-training, says study co-author Jessica Grahn of the Cognition and Brain Sciences Unit in Cambridge. Grahn says the results confirm what she and other neuroscientists have long suspected: people who practice a certain mental task — for instance, remembering a series of numbers in sequence, a popular brain-teaser used by many video games — improve dramatically on that task, but the improvement does not carry over to cognitive function in general. (Indeed, all the study participants improved in the tasks they were given; even the control group got better at looking up answers to obscure questions.) The “practice makes perfect” phenomenon probably explains why the study participants improved on the benchmark exams, says Grahn — they had all taken it once before. “People who practiced a certain test improved at that test, but improvement does not translate beyond anything other than that specific test,” she says.

The authors believe the study, which was run in conjunction with a BBC television program called “Bang Goes the Theory,” undermines the sometimes outlandish claims of many brain-boosting websites and digital games. According to a past TIME.com article by Anita Hamilton, HAPPYneuron, an example not cited by Grahn, is a $100 Web-based brain-training site that invites visitors to “give the gift of brain fitness” and claims its users saw “16%+ improvement” through exercises such as learning to associate a bird’s song with its species and shooting basketballs through virtual hoops. Hamilton also notes Nintendo’s best-selling Brain Age game, which promises to “give your brain the workout it needs” through exercises like solving math problems and playing rock, paper, scissors on the handheld DS. “The widely held belief that commercially available computerized brain-training programs improve general cognitive function in the wider population lacks empirical support,” the paper concludes.


Not all neuroscientists agree with that conclusion, however. In 2005, Torkel Klingberg, a professor of cognitive neuroscience at the Karolinska Institute in Sweden, used brain imaging to show that brain-training can alter the number of dopamine receptors in the brain — dopamine is a neurotransmitter involved in learning and other important cognitive functions. Other studies have suggested that brain-training can help improve cognitive function in elderly patients and those in the early stages of Alzheimer’s disease, but the literature is contradictory.

Klingberg has developed a brain-training program called Cogmed Working Memory Training, and owns shares in the company that distributes it. He tells TIME that the Nature study “draws a large conclusion from a single negative finding” and that it is “incorrect to generalize from one specific training study to cognitive training in general.” He also criticizes the design of the study and points to two factors that may have skewed the results.

On average the study volunteers completed 24 training sessions, each about 10 minutes long — for a total of three hours spent on different tasks over six weeks. “The amount of training was low,” says Klingberg. “Ours and others’ research suggests that 8 to 12 hours of training on one specific test is needed to get a [general improvement in cognition].”

Second, he notes that the participants were asked to complete their training by logging onto the BBC Lab UK website from home. “There was no quality control. Asking subjects to sit at home and do tests online, perhaps with the TV on or other distractions around, is likely to result in bad quality of the training and unreliable outcome measures. Noisy data often gives negative findings,” Klingberg says.

Brain-training research has received generous funding in recent years — and not just from computer game companies — as a result of the proven effect of neuroplasticity, the brain’s ability to remodel its nerve connections after experience. The stakes are high. If humans could control that process and bolster cognition, it could have a transformative effect on society, says Nick Bostrom of Oxford University’s Future of Humanity Institute. “Even a small enhancement in human cognition could have a profound effect,” he says. “There are approximately 10 million scientists in the world. If you could improve their cognition by 1%, the gain would hardly be noticeable in a single individual. But it could be equivalent to instantly creating 100,000 new scientists.”

For now, there is no nifty computer game that will turn you into Einstein, Grahn says. But there are other proven ways to improve cognition, albeit only by small margins. Consistently getting a good night’s sleep, exercising vigorously, eating right and maintaining healthy social activity have all been shown to help maximize a brain’s potential over the long term.

What’s more, says Grahn, neuroscientists and psychologists have yet to even agree on what constitutes high mental aptitude. Some experts argue that physical skill, which stems from neural pathways, should be considered a form of intelligence — so, masterful ballet dancers and basketball players would be considered geniuses.

Jason Allaire, co-director of the Gains Through Gaming lab at North Carolina State University, says the Nature study makes sense; rather than finding a silver bullet for brain enhancement, he says, “it’s really time for researchers to think about a broad or holistic approach that exercises or trains the mind in general in order to start to improve cognition more broadly.”

Or, as Grahn puts it, when it comes to mental fitness, “there are no shortcuts.”

Credit: Time.com



April 23, 2010 | Age & Ageing, Books, brain, Cognition, Education, Health Psychology, Internet, research

Fast Food, Fast You! How Fast Food Makes You Impatient

Like it or not, the golden arches of McDonald’s are one of the most easily recognised icons of the modern world. The culture they represent is one of instant gratification and saved time, of ready-made food that can be bought cheaply and eaten immediately. Many studies have looked at the effects of these foods on our waistlines, but their symbols and brands are such a pervasive part of our lives that you’d expect them to influence the way we think too.

Read the original research paper (PDF)

And so they do – Chen-Bo Zhong and Sanford DeVoe have found that fast food can actually induce haste and impatience, in ways that have nothing to do with eating. They showed that subliminal exposure to fast food symbols, such as McDonald’s golden arches, can actually increase people’s reading speed. Just thinking about these foods can boost our preferences for time-saving goods and even nudge us towards financial decisions that value immediate gains over future returns. Fast food, it seems, is very appropriately named.

Zhong and DeVoe asked 57 students to stare at the centre of a computer screen while ignoring a stream of objects flashing past in the corners. For some of the students, these flashes included the logos of McDonald’s, KFC, Subway, Taco Bell, Burger King and Wendy’s, all appearing for just 12 milliseconds. We can’t consciously recognise images that appear this quickly and, indeed, none of the students said that they saw anything other than blocks of colour.
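
As an aside on how such brief exposures are possible: 12 milliseconds is shorter than a single frame on a standard 60 Hz display, so presentations this fast require a faster monitor. A quick check of frame durations at common refresh rates (the study’s actual hardware isn’t stated above, so this is context, not a study detail):

```python
# Duration of a single video frame at common monitor refresh rates.
for hz in (60, 75, 85, 100):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")

# 60 Hz -> 16.7 ms; 75 Hz -> 13.3 ms; 85 Hz -> 11.8 ms; 100 Hz -> 10.0 ms
# An 85 Hz display gets one frame down to about 11.8 ms, close to the 12 ms used.
```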

The students were then asked to read out a 320-word description of Toronto and those who had subconsciously seen the fast food logos were faster. Even though they had no time limit, they whizzed through the text in just 70 seconds. The other students, who were shown blocks of colours in place of the logos, took a more leisurely 84 seconds.
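
Expressed as reading speed, using only the figures quoted above:

```python
WORDS = 320  # length of the Toronto passage

for group, seconds in [("subliminally primed with fast-food logos", 70),
                       ("shown plain colour blocks", 84)]:
    print(f"{group}: {WORDS / seconds * 60:.0f} words per minute")

# subliminally primed with fast-food logos: 274 words per minute
# shown plain colour blocks: 229 words per minute
```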

Zhong and DeVoe also found that thoughts of fast food could sway students towards more efficient, time-saving products. They asked 91 students to complete a marketing survey by saying how much they wanted each of five product pairs. One option in each pair was more time-efficient (as rated by an independent panel of 54 people), such as 2-in-1 shampoo rather than regular shampoo or a four-slice toaster versus a one-slice one.

If the students had previously thought about the last time they ate at a fast food joint, they were more likely to prefer the time-saving products than students who had thought about their last visit to the grocery store. Zhong and DeVoe say that this supports their idea that thinking about fast food makes people impatient. [This seems to be] the weakest part of their study, for products like 2-in-1 shampoo are as much about saving money (perhaps more so) as they are about saving time. Fast food is not only served quickly but priced cheaply, and it may be this aspect that altered the students’ preference.


However, the duo addressed this issue in their third experiment. They asked 58 students, each randomly assigned one of four different logos (McDonald’s, KFC or one of two cheap diners), to judge the logo’s aesthetic qualities. Later, they were told that they could either have $3 immediately or a larger sum in a week. They had to say how much it would take to make them delay their windfall.

As predicted, those who considered the fast food logos were more impatient, and demanded significantly more money to forgo their smaller immediate payment in favour of a larger future one. It seems that they put a greater price on instant gratification than on larger future returns.
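
The logic of this last measure is standard temporal discounting: the more money a person demands in a week before giving up $3 today, the more steeply they discount the future. A minimal sketch; the dollar figures for the two hypothetical subjects are invented for illustration, not numbers from the study:

```python
def weekly_discount_factor(amount_now: float, amount_later: float) -> float:
    """If a person is indifferent between amount_now today and amount_later
    in a week, they value next week's money at this fraction of face value."""
    return amount_now / amount_later

# Hypothetical subjects: one demands $3.60 to wait a week, the other $4.50.
patient = weekly_discount_factor(3.00, 3.60)    # 0.83 on the dollar
impatient = weekly_discount_factor(3.00, 4.50)  # 0.67 on the dollar
print(f"patient: {patient:.2f}, impatient: {impatient:.2f}")
# A smaller factor means a steeper discount on future money, i.e. more
# impatience, the pattern seen in the fast-food-logo group.
```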

Of course, these results can’t tell us if fast food actually contributes to a culture of impatience and hurry, or if it’s just a symptom of it. Nor do they say anything about whether this effect is good or bad. That would all depend on context. As Zhong and DeVoe note, a brisk walking speed is a good thing if you’re trying to get to a meeting but it would be a sign of impatience if you’re aiming for a leisurely stroll in the park.

Their study does, however, suggest that fast food and the need to save time are inextricably linked in our minds, so that even familiar brands can make us behave more hastily. They could even affect our economic decisions, harming our finances in the long run. As Zhong and DeVoe say, even our leisure activities are “experienced through the coloured glasses of impatience” and “it is possible that a fast food culture that extols saving time not only changes the way people eat, but also fundamentally alters the way they experience events”.

Read the original research paper (PDF)

Credit: Discover Magazine


April 17, 2010 | Books, Cognition, Eating Disorder, Health Psychology, research, Social Psychology, stress, Technology

Sticks & Stones AND Words Can Hurt You: How Words Can Cause Physical Pain

“Watch out, it’ll hurt for a second.” Not only children but also many adults get uneasy when they hear those words from their doctor. And as soon as the needle touches their skin, the piercing pain can be felt very clearly. “After such an experience, it is enough to simply imagine a needle at the next vaccination appointment to activate our pain memory,” says Prof. Dr. Thomas Weiss from the Friedrich Schiller University Jena.

Read the original research paper (PDF)

As the scientist and his team from the Department of Biological and Clinical Psychology have now shown for the first time, it is not only painful memories and associations that put our pain memory on alert. “Even verbal stimuli lead to reactions in certain areas of the brain,” says Prof. Weiss. As soon as we hear words like “tormenting,” “gruelling” or “plaguing,” exactly those areas of the brain that process the corresponding pain are activated. The psychologists at Jena University examined this phenomenon using functional magnetic resonance imaging (fMRI), investigating how healthy subjects process words associated with the experience of pain. To rule out reactions driven simply by negative affect, the subjects were also confronted with negatively connoted words like “terrifying,” “horrible” or “disgusting” alongside the actual pain words.

“Subjects performed two tasks,” explains Maria Richter, a doctoral candidate on Weiss’s team. “In the first task, subjects were asked to imagine situations corresponding to the words,” the Jena psychologist says. In the second task, subjects read the words again while distracted by a brain-teaser. “In both cases we observed a clear activation of the pain matrix in the brain by pain-associated words,” Richter states. Other negatively connoted words, however, did not activate those regions, and neither neutrally nor positively connoted words produced comparable activity patterns.


Can words intensify chronic pain?

“These findings show that words alone are capable of activating our pain matrix,” emphasizes Prof. Weiss. Storing painful experiences is biologically advantageous, since it allows us to avoid future situations that might be dangerous. “However, our results suggest that verbal stimuli play a more important role than we have thought so far.” For the Jena psychologists, an open question is what role verbal confrontation with pain plays for chronic pain patients. “They tend to speak a lot about their experience of pain to their physician or physiotherapist,” Richter says. It is possible that those conversations intensify the activity of the pain matrix in the brain and thereby intensify the experience of pain. This is what the Jena psychologists want to clarify in a further study.

Until then, it can’t hurt to talk a little less about pain. Maybe then the next injection will be only half as painful.

Read the original research paper (PDF)

Adapted from ScienceDaily, March 31, 2010



April 14, 2010 | Age & Ageing, anxiety, Cognition, Pain, research, Technology

Ripped Off!: The Psychological Cost Of Wearing A Fake Rolex (Or Other Knockoffs)

Credit: Wray Herbert, The Huffington Post, April 7, 2010

Read the original research paper HERE (PDF)

Within just a few blocks of my office, street vendors will sell me a Versace t-shirt or a silk tie from Prada, cheap. Or I could get a deal on a Rolex, or a chic pair of Ray Ban shades. These aren’t authentic brand name products, of course. They’re inexpensive replicas. But they make me look and feel good, and I doubt any of my friends can tell the difference.

That’s why we buy knockoffs, isn’t it? To polish our self-image–and broadcast that polished version of our personality to the world–at half the price? But does it work? After all, we first have to convince ourselves of our idealized image if we are going to sway anyone else. Can we really become Ray Ban-wearing, Versace-bedecked sophisticates in our own mind–just by dressing up?

New research suggests that knockoffs may not work as magically as we’d like–and indeed may backfire. Three psychological scientists–Francesca Gino of the University of North Carolina at Chapel Hill, Michael Norton of Harvard Business School, and Dan Ariely of Duke–have been exploring the power and pitfalls of fake adornment in the lab. They wanted to see if counterfeit stuff might have hidden psychological costs, warping our actions and attitudes in undesirable ways.

Here’s an example of their work. The scientists recruited a large sample of young women and had them wear pricey Chloe sunglasses. The glasses were the real thing, but half the women thought they were wearing knockoffs. They wanted to see if wearing counterfeit shades–a form of dishonesty–might actually make the women act dishonestly in other ways.

So they had them perform a couple of tasks–tasks that presented opportunities for lying and cheating. In one, for example, the women worked on a complicated set of mathematical puzzles–a task they couldn’t possibly complete in the time allowed. When time elapsed, the women were told to score themselves on the honor system–and to take money for each correct score. Unbeknownst to them, the scientists were monitoring both their work and their scoring.

And guess what. The women wearing the fake Chloe shades cheated more–considerably more. Fully 70 percent inflated their performance when they thought nobody was checking on them–and in effect stole cash from the coffer. To double-check this distressing result, the scientists put the women through a completely different task, one that forced a choice between the right answer and the more profitable answer. And again the Chloe-wearing women pocketed the petty cash. Notably, the women cheated not only when they expressed a preference for the cheap knockoffs, but also when the real and fake designer glasses were randomly handed out. So it appears that the very act of wearing the counterfeit eyewear triggered the lying and cheating.


This is bizarre and disturbing, but it gets worse. The psychologists wondered if inauthentic image-making might not only corrupt personal ethics, but also lead to a generally cynical attitude toward other people. In other words, if wearing counterfeit stuff makes people feel inauthentic and behave unethically, might they see others as phony and unethical, too? To test this, they again handed out genuine and counterfeit Chloe shades, but this time they had the volunteers complete a survey about “someone they knew.” Would this person use an express line with too many groceries? Pad an expense report? Take home office supplies? There were also more elaborate scenarios involving business ethics. The idea was that all the answers taken together would characterize each volunteer as having a generally positive view of others–or a generally cynical view.

Cynical, without question. Compared to volunteers who were wearing authentic Chloe glasses, those wearing the knockoffs saw other people as more dishonest, less truthful, and more likely to act unethically in business dealings.

So what’s going on here? Well, the scientists ran a final experiment to answer this question, and here are the ironic results they report online this week in the journal Psychological Science: Wearing counterfeit glasses not only fails to bolster our ego and self-image the way we hope, it actually undermines our internal sense of authenticity. “Faking it” makes us feel like phonies and cheaters on the inside, and this alienated, counterfeit “self” leads to cheating and cynicism in the real world.

Counterfeiting is a serious economic and social problem, epidemic in scale. Most people buy these fake brands because they are a lot cheaper, but this research suggests there may be a hidden moral cost yet to be tallied.

Read the original research paper HERE (PDF)



April 9, 2010 | Books, Cognition, Identity, Resources, Social Psychology

Mid Life – What’s The Crisis?: Why Self Esteem Peaks In The Middle-Aged

Credit: LiveScience

Read the original research article HERE (PDF)

Bad vision and other physical ailments aren’t the only things that seem to get worse as people grow old. Self-esteem also declines around the age of retirement, a new study finds.

The study involved 3,617 American men and women ranging in age from 25 to 104. Self-esteem was lowest among young adults, but increased throughout adulthood, peaking at age 60, before it started to decline.

Several factors might explain this trend, the researchers say.

“Midlife is a time of highly stable work, family and romantic relationships. People increasingly occupy positions of power and status, which might promote feelings of self-esteem,” said study author Richard Robins of the University of California, Davis. “In contrast, older adults may be experiencing a change in roles such as an empty nest, retirement and obsolete work skills in addition to declining health.”

Measuring self-esteem

The participants were surveyed four times between 1986 and 2002. They were asked to rate their level of agreement with statements such as: “I take a positive attitude toward myself,” which suggests high self-esteem; “At times I think I am no good at all,” and “All in all, I am inclined to feel that I am a failure,” which both suggest low self-esteem.
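
Scales like this are typically scored by reverse-keying the negatively worded items, so that higher totals always mean higher self-esteem. Below is a minimal sketch of that bookkeeping; the item wording is quoted above, but the 5-point response scale and the scoring function are standard survey practice rather than details reported from this study:

```python
# (item text, reverse_keyed); agreement rated 1-5, strongly disagree to agree.
ITEMS = [
    ("I take a positive attitude toward myself", False),
    ("At times I think I am no good at all", True),
    ("All in all, I am inclined to feel that I am a failure", True),
]

def self_esteem_score(ratings, scale_max=5):
    """Sum the ratings, flipping reverse-keyed items so high always = high."""
    total = 0
    for (_, reverse), rating in zip(ITEMS, ratings):
        total += (scale_max + 1 - rating) if reverse else rating
    return total

# Strong agreement with the positive item, disagreement with the negative ones:
print(self_esteem_score([4, 2, 1]))  # 4 + (6 - 2) + (6 - 1) = 13 out of 15
```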

Subjects also indicated their demographics, relationship satisfaction, and whether they had experienced stressful life events, including suddenly losing a job, being the victim of a violent crime, or experiencing the death of a parent or child.

On average, women had lower self-esteem than men throughout most of adulthood, but self-esteem levels converged as men and women reached their 80s and 90s. Blacks and whites had similar self-esteem levels throughout young adulthood and middle age. In old age, average self-esteem among blacks dropped much more sharply than self-esteem among whites. This result held even after accounting for differences in income and health.

Future research should further explore these ethnic differences, which might lead to better interventions aimed at improving self-esteem, the study authors say.


More self-esteem factors

Education, income, health and employment status all had some effect on the self-esteem trajectories, especially as people aged.

“People who have higher incomes and better health in later life tend to maintain their self-esteem as they age,” said the study’s lead author, Ulrich Orth.

“We cannot know for certain that more wealth and better health directly lead to higher self-esteem, but it does appear to be linked in some way. For example, it is possible that wealth and health are related to feeling more independent and better able to contribute to one’s family and society, which in turn bolsters self-esteem.”

People of all ages in satisfying and supportive relationships tend to have higher self-esteem, according to the findings.

However, despite maintaining higher self-esteem throughout their lives, people in happy relationships experienced the same drop in self-esteem during old age as people in unhappy relationships.

“Thus, being in a happy relationship does not protect a person against the decline in self-esteem that typically occurs in old age,” said study author Kali H. Trzesniewski of the University of Western Ontario.

With medical advances, the drop in self-esteem might occur later for baby boomers, Orth said. Boomers might be healthier for longer and, therefore, able to work and earn money longer.

Read the original research article HERE (PDF)



April 6, 2010 | Age & Ageing, Books, Cognition, depression, Education, Health Psychology, research, Resilience, Resources, Seniors

The Real “Rain Man”: A Fascinating Look At Kim Peek

Kim Peek (1951-2009). Image via Wikipedia

Kim Peek was the inspiration for the movie Rain Man, starring Dustin Hoffman and Tom Cruise. Peek, who passed away last year at the age of 58, lived with his father, Fran. Peek suffered from a brain development disorder known as agenesis of the corpus callosum. Malformation and absence of the corpus callosum are rare developmental disorders that result in a wide spectrum of symptoms, ranging from severe cerebral palsy, epilepsy and autism to relatively mild learning problems.

While Kim was able to perform extraordinary mental feats, particularly related to his memory of historical facts, he struggled with many of the day-to-day tasks of life. This is a fascinating short video of Kim’s visit to London and his explanation of his condition. Enjoy!


April 4, 2010 | Aspergers Syndrome, Autism, Books, brain, Cognition, Resources, video

You Can Trust Me More Than You Can Trust Them: Cynicism & The Trust Gap

Read the original research paper HERE (PDF)

Credit: Jeremy Dean from Psyblog

How do people come to believe that others are so much less trustworthy than themselves?

Much as we might prefer otherwise, there’s solid evidence that, on average, people are quite cynical. When thinking about strangers, studies have shown that people think others are more selfishly motivated than they really are and that others are less helpful than they really are.

Similarly, in financial games psychologists have run in the lab, people are remarkably cynical about the trustworthiness of others. In one experiment people honored the trust placed in them between 80 and 90 percent of the time, but estimated that others would honor their trust only about 50 percent of the time.

Our cynicism towards strangers may develop as early as 7 years old (Mills & Keil, 2005). Surprisingly people are even overly cynical about their loved ones, assuming they will behave more selfishly than they really do (Kruger & Gilovich, 1999).

What could create such a huge gap between how people behave themselves and how they think others behave?

Trust me

People often say that it’s experience that breeds this cynicism rather than a failing in human nature. This is true, but only in a special way.

Think about it like this: the first time you trust a stranger and are betrayed, it makes sense to avoid trusting other strangers in the future. The problem is that when we don’t ever trust strangers, we never find out how trustworthy people in general really are. As a result our estimation of them is governed by fear.

If this argument is correct, it is lack of experience that leads to people’s cynicism, specifically not enough positive experiences of trusting strangers. This idea is tested in a new study published in Psychological Science. Fetchenhauer and Dunning (2010) set up a kind of ideal world in the lab where people were given accurate information about the trustworthiness of strangers to see if that would reduce their cynicism.

They recruited 120 participants to take part in a game of economic trust. Each person was given €7.50 and asked if they’d like to hand it to another person. If the other person made the same decision the pot would increase to €30. They were then asked to estimate whether the other person would opt to give them their half of the total winnings.


The participants watched 56 short videos of the people they were playing against. The researchers set up two experimental conditions, one to mimic what happens in the real world and one to test an ideal world scenario:

1. Real life condition: in this group participants were only told about the other person’s decision when they decided to trust them. The idea is that this condition simulates real life. You only find out if others are trustworthy when you decide to trust them. If you don’t trust someone you never find out whether or not they are trustworthy.

2. Ideal world condition: here participants were given feedback about the trustworthiness of other people whether or not they decided to trust them. This simulates an ideal-world condition where we all know from experience just how trustworthy people are (i.e. much more trustworthy than we think!)

Breaking down cynicism

Once again this study showed that people are remarkably cynical about strangers. Participants in this study thought that only 52 percent of the people they saw in the videos could be trusted to share their winnings. But the actual level of trustworthiness was a solid 80 percent. There’s the cynicism.
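
Those two percentages sit on opposite sides of the game’s break-even point, which is worth making explicit. Assuming the simplest payoff rule (a partner who doesn’t share keeps the whole pot, leaving the truster with nothing; the write-up above doesn’t spell this detail out), the expected value of handing over the €7.50 works out as follows:

```python
KEEP = 7.50    # euros you keep for certain by not trusting
SHARE = 15.00  # your half of the 30-euro pot if the other person shares

def expected_value_of_trusting(p_trustworthy: float) -> float:
    """Expected payoff of handing the money over, under the assumed rule that
    an untrustworthy partner keeps everything and leaves you nothing."""
    return p_trustworthy * SHARE

for p in (0.52, 0.80):  # participants' average estimate vs. the actual rate
    ev = expected_value_of_trusting(p)
    print(f"p = {p:.2f}: EV of trusting = {ev:.2f} euros (keeping pays {KEEP:.2f})")

# p = 0.52: EV of trusting = 7.80 euros (keeping pays 7.50)
# p = 0.80: EV of trusting = 12.00 euros (keeping pays 7.50)
```

On the participants’ cynical estimate, trusting barely pays; at the true rate of trustworthiness, it pays handsomely, which helps explain why the accurate-feedback group became so much more trusting.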

That cynicism was quickly broken down, though, by giving participants accurate feedback about others’ trustworthiness. People in the ideal world condition noticed that others could be trusted (they upped their estimate to 71 percent) and were also more trusting themselves, handing over the money 70.1 percent of the time.

People in the ideal world condition could even be seen shedding their cynicism as the study went on, becoming more trusting as they noticed that others were trustworthy. This suggests people aren’t inherently cynical, it’s just that we don’t get enough practice at trusting.

Self-fulfilling prophecy

Unfortunately we don’t live in the ideal world condition and have to put up with only receiving feedback when we decide to trust others. This leaves us in the position of trusting psychology studies like this one to tell us that other people are more trustworthy than we imagine (or at least people who take part in psychology studies are!).

Trusting others is also a kind of self-fulfilling prophecy, just as we find in interpersonal attraction. If you try trusting others you’ll find they frequently repay that trust, leading you to be more trusting. On the other hand if you never trust anyone, except those nearest and dearest, then you’ll end up more cynical about strangers.

Visit Psyblog

Read the original research paper HERE (PDF)


April 1, 2010 | Books, brain, Cognition, Health Psychology, research, Resilience