Why Your Job Doesn’t Make You Happy
Information supplied by The British Psychological Society
Read the original research paper here (PDF)
People who are unhappy in life are unlikely to find satisfaction at work. This is the finding of a study published online last Thursday, 1st April 2010, in the Journal of Occupational and Organizational Psychology.
Assistant Professor Nathan Bowling of Wright State University, USA, and colleagues Kevin Eschleman and Qiang Wang undertook a meta-analysis on the results of 223 studies carried out between 1967 and 2008. All of the studies had investigated some combination of job satisfaction and life satisfaction (or subjective well-being).
Bowling said: “We used studies that assessed these factors at two time points so that we could better understand the causal links between job satisfaction and life satisfaction. If people are satisfied at work, does this mean they will be more satisfied and happier in life overall? Or is the causal effect the opposite way around?”
The causal link between subjective well-being and subsequent levels of job satisfaction was found to be stronger than the link between job satisfaction and subsequent levels of subjective well-being.
“These results suggest that if people are, or are predisposed to be, happy and satisfied in life generally, then they will be likely to be happy and satisfied in their work,” said Nathan Bowling.
“However, the flipside of this finding could be that those people who are dissatisfied generally and who seek happiness through their work may not find job satisfaction. Nor might they increase their levels of overall happiness by pursuing it.”
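The cross-lagged logic Bowling describes (measure both variables at two time points, then compare the correlation in each direction across time) can be illustrated with a toy computation. This is only a sketch with fabricated scores, not the authors’ meta-analytic method; every name and number below is illustrative.

```python
import numpy as np

# Fabricated scores for five people at two time points.
# js = job satisfaction, swb = subjective well-being (life satisfaction).
js_t1  = np.array([3.0, 4.2, 2.8, 3.9, 4.5])
swb_t1 = np.array([3.5, 4.0, 2.5, 4.1, 4.8])
js_t2  = np.array([3.2, 4.1, 2.6, 4.0, 4.6])
swb_t2 = np.array([3.4, 4.3, 2.7, 4.2, 4.7])

# Does earlier well-being track later job satisfaction more strongly
# than earlier job satisfaction tracks later well-being?
r_swb_then_js = np.corrcoef(swb_t1, js_t2)[0, 1]
r_js_then_swb = np.corrcoef(js_t1, swb_t2)[0, 1]
print(f"SWB(t1) vs JS(t2): r = {r_swb_then_js:.2f}")
print(f"JS(t1) vs SWB(t2): r = {r_js_then_swb:.2f}")
```

In the study itself, comparisons of this kind were pooled across the 223 studies, and the well-being-first correlation came out reliably stronger.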
Read the original research paper here (PDF)
Related articles by Zemanta
- Those Who Enjoy Life More Likely to Be Happy at Work (nlm.nih.gov)
Midlife – What’s The Crisis?: Why Self-Esteem Peaks In The Middle-Aged
Credit: LiveScience
Read the original research article HERE (PDF)
Bad vision and other physical ailments aren’t the only things that seem to get worse as people grow old. Self-esteem also declines around the age of retirement, a new study finds.
The study involved 3,617 American men and women ranging in age from 25 to 104. Self-esteem was lowest among young adults, increased throughout adulthood, and peaked at age 60 before starting to decline.
Several factors might explain this trend, the researchers say.
“Midlife is a time of highly stable work, family and romantic relationships. People increasingly occupy positions of power and status, which might promote feelings of self-esteem,” said study author Richard Robins of the University of California, Davis. “In contrast, older adults may be experiencing a change in roles such as an empty nest, retirement and obsolete work skills in addition to declining health.”
Measuring self-esteem
The participants were surveyed four times between 1986 and 2002. They were asked to rate their level of agreement with statements such as: “I take a positive attitude toward myself,” which suggests high self-esteem; “At times I think I am no good at all,” and “All in all, I am inclined to feel that I am a failure,” which both suggest low self-esteem.
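Items like these are typically combined into a single score by reverse-scoring the negatively worded ones and averaging. Here is a minimal sketch of that scoring step, assuming a 1-to-5 agreement scale; the exact response format used in the survey isn’t given in this article.

```python
# Hypothetical responses on a 1-5 agreement scale (5 = strongly agree).
responses = {
    "positive_attitude_toward_self": 4,
    "at_times_no_good_at_all": 2,      # negatively worded item
    "inclined_to_feel_a_failure": 1,   # negatively worded item
}
REVERSED = {"at_times_no_good_at_all", "inclined_to_feel_a_failure"}
SCALE_MAX = 5

def self_esteem_score(items: dict) -> float:
    """Average the items after reverse-scoring the negative ones."""
    vals = [(SCALE_MAX + 1 - v) if k in REVERSED else v
            for k, v in items.items()]
    return sum(vals) / len(vals)

print(f"{self_esteem_score(responses):.2f}")  # (4 + 4 + 5) / 3 = 4.33
```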
Subjects also indicated their demographics, relationship satisfaction, and whether they had experienced stressful life events, including suddenly losing a job, being the victim of a violent crime, or experiencing the death of a parent or child.
On average, women had lower self-esteem than men throughout most of adulthood, but self-esteem levels converged as men and women reached their 80s and 90s. Blacks and whites had similar self-esteem levels throughout young adulthood and middle age. In old age, average self-esteem among blacks dropped much more sharply than self-esteem among whites. This result held even after accounting for differences in income and health.
Future research should further explore these ethnic differences, which might lead to better interventions aimed at improving self-esteem, the study authors say.
More self-esteem factors
Education, income, health and employment status all had some effect on the self-esteem trajectories, especially as people aged.
“People who have higher incomes and better health in later life tend to maintain their self-esteem as they age,” said lead author Ulrich Orth of the University of Basel.
“We cannot know for certain that more wealth and better health directly lead to higher self-esteem, but it does appear to be linked in some way. For example, it is possible that wealth and health are related to feeling more independent and better able to contribute to one’s family and society, which in turn bolsters self-esteem.”
People of all ages in satisfying and supportive relationships tend to have higher self-esteem, according to the findings.
However, despite maintaining higher self-esteem throughout their lives, people in happy relationships experienced the same drop in self-esteem during old age as people in unhappy relationships.
“Thus, being in a happy relationship does not protect a person against the decline in self-esteem that typically occurs in old age,” said study author Kali H. Trzesniewski of the University of Western Ontario.
With medical advances, the drop in self-esteem might occur later for baby boomers, Orth said. Boomers might be healthier for longer and, therefore, able to work and earn money longer.
Read the original research article HERE (PDF)
Related articles by Zemanta
- Self-Esteem And Self-Respect (andrewsullivan.theatlantic.com)
- Aging Health Strongly Affected by How You Feel About Aging (seniormemorysource.com)
- Parenting: Signs of Low Self-Esteem in Children (abcnews.go.com)
Exercise DOES Help Improve Mood! And Just 25 Minutes’ Worth Will Decrease Stress & Increase Energy
With an Honors degree in Human Movement Studies, and having worked in gyms in a former life while studying for my Clinical Masters degree, I have seen this to be true. Of course it seems self-evident, but these researchers have backed an excellent written program and workbook with strong science. These, along with their recent meta-analytic review, show just how effective exercise can be in improving mood.
Credit: PhysOrg.com — Exercise is a magic drug for many people with depression and anxiety disorders, according to researchers who analyzed numerous studies, and it should be more widely prescribed by mental health care providers.
“Exercise has been shown to have tremendous benefits for mental health,” says Jasper Smits, director of the Anxiety Research and Treatment Program at Southern Methodist University in Dallas. “The more therapists who are trained in exercise therapy, the better off patients will be.”
The traditional treatments of cognitive behavioral therapy and pharmacotherapy don’t reach everyone who needs them, says Smits, an associate professor of psychology.
“Exercise can fill the gap for people who can’t receive traditional therapies because of cost or lack of access, or who don’t want to because of the perceived social stigma associated with these treatments,” he says. “Exercise also can supplement traditional treatments, helping patients become more focused and engaged.”
Smits and Michael Otto, psychology professor at Boston University, presented their findings to researchers and mental health care providers March 6 at the Anxiety Disorders Association of America’s annual conference in Baltimore.
Their workshop was based on their therapist guide “Exercise for Mood and Anxiety Disorders,” with accompanying patient workbook (Oxford University Press, September 2009).
The guide draws on dozens of population-based studies, clinical studies and meta-analytic reviews that demonstrate the efficacy of exercise programs, including the authors’ meta-analysis of exercise interventions for mental health and study on reducing anxiety sensitivity with exercise.
“Individuals who exercise report fewer symptoms of anxiety and depression, and lower levels of stress and anger,” Smits says. “Exercise appears to affect, like an antidepressant, particular neurotransmitter systems in the brain, and it helps patients with depression re-establish positive behaviors. For patients with anxiety disorders, exercise reduces their fears of fear and related bodily sensations such as a racing heart and rapid breathing.”
After patients have passed a health assessment, Smits says, they should work up to the public health dose, which is 150 minutes a week of moderate-intensity activity or 75 minutes a week of vigorous-intensity activity.
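As a back-of-the-envelope aid, the two thresholds Smits mentions can be folded into one check by counting each vigorous minute as two moderate minutes. That 2:1 equivalence is a common convention in public-health guidance, but it is my assumption here rather than something stated in the article.

```python
def meets_public_health_dose(moderate_min: float, vigorous_min: float) -> bool:
    """True if a week's activity reaches 150 moderate-equivalent minutes.

    Each vigorous minute is counted as two moderate minutes, so 75
    vigorous minutes alone also meets the target.
    """
    return moderate_min + 2 * vigorous_min >= 150

print(meets_public_health_dose(100, 30))  # 100 + 60 = 160 -> True
print(meets_public_health_dose(60, 20))   # 60 + 40 = 100 -> False
```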
At a time when 40 percent of Americans are sedentary, he says, mental health care providers can serve as their patients’ exercise guides and motivators.
“Rather than emphasize the long-term health benefits of an exercise program — which can be difficult to sustain — we urge providers to focus with their patients on the immediate benefits,” he says. “After just 25 minutes, your mood improves, you are less stressed, you have more energy — and you’ll be motivated to exercise again tomorrow. A bad mood is no longer a barrier to exercise; it is the very reason to exercise.”
Smits says health care providers who prescribe exercise also must give their patients the tools they need to succeed, such as the daily schedules, problem-solving strategies and goal-setting featured in his guide for therapists.
“Therapists can help their patients take specific, achievable steps,” he says. “This isn’t about working out five times a week for the next year. It’s about exercising for 20 or 30 minutes and feeling better today.”
Martin Seligman: Author Of “Learned Optimism” Speaks About Positive Psychology And Authentic Happiness
Martin Seligman was originally best known for his classic psychology studies and theory of “Learned Helplessness” (1967) and its relationship to depression.
These days he is considered to be a founder of positive psychology, a field of study that examines healthy states, such as authentic happiness, strength of character and optimism, and is the author of “Learned Optimism”.
This is a terrific talk on Positive Psychology and what it means to be happy. It’s about 20 minutes long but definitely worth a watch!
You Can Trust Me More Than You Can Trust Them: Cynicism & The Trust Gap
Read the original research paper HERE (PDF)
Credit: Jeremy Dean from Psyblog
How do people come to believe that others are so much less trustworthy than themselves?
Much as we might prefer otherwise, there’s solid evidence that, on average, people are quite cynical. When thinking about strangers, studies have shown that people think others are more selfishly motivated than they really are and that others are less helpful than they really are.
Similarly in financial games psychologists have run in the lab, people are remarkably cynical about the trustworthiness of others. In one experiment people honored the trust placed in them between 80 and 90 percent of the time, but only estimated that others would honor their trust about 50 percent of the time.
Our cynicism towards strangers may develop as early as 7 years old (Mills & Keil, 2005). Surprisingly people are even overly cynical about their loved ones, assuming they will behave more selfishly than they really do (Kruger & Gilovich, 1999).
What could create such a huge gap between how people behave themselves and how they think others behave?
Trust me
People often say that it’s experience that breeds this cynicism rather than a failing in human nature. This is true, but only in a special way.
Think about it like this: the first time you trust a stranger and are betrayed, it makes sense to avoid trusting other strangers in the future. The problem is that when we don’t ever trust strangers, we never find out how trustworthy people in general really are. As a result our estimation of them is governed by fear.
If this argument is correct, it is lack of experience that leads to people’s cynicism, specifically not enough positive experiences of trusting strangers. This idea is tested in a new study published in Psychological Science. Fetchenhauer and Dunning (2010) set up a kind of ideal world in the lab where people were given accurate information about the trustworthiness of strangers to see if that would reduce their cynicism.
They recruited 120 participants to take part in an economic trust game. Each person was given €7.50 and asked if they’d like to hand it to another person. If the other person made the same decision, the pot would increase to €30. They were then asked to estimate whether the other person would opt to give them their half of the total winnings.
The participants watched 56 short videos of the people they were playing against. The researchers set up two experimental conditions, one to mimic what happens in the real world and one to test an ideal-world scenario (a toy simulation of the two feedback regimes appears after the list):
1. Real life condition: in this group participants were only told about the other person’s decision when they decided to trust them. The idea is that this condition simulates real life. You only find out if others are trustworthy when you decide to trust them. If you don’t trust someone you never find out whether or not they are trustworthy.
2. Ideal world condition: here participants were given feedback about the trustworthiness of other people whether or not they decided to trust them. This simulates an ideal-world condition where we all know from experience just how trustworthy people are (i.e. much more trustworthy than we think!)
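Here is that toy simulation of the two feedback regimes. It is only a sketch: the cautious-player threshold, learning rate and update rule are my assumptions, with the 80 percent trustworthiness rate taken from the study’s results below.

```python
import random

random.seed(1)
TRUE_TRUSTWORTHINESS = 0.80  # roughly the rate observed in the study

def run_condition(always_feedback: bool, rounds: int = 56) -> float:
    """Toy learner: trusts only when its estimate of others'
    trustworthiness reaches 60%, and nudges that estimate toward
    whatever feedback it actually receives."""
    estimate = 0.50                                  # initial cynicism
    for _ in range(rounds):
        trusts = estimate >= 0.60
        if always_feedback or trusts:
            honoured = random.random() < TRUE_TRUSTWORTHINESS
            estimate += 0.1 * (honoured - estimate)  # simple learning update
    return estimate

print(f"Real-life feedback:   final estimate {run_condition(False):.0%}")
print(f"Ideal-world feedback: final estimate {run_condition(True):.0%}")
```

The asymmetry drops out immediately: the cautious player in the real-life regime never trusts, so never learns, while full feedback pulls the estimate up toward the true rate.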
Breaking down cynicism
Once again this study showed that people are remarkably cynical about strangers. Participants in this study thought that only 52 percent of the people they saw in the videos could be trusted to share their winnings. But the actual level of trustworthiness was a solid 80 percent. There’s the cynicism.
That cynicism was quickly broken down, though, by giving participants accurate feedback about others’ trustworthiness. People in the ideal world condition noticed that others could be trusted (they upped their estimate to 71 percent) and were also more trusting themselves, handing over the money 70.1 percent of the time.
People in the ideal world condition could even be seen shedding their cynicism as the study went on, becoming more trusting as they noticed that others were trustworthy. This suggests people aren’t inherently cynical, it’s just that we don’t get enough practice at trusting.
Self-fulfilling prophecy
Unfortunately we don’t live in the ideal world condition and have to put up with only receiving feedback when we decide to trust others. This leaves us in the position of trusting to psychology studies like this one to tell us that other people are more trustworthy than we imagine (or at least people who take part in psychology studies are!).
Trusting others is also a kind of self-fulfilling prophecy, just as we find in interpersonal attraction. If you try trusting others you’ll find they frequently repay that trust, leading you to be more trusting. On the other hand if you never trust anyone, except those nearest and dearest, then you’ll end up more cynical about strangers.
Visit Psyblog
Multitasking: New Study Challenges Previous Cognitive Theory But Shows That Only A Few “Supertaskers” Can Drive And Phone
Read The Original Research Paper HERE (PDF – internal link)
A new study from University of Utah psychologists found a small group of people with an extraordinary ability to multitask: Unlike 97.5 percent of those studied, they can safely drive while chatting on a cell phone.
These individuals – described by the researchers as “supertaskers” – constitute only 2.5 percent of the population. They are so named for their ability to successfully do two things at once: in this case, talk on a cell phone while operating a driving simulator without noticeable impairment.

Jason Watson, a University of Utah psychologist, negotiates cybertraffic in a driving simulator used to study driver distractions such as cell phones and texting. While many people think they can safely drive and talk on a cell phone at the same time, Watson's new study shows only one in 40 is a "supertasker" who can perform both tasks at once without impairment of abilities measured in the study. Credit: Valoree Dowell, University of Utah
The study, conducted by psychologists Jason Watson and David Strayer, is now in press for publication later this year in the journal Psychonomic Bulletin and Review.
This finding is important not because it shows people can drive well while on the phone – the study confirms that the vast majority cannot – but because it challenges current theories of multitasking. Further research may lead eventually to new understanding of regions of the brain that are responsible for supertaskers’ extraordinary performance.
“According to cognitive theory, these individuals ought not to exist,” says Watson. “Yet, clearly they do, so we use the supertasker term as a convenient way to describe their exceptional multitasking ability. Given the number of individuals who routinely talk on the phone while driving, one would have hoped that there would be a greater percentage of supertaskers. And while we’d probably all like to think we are the exception to the rule, the odds are overwhelmingly against it. In fact, the odds of being a supertasker are about as good as your chances of flipping a coin and getting five heads in a row.”
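Watson’s comparison checks out: the probability of five heads in a row is 0.5^5 = 1/32, about 3.1 percent, close to the 2.5 percent prevalence the study found.

```python
p_five_heads = 0.5 ** 5
print(f"{p_five_heads:.1%}")  # 3.1% -- close to the 2.5% supertasker rate
```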
The researchers assessed the performance of 200 participants on a single task (simulated freeway driving), and again with a second demanding activity added (a cell phone conversation that involved memorizing words and solving math problems). Performance was then measured in four areas: braking reaction time, following distance, memory, and math execution.
As expected, results showed that for the group, performance suffered across the board while driving and talking on a hands-free cell phone.
For those who were not supertaskers, talking on a cell phone while driving the simulator meant it took 20 percent longer to hit the brakes when needed, and following distances increased 30 percent as drivers failed to keep pace with simulated traffic. Memory performance declined 11 percent, and the ability to do math problems fell 3 percent.
However, when supertaskers talked while driving, they displayed no change in their normal braking times, following distances or math ability, and their memory abilities actually improved 3 percent.
The results are in line with Strayer’s prior studies showing that driving performance routinely declines under “dual-task conditions” – namely talking on a cell phone while driving – and is comparable to the impairment seen in drunken drivers.
Yet contrary to current understanding in this area, the small number of supertaskers showed no impairment on the measurements of either driving or cell conversation when in combination. Further, researchers found that these individuals’ performance even on the single tasks was markedly better than the control group.
“There is clearly something special about the supertaskers,” says Strayer. “Why can they do something that most of us cannot? Psychologists may need to rethink what they know about multitasking in light of this new evidence. We may learn from these very rare individuals that the multitasking regions of the brain are different and that there may be a genetic basis for this difference. That is very exciting. Stay tuned.”
Watson and Strayer are now studying expert fighter pilots under the assumption that those who can pilot a jet aircraft are also likely to have extraordinary multitasking ability.
The current value society puts on multitasking is relatively new, note the authors. As technology expands throughout our environment and daily lives, it may be that everyone – perhaps even supertaskers – eventually will reach the limits of their ability to divide attention across several tasks.
“As technology spreads, it will be very useful to better understand the brain’s processing capabilities, and perhaps to isolate potential markers that predict extraordinary ability, especially for high-performance professions,” Watson concludes.
Information from the University of Utah
Game On: The Decline of Backyard Play
I found this post from Peter G. Stromberg @ Psychology Today. It really got me thinking about kids and the pressure that we may put on them as parents… What do you think?
A few weeks ago I flew to Denver with my younger daughter so that she could participate in a volleyball tournament; she has been travelling to tournaments for the last two years but this is the first time we had to fly. My daughter is 11 years old.
Shouldn’t my daughter be riding her bike around the neighborhood and jumping rope with her friends? Why is she, at age 11, playing on a team coached by a former Olympic-level athlete and competing against nationally-ranked teams based thousands of miles from our home? There is research to suggest that unstructured play and basic movement activities (running, jumping, balancing) are more beneficial for children of her age than specialized training in one particular sport. Why in the world should an 11 year old child be in year-round volleyball training? Well, let me explain.
I would guess that many readers who are older than 30 will share my own experience: at my daughter’s age and into my early teens, I spent every possible minute getting into pick-up games of basketball and football with my friends or just roaming around outside. This approach didn’t produce a skilled athlete, but it sure was fun (and cheap). Today, in most areas of the country, such activities are simply less available. One reason my daughter doesn’t head down to the park to play with her friends is that they aren’t there; they are at soccer practice, or piano lessons, or having pre-arranged play dates.
There has been a recent and enormous shift in the way children play in our society, away from unstructured outside play and towards organized competition under adult supervision. Why? One reason that will come quickly to mind is stranger danger. Many parents (including me, by the way) now believe it is unsafe for children, perhaps particularly girls, to be outside without adult supervision. Although neighborhoods vary, the statistics I have seen on this issue do not support the belief that accidents or attacks on children are generally more frequent now than, say, 30 years ago. It seems more likely that what has changed is the extent of news coverage of attacks on children, which fosters the belief that such events are frequent.
In short, actual danger from strangers is probably not the real reason for the decline in outside play. Well, how about this? Public funding for playgrounds, parks, and recreation centers has been declining since the 1980s. There aren’t as many places to go for public play anymore, and the ones that persist are likely not as well-maintained.
That’s relevant, but it still isn’t really at the heart of why my daughter plays highly competitive volleyball at such a young age. The fact is that if she doesn’t play now and decides to take up the sport at 14 or 15, the train will have left the station. Unless a child has extraordinary athletic gifts, she will be so far behind by that age that she will not be able to find a place on a team. It isn’t only that opportunities for unstructured public play have declined, it’s that opportunities for highly competitive play have expanded to such an extent that in some sports that is all that exists. There are simply no possibilities in my part of the country for recreational volleyball for children 10-18. And the situation is similar for many other sports as well: our focus on producing highly competitive teams with highly skilled participants leads to a lack of focus on producing opportunities for children who simply want to play a sport casually.
This, I think, gets us close to probably the most important reason that highly competitive sport for the few has begun to replace recreational sport for the many among children today. We as a society don’t care about recreational sport for the many. The logic of entertainment has come to control youth sports. Parents, kids, and the society as a whole are excited by the possibility of championships, cheering spectators, and (for the really elite) media coverage. And we aren’t really excited by our children playing disorganized touch football until they have to come in for dinner. What’s the point of that? Nobody is watching.
This isn’t anyone’s fault; it’s just the way our society works. I really wish my kids could play pick-up games and intramurals the way their not-so-athletically-talented dad did. But the intramurals and pick-up games are far fewer now. Strangely enough, childhood obesity rates have skyrocketed as they have faded. Or maybe that’s not strange at all.
This post reflects on issues I have been thinking about for years, but it is also heavily influenced by a recent book called Game On by ESPN writer Tom Farrey. To learn more about play in general, visit my website
from Peter G. Stromberg @ Psychology Today
Money & Happiness: Higher Income Only Increases Contentment If You’re ‘Keeping Up With The Joneses’
Read the original research paper HERE (Free PDF internal link)
Source: ScienceDaily (Mar. 22, 2010)
A study by researchers at the University of Warwick and Cardiff University has found that money only makes people happier if it improves their social rank. The researchers found that simply being highly paid wasn’t enough — to be happy, people must perceive themselves as being more highly paid than their friends and work colleagues.
The researchers were seeking to explain why people in rich nations have not become any happier on average over the last 40 years even though economic growth has led to substantial increases in average incomes.
Lead researcher on the paper Chris Boyce from the University of Warwick’s Department of Psychology said: “Our study found that the ranked position of an individual’s income best predicted general life satisfaction, while the actual amount of income and the average income of others appear to have no significant effect. Earning a million pounds a year appears to be not enough to make you happy if you know your friends all earn 2 million a year.”
The study entitled “Money and Happiness: Rank of Income, Not Income, Affects Life Satisfaction” will be published in the journal Psychological Science. The researchers looked at data on earnings and life satisfaction from seven years of the British Household Panel Survey (BHPS), which is a representative longitudinal sample of British households.
First they examined how life satisfaction was related to how much money each person earned. They found, however, that satisfaction was much more strongly related to the ranked position of the person’s income (compared with people of the same gender, age, level of education, or from the same geographical area).
The results explain why making everybody in society richer will not necessarily increase overall happiness — because it is only having a higher income than other people that matters.
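The ‘ranked position’ idea is easy to make concrete: within a comparison group, a person’s rank is the fraction of peers they out-earn. A minimal sketch follows; the incomes and groups are invented for illustration, not BHPS data.

```python
def income_rank(own_income: float, peer_incomes: list) -> float:
    """Fraction of the comparison group earning less than you (0 to 1)."""
    below = sum(1 for p in peer_incomes if p < own_income)
    return below / len(peer_incomes)

# A million a year among friends who all earn two million -> rank 0.0,
# which, per the study, predicts low satisfaction despite the high income.
print(income_rank(1_000_000, [2_000_000] * 10))               # 0.0
print(income_rank(45_000, [30_000, 38_000, 52_000, 60_000]))  # 0.5
```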
The three authors of the paper were Chris Boyce, Gordon Brown (both of the University of Warwick’s Department of Psychology), and Simon Moore of Cardiff University.
Read the original research paper HERE (Free PDF internal link)
Pay It Forward: Research Proves That Acts Of Kindness From A Few Cascade On To Dozens
Read The Original Research Paper HERE (Free PDF)
ScienceDaily (Mar. 10, 2010) — For all those dismayed by scenes of looting in disaster-struck zones, whether Haiti or Chile or elsewhere, take heart: Good acts — acts of kindness, generosity and cooperation — spread just as easily as bad. And it takes only a handful of individuals to really make a difference.

This diagram illustrates how a single act of kindness can spread between individuals and across time. Cooperative behavior spreads up to three degrees of separation.
In a study published in the March 8 early online edition of the Proceedings of the National Academy of Sciences, researchers from the University of California, San Diego and Harvard provide the first laboratory evidence that cooperative behavior is contagious and that it spreads from person to person to person. When people benefit from kindness they “pay it forward” by helping others who were not originally involved, and this creates a cascade of cooperation that influences dozens more in a social network.
The research was conducted by James Fowler, associate professor at UC San Diego in the Department of Political Science and Calit2’s Center for Wireless and Population Health Systems, and Nicholas Christakis of Harvard, who is professor of sociology in the Faculty of Arts and Sciences and professor of medicine and medical sociology at Harvard Medical School. Fowler and Christakis are coauthors of the recently published book Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives.
In the current study, Fowler and Christakis show that when one person gives money to help others in a “public-goods game,” where people have the opportunity to cooperate with each other, the recipients are more likely to give their own money away to other people in future games. This creates a domino effect in which one person’s generosity spreads first to three people and then to the nine people that those three people interact with in the future, and then to still other individuals in subsequent waves of the experiment.
The effect persists, Fowler said: “You don’t go back to being your ‘old selfish self.’” As a result, the money a person gives in the first round of the experiment is ultimately tripled by others who are subsequently (directly or indirectly) influenced to give more. “The network functions like a matching grant,” Christakis said.
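The arithmetic behind the tripling can be sketched. Suppose the extra dollar given in round one influences three people, who influence nine, who influence twenty-seven, with the induced amount per person shrinking at each degree of separation. The decay figures below are invented purely to show how small per-person effects across three degrees can sum to roughly triple the original gift; the paper’s actual per-step effect sizes differ.

```python
# Toy cascade: one extra dollar influences 3 people, then 9, then 27.
initial_gift = 1.00
people_at_degree   = [3, 9, 27]          # three degrees of separation
induced_per_person = [0.50, 0.10, 0.02]  # assumed decaying effect sizes

cascade_total = sum(n * amt for n, amt
                    in zip(people_at_degree, induced_per_person))
print(f"Extra giving triggered downstream: ${cascade_total:.2f} "
      f"(~{cascade_total / initial_gift:.0f}x the original gift)")
```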
“Though the multiplier in the real world may be higher or lower than what we’ve found in the lab,” Fowler said, “personally it’s very exciting to learn that kindness spreads to people I don’t know or have never met. We have direct experience of giving and seeing people’s immediate reactions, but we don’t typically see how our generosity cascades through the social network to affect the lives of dozens or maybe hundreds of other people.”
The study participants were strangers to each other and never played twice with the same person, a study design that eliminates direct reciprocity and reputation management as possible causes.
In previous work demonstrating the contagious spread of behaviors, emotions and ideas — including obesity, happiness, smoking cessation and loneliness — Fowler and Christakis examined social networks re-created from the records of the Framingham Heart Study. But like all observational studies, those findings could also have partially reflected the fact that people were choosing to interact with people like themselves or that people were exposed to the same environment. The experimental method used here eliminates such factors.
The study is the first work to document experimentally Fowler and Christakis’s earlier findings that social contagion travels in networks up to three degrees of separation, and the first to corroborate evidence from others’ observational studies on the spread of cooperation.
The contagious effect in the study was symmetric; uncooperative behavior also spread, but there was nothing to suggest that it spread any more or any less robustly than cooperative behavior, Fowler said.
From a scientific perspective, Fowler added, these findings suggest the fascinating possibility that the process of contagion may have contributed to the evolution of cooperation: Groups with altruists in them will be more altruistic as a whole and more likely to survive than selfish groups.
“Our work over the past few years, examining the function of human social networks and their genetic origins, has led us to conclude that there is a deep and fundamental connection between social networks and goodness,” said Christakis. “The flow of good and desirable properties like ideas, love and kindness is required for human social networks to endure, and, in turn, networks are required for such properties to spread. Humans form social networks because the benefits of a connected life outweigh the costs.”
The research was funded by the National Institute on Aging, the John Templeton Foundation, and a Pioneer Grant from the Robert Wood Johnson Foundation.
Read The Original Research Paper HERE (Free PDF)
Related articles by Zemanta
- How Social Networks Impact Drinking Habits (wellness.blogs.time.com)
- A new way of thinking about social networks and the world (boston.com)
- Loneliness May Be Contagious (wired.com)
It Takes HOW Long to Form a Habit?: Research Shows a Curved Relationship Between Practice and Automaticity (Repost)
Say you want to create a new habit, whether it’s taking more exercise, eating more healthily or writing a blog post every day: how often does it need to be performed before it no longer requires Herculean self-control?
Clearly it’s going to depend on the type of habit you’re trying to form and how single-minded you are in pursuing your goal. But are there any general guidelines for how long it takes before behaviours become automatic?
Ask Google and you’ll get a figure of somewhere between 21 and 28 days. In fact there’s no solid evidence for this number at all. The 21 day myth may well come from a book published in 1960 by a plastic surgeon. Dr Maxwell Maltz noticed that amputees took, on average, 21 days to adjust to the loss of a limb and he argued that people take 21 days to adjust to any major life changes.
Unless you’re in the habit of sawing off your own arm, this is not particularly relevant.
Doing without thinking
Now, however, there is some psychological research on this question, in a paper recently published in the European Journal of Social Psychology. Phillippa Lally and colleagues from University College London recruited 96 people who were interested in forming a new habit, such as eating a piece of fruit with lunch or doing a 15-minute run each day (Lally et al., 2009). Participants were then asked daily how automatic their chosen behaviours felt. These questions included things like whether the behaviour was ‘hard not to do’ and could be done ‘without thinking’.
When the researchers examined the different habits, many of the participants showed a curved relationship between practice and automaticity of the form depicted below (solid line). On average a plateau in automaticity was reached after 66 days. In other words it had become as much of a habit as it was ever going to become.

This graph shows that early practice was rewarded with greater increases in automaticity and gains tailed off as participants reached their maximum automaticity for that behaviour.
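A curve of this general shape is asymptotic: rapid early gains that level off at a personal ceiling. One plausible functional form, my sketch rather than necessarily the exact model fitted in the paper, is A(t) = A_max(1 − e^(−t/τ)):

```python
import math

def automaticity(day: int, a_max: float = 100.0, tau: float = 20.0) -> float:
    """Asymptotic habit curve: steep early gains, then a plateau.

    a_max is the behaviour's ceiling; tau sets how fast the plateau
    is approached. Both values here are purely illustrative.
    """
    return a_max * (1 - math.exp(-day / tau))

for day in (1, 7, 21, 66, 254):
    print(f"day {day:3d}: {automaticity(day):5.1f}% of maximum")
```

On these illustrative parameters, 21 days of repetition gets you only about two-thirds of the way to the plateau, one way of seeing why the popular 21-day figure undersells the job.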
Although the average was 66 days, there was marked variation in how long habits took to form, anywhere from 18 days up to 254 days in the habits examined in this study. As you’d imagine, drinking a daily glass of water became automatic very quickly but doing 50 sit-ups before breakfast required more dedication (above, dotted lines). The researchers also noted that:
- Missing a single day did not reduce the chance of forming a habit.
- A sub-group took much longer than the others to form their habits, perhaps suggesting some people are ‘habit-resistant’.
- Other types of habits may well take much longer.
No small change
What this study reveals is that when we want to develop a relatively simple habit like eating a piece of fruit each day or taking a 10 minute walk, it could take us over two months of daily repetitions before the behaviour becomes a habit. And, while this research suggests that skipping single days isn’t detrimental in the long-term, it’s those early repetitions that give us the greatest boost in automaticity.
Unfortunately it seems there’s no such thing as small change: the much-repeated 21 days to form a habit is a considerable underestimation unless your only goal in life is drinking glasses of water.
Source: psyblog.com