Peter H Brown Clinical Psychologist

Psychology News & Resources

iPhone Addiction: Does Smart Phone = Dumber You?


Source: Psychology Today, Susan Krauss Whitbourne, Ph.D.

The smartphone is quickly becoming an extension of the human brain. The latest entry into the market, the iPhone 4S, contains a feature named Siri that (according to the Apple website) “understands what you say, knows what you mean, and even talks back. Siri is so easy to use and does so much, you’ll keep finding more and more ways to use it.” Now, I love technology as much as anyone (at least when it’s working), but as a psychologist, I have to join the chorus of critics who yearn for a simpler, less technical age. It’s not that I wish to return to the pre-computer age of paper and pencil. Instead, I’m worried that we risk having our brains become vestigial organs. Research on technological tools suggests that offloading our mental functions to these electronic devices could cause our brains to go soft.

Consider the evidence from a study reported in late 2010 by researchers at McGill University. Neuroscientist Veronique Bohbot and her team reported that relying on a global positioning system (GPS) to get to known locations reduces the function of the hippocampus, the seahorse-shaped structure in the brain that controls memory and spatial orientation. Older adults used to getting around on the basis of their own wits had higher activity and greater volume in the hippocampus than those who relied on a GPS. What’s more, when it came to actual performance, the non-GPS users did better on a memory test. Bohbot recommends that you turn off the GPS when you’re navigating around your hometown and use it only for its actual purpose: finding locations you’ve never been to before. Your hippocampus will thank you, whether you’re 16 or 60.

It’s not much of a leap to extrapolate from the GPS to the smartphone. A normal cellphone can remember numbers for you so that you no longer have to. Confess: can you remember the actual numbers of the people you call most frequently? We used to rely on our neurons to hold onto these crucial bits of information; now they reside somewhere out there in the ether. What’s worse, most people don’t even take the time to write down a new phone number anymore. You call your new acquaintance, your new acquaintance calls you, and the information is automatically stored in your contacts. It’s great for efficiency’s sake, but you’ve given your working memory one less important exercise. Memory benefits from practice, especially at the crucial stage of encoding.

Let’s move from phone numbers to information in general. People with smartphones no longer have to remember important facts because, when in doubt, they can just tap into Google. When was the last time St. Louis was in the World Series, you wonder? Easy! Just enter a few letters (not even the whole city name) into your “smart” search engine. Your fingers, much less your mind, don’t have to walk very far at all. Trying to give your brain a workout with a crossword puzzle? What’s to stop you from taking a few shortcuts when the answers are right there on your phone? No mental gymnastics necessary.

This leads us to Siri, that seductress of the smartphone. With your iPhone slave on constant standby, you don’t even have to key in your questions. Just say the question, and Siri conjures up the answer in an instant. With a robot at your fingertips, why even bother to look the information up yourself?

The irony is that smartphones have the potential to make our brains sharper, not dumber. Researchers are finding that video game play involving rapid decision-making can hone your cognitive resources. Older adults, in particular, seem able to improve their performance on speeded attention and decision-making tasks when they play certain games. People with a form of amnesia that keeps them from learning new information can also be helped by smartphones, according to a study conducted by Canadian researchers (Svoboda & Richards, 2009). The problem is not the use of the smartphone itself; the problem comes when the smartphone takes over a function that your brain is perfectly capable of performing. It’s like taking the elevator instead of the stairs: the ride may be quicker, but your muscles won’t get a workout. Smartphones are like mental elevators.

Psychologists have known for years that the “use it or lose it” principle is key to keeping your brain functioning at its peak throughout your life. As we become more and more drawn to these sleeker and sexier gadgets, the trick will be learning how to “use it.” So take advantage of these five tips to help your smartphone keep you smart:

1. Don’t substitute your smartphone for your brain. Force yourself to memorize a phone number before you store it, and dial your frequently called numbers from memory whenever possible. If there’s a fact or word definition you can infer, give your brain the job before consulting your electronic helper.

2. Turn off the GPS app when you’re going to familiar places. As the GPS-hippocampus study showed, you need to keep your spatial memory as active as possible by relying on your brain, not your phone, when you’re navigating well-known turf. If you are using the GPS to get around a new location, study a map first. Your GPS may not really know the best route to take (as any proper Bostonian can tell you!).

3. Use your smartphone to keep up with current events. Most people use their smartphones in their leisure time for entertainment. However, with just a few clicks, you can just as easily check the headlines, op-eds, and featured stories from respected news outlets around the world. This knowledge will build your mental storehouse of information and make you a better conversationalist as well.

4. Build your social skills with pro-social apps. Some video games can actually make you a nicer person by strengthening your empathic tendencies. Twitter and Facebook can build social bonds. Staying connected is easier than ever, and keeping those social bonds active provides you with social support. Just make sure you avoid some of the social media traps of over-sharing and FOMO (fear of missing out) syndrome.

5. Turn off your smartphone while you’re driving. No matter how clever you are at multitasking under ordinary circumstances, all experts agree that you need to give your undivided attention to the road when you’re behind the wheel. This is another reason to look at and memorize your route before going someplace new: fiddling with your GPS can create a significant distraction if you find that it’s given you the wrong information.

Smartphones have their place and can make your life infinitely more productive, as long as you use yours to supplement, not replace, your brain.

Reference: Svoboda, E., & Richards, B. (2009). Compensating for anterograde amnesia: A new training method that capitalizes on emerging smartphone technologies. Journal of the International Neuropsychological Society, 15(4), 629-638. doi:10.1017/S1355617709090791

Follow Susan Krauss Whitbourne, Ph.D. on Twitter @swhitbo for daily updates on psychology, health, and aging, and please check out my website, www.searchforfulfillment.com, where you can read this week’s Weekly Focus for additional information, self-tests, and psychology-related links.


October 24, 2011 | Posted in: Addiction, brain, Cognition, Health Psychology, Identity, Internet, research, Technology

Contentment: Is Spare Time > Spare Stuff?

Which is more desirable: too little or too much spare time on your hands? For happiness, somewhere in the middle, according to Chris Manolis and James Roberts of Xavier University in Cincinnati, OH, and Baylor University in Waco, TX. Their work shows that materialistic young people with compulsive buying tendencies need just the right amount of spare time to feel happier. The study is published online in Springer’s journal Applied Research in Quality of Life.

We now live in a society where time is of the essence. The perception of a shortage of time, or time pressure, is linked to lower levels of happiness. At the same time, our consumer culture, characterized by materialism and compulsive buying, also has an effect on people’s happiness: the desire for materialistic possessions leads to lower life satisfaction.

Given the importance of time in contemporary life, Manolis and Roberts investigate, for the first time, the effect of perceived time affluence (the amount of spare time people perceive themselves to have) on the consequences of materialistic values and compulsive buying for adolescent well-being.

A total of 1,329 adolescents from a public high school in a large metropolitan area of the Midwestern United States took part in the study. The researchers measured how much spare time the young people thought they had; the extent to which they held materialistic values and had compulsive buying tendencies; and their subjective well-being, or self-rated happiness.

Manolis and Roberts’ findings confirm that both materialism and compulsive buying have a negative impact on teenagers’ happiness. The more materialistic they are and the more they engage in compulsive buying, the lower their happiness levels.

In addition, time affluence moderates the negative consequences of both materialism and compulsive buying in this group. Specifically, moderate time affluence, i.e., being neither too busy nor having too much spare time, is linked to higher levels of happiness in materialistic teenagers and in those who are compulsive buyers.

Adolescents who suffer from time pressure and who think materialistically and/or buy compulsively feel less happy than their counterparts. Equally, having too much free time on their hands exacerbates the negative effects of material values and compulsive buying on adolescent happiness. The authors conclude: “Living with a sensible, balanced amount of free time promotes well-being not only directly, but also by helping to alleviate some of the negative side effects associated with living in our consumer-orientated society.”

Manolis, C., & Roberts, J. A. (2011). Subjective well-being among adolescent consumers: The effects of materialism, compulsive buying, and time affluence. Applied Research in Quality of Life. doi:10.1007/s11482-011-9155-5


October 23, 2011 | Posted in: Acceptance and Commitment Therapy, Age & Ageing, depression, Exercise, Health Psychology, Identity, mood, Positive Psychology, research, Resilience, stress

“They All Look Alike To Me”: Here’s Why “They” Do

Source: ScienceDaily (July 1, 2011)

Read The Original Article Here

Northwestern University researchers have provided new biological evidence suggesting that the brain works differently when memorizing the face of a person from one’s own race than when memorizing a face from another race.

Their study — which used EEG recordings to measure brain activity — sheds light on a well-documented phenomenon known as the “other-race effect.” One of the most replicated psychology findings, the other-race effect finds that people are less likely to remember a face from a racial group different from their own.

“Scientists have put forward numerous ideas about why people do not recognize other-race faces as well as same-race faces,” says Northwestern psychology professor Ken Paller, who with psychology professor Joan Chiao and Heather Lucas co-authored “Why some faces won’t be remembered: Brain potentials illuminate successful versus unsuccessful encoding for same-race and other-race faces.”

The discovery of a neural marker of successful encoding of other-race faces will help put these ideas to the test, according to Paller, who directs the Cognitive Neuroscience Laboratory in the Weinberg College of Arts and Sciences.

“The ability to accurately remember faces is an important social skill with potentially serious consequences,” says doctoral student Lucas, lead author of the recently published study in Frontiers in Human Neuroscience. “It’s merely embarrassing to forget your spouse’s boss, but when an eyewitness incorrectly remembers a face, the consequence can be a wrongful criminal conviction,” she adds.

The Northwestern team found that brain activity increases within the first 200 to 250 milliseconds of seeing both same-race and other-race faces. To their surprise, however, they found that the amplitude of that increased activity predicts whether a face is later remembered only for other-race faces, not for same-race faces.

“There appears to be a critical phase shortly after an other-race face appears that determines whether or not that face will be remembered or forgotten,” Lucas says. “In other words, the process of laying down a memory begins almost immediately after one first sees the face.”

Previous research has associated this very early phase — what is known as the N200 brain potential — with the perceptual process of individuation. That process involves identifying personally unique facial features, such as the shape of the eyes and nose, and the spatial configuration of those features.

When the researchers asked the 18 white study participants to view same-race faces and to commit them to memory, the individuation process indexed by N200 appeared “almost automatic — so robust and reliable that it actually was irrelevant as to whether a face was remembered or not,” says Lucas.

Minutes later, the participants were given a recognition test that included new faces along with some that were previously viewed. The researchers analyzed brain activity during initial face viewing as a function of whether or not each face was ultimately remembered or forgotten on the recognition test.

The N200 waves were large for all same-race faces, regardless of whether or not they later were successfully remembered. In contrast, N200 waves were larger for other-race faces that were remembered than for other-race faces that were forgotten.

Of course, not all same-race faces were successfully recognized, the researchers say. Accordingly, their study also identified brain activity that predicted whether or not a same-race face would be remembered. A specific brain wave starting at about 300 milliseconds and lasting for several hundred milliseconds was associated with what the psychologists call “elaborative encoding.”

In contrast to individuation (which involves rapidly identifying unique physical attributes from faces), elaborative encoding is a more deliberate process of inferring attributes. For example, you might note that a face reminds you of someone you know, that its expression appears friendly or shy, or that it looks like the face of a scientist or police officer.

Making these types of social inferences increases the likelihood that a face will be remembered.

“However, this strategy only works if the process of individuation also occurred successfully — that is, if the physical attributes unique to a particular face already have been committed to memory,” Lucas says. “And our study found that individuation is not always engaged with other-race faces.”

Why is individuation so fragile for other-race faces? One possibility, the researchers say, is that many people simply have less practice seeing and remembering other-race faces.

“People tend to have more frequent and extensive interactions with same-race than with other-race individuals, particularly racial majority members,” Lucas says. As a result, their brains may be less adept at picking out the information that distinguishes one other-race face from another than at distinguishing among faces of their own racial group.

Another possible explanation involves “social categorization,” or the tendency to group others into social categories by race. “Prior research has found that when we label and group others according to race we end up focusing more on attributes that group members tend to have in common — such as skin color — and less on attributes that individuate one group member from others,” Lucas says.

As a result, smaller N200 brain potentials for other-race faces — particularly those that were not remembered later — could indicate that race-specifying features of these faces were given more attention than the features that distinguish one individual from another.

The Northwestern researchers expect future research to build on their findings in the continuing effort to better understand the other-race effect. “That research also will need to focus more on face recognition in minorities, given that the bulk of research to date has examined majority-white populations,” Lucas says.


July 6, 2011 | Posted in: Cognition, Identity, research, Social Psychology

The Great Pyramid: Did Maslow Get The Hierarchy Of Needs Right?

Maslow’s Pyramid of Human Needs Put to the Test

Source: University of Illinois at Urbana-Champaign

For more than 60 years, psychologist Abraham Maslow’s hierarchy of human needs has served as a model by which many judge life satisfaction. But his theory has never been subjected to scientific validation.

A new global study tested Maslow’s concepts and sequence, in a way that reflects life in the 21st century.

“Anyone who has ever completed a psychology class has heard of Abraham Maslow and his theory of needs,” said University of Illinois professor emeritus of psychology Dr. Ed Diener, who led the study. “But the nagging question has always been: Where is the proof? Students learn the theory, but scientific research backing this theory is rarely mentioned.”

Maslow’s pyramid of human needs begins with a base that signifies an individual’s basic needs (for food, sleep and sex). Safety and security come next, then love and belonging, then esteem and, finally, at the pyramid’s peak, a quality he called “self-actualization.”

Maslow proposed that people who have these needs fulfilled should be happier than those who don’t.

In the new study, U of I researchers put Maslow’s ideas to the test with data from 123 countries representing every major region of the world.

To determine current perceptions, the researchers turned to the Gallup World Poll, which conducted surveys in 155 countries from 2005 to 2010, and included questions about money, food, shelter, safety, social support, feeling respected, being self-directed, having a sense of mastery, and the experience of positive or negative emotions.

The researchers found that the needs Maslow identified do appear to be universal and important to individual happiness. But the order in which “higher” and “lower” needs are met has little bearing on how much they contribute to life satisfaction and enjoyment, Diener said.

They also found that individuals tied their life evaluations (the way they ranked their lives on a scale from worst to best) to the fulfillment of basic life needs.

The satisfaction of higher needs – for social support, respect, autonomy or mastery – was “more strongly related to enjoying life, having more positive feelings and less negative feelings,” Diener said.


An important finding, Diener said, is that the research indicated that people have higher life evaluations when others in society also have their needs fulfilled.

“Thus life satisfaction is not just an individual affair, but depends substantially also on the quality of life of one’s fellow citizens,” he said.

“Our findings suggest that Maslow’s theory is largely correct. In cultures all over the world the fulfillment of his proposed needs correlates with happiness,” Diener said.

“However, an important departure from Maslow’s theory is that we found that a person can report having good social relationships and self-actualization even if their basic needs and safety needs are not completely fulfilled.”


July 5, 2011 | Posted in: Acceptance and Commitment Therapy, anxiety, Cognition, Identity, research, Resilience, Social Psychology, Spirituality

Shyness, Loneliness And Facebook: Is It Easier To Be Friends In Cyberspace?

Read The Original Study In Full Here

Source: psypost.org

Do shy individuals prefer socializing on the internet? And if so, do they become less shy while on the internet and have more friends?

In 2009, the journal CyberPsychology and Behavior published an article that investigated this issue. Specifically, the researchers investigated the relationship between shyness and Facebook use.

The study was conducted by Emily S. Orr and her colleagues from the University of Windsor.

To examine this relationship, 103 undergraduate students from a university in Ontario completed an online questionnaire that assessed self-reported shyness, time spent on Facebook, number of Facebook friends, and attitudes towards Facebook.

The results indicated that shy individuals tended to have fewer Facebook friends but reported spending more time on Facebook. They also reported more favorable attitudes towards Facebook than their less shy peers.

Orr and her colleagues believe that the relative anonymity provided by Facebook may explain the increased use of and favorable attitude towards Facebook.

Shy individuals may find Facebook appealing because of “the anonymity afforded by online communication, specifically, the removal of many of the verbal and nonverbal cues associated with face-to-face interactions,” as Orr and her colleagues explain.

Those who find face-to-face communication uncomfortable may use Facebook as a way to remain connected to the social world while avoiding physical social interaction.

“These findings suggest that although shy individuals do not have as many contacts on their Facebook profiles, they still regard this tool as an appealing method of communication and spend more time on Facebook than do nonshy individuals.”

Reference:

Orr, E. S., Sisic, M., Ross, C., Simmering, M. G., Arsenault, J. M., & Orr, R. R. (2009). The influence of shyness on the use of Facebook in an undergraduate sample. CyberPsychology and Behavior, 12(3), 337-340.


June 30, 2011 | Posted in: anxiety, Bullying, Cognition, depression, Identity, Internet, research, Resilience

Weight Loss Goes Sci-Fi: Using Virtual Reality To Lose Weight?

The University of Houston’s Tracey Ledoux, assistant professor of health and human performance, is using an innovative approach to studying food addictions in hopes of finding strategies to assess and treat them.

“There is a growing body of research that shows that consumption of palatable food stimulates the same reward and motivation centers of the brain that recognized addictive drugs do,” Ledoux said. “These cravings are related to overeating, unsuccessful weight loss and obesity.”

Ledoux and Professor Patrick Bordnick, director of the UH Graduate College of Social Work’s Virtual Reality Lab, will use virtual environments to try to induce food cravings. Bordnick’s body of research has focused on addictive behaviors and phobias and has used virtual reality as a tool to assess and treat them.

In this new investigation, participants will wear a virtual reality helmet to enter a “real-world” restaurant, complete with all the sights, sounds and smells. A joystick will allow them to walk to a buffet and encounter waitstaff and other patrons.

“Virtual reality will allow us to identify food and food-related stimuli of the built, home, school and social environment that cue food cravings, which has public policy, public health and clinical treatment implications,” Ledoux said. “Our study is innovative because it provides a very effective, cost-efficient tool that can be used to increase our understanding of food cravings.”

Ledoux is recruiting normal-weight women who do not have dietary restrictions and are not trying to lose weight. Participants will be invited to two appointments, which may last between 30 minutes and an hour, and will receive a small compensation plus a chance to win a Kindle e-reader. For more information contact Tracey Ledoux at 713-743-1870 or TALedoux@uh.edu.

“Obesity is a pervasive and intractable problem with significant public health and economic costs in our society,” she said. “Finding the elements that promote overeating is critical for reversing the dangerous obesity trend.”

Source: Medical News Today

June 27, 2011 | Posted in: Addiction, anxiety, brain, Cognition, Eating Disorder, Exercise, Health Psychology, Identity, research

Now That I’ve Got Your Attention: What Gets Your Attention?

Source: Association for Psychological Science.

Once we learn the relationship between a cue and its consequences—say, the sound of a bell and the appearance of the white ice cream truck bearing our favorite chocolate cone—do we turn our attention to that bell whenever we hear it? Or do we tuck the information away and marshal our resources to learning other, novel cues—a recorded jingle, or a blue truck?

Psychologists observing “attentional allocation” now agree that the answer is both, and they have arrived at two principles to describe the phenomena. The “predictive” principle says we search for meaningful—important—cues amid the “noise” of our environments. The “uncertainty” principle says we pay most attention to unfamiliar or unpredictable cues, which may yield useful information or surprise us with pleasant or perilous consequences.

Animal studies have supplied evidence for both, and research on humans has shown how predictiveness operates, but not uncertainty. “There was a clear gap in the research,” says Oren Griffiths, a research fellow at the University of New South Wales, in Australia. So he, along with Ameika M. Johnson and Chris J. Mitchell, set out to demonstrate the uncertainty principle in humans.

“We showed that people will pay more attention to a stimulus or a cue if its status as a predictor is unreliable,” he says. The study will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science.

The researchers investigated what is called “negative transfer”—a cognitive process by which a learned association between cue and outcome inhibits any further learning about that cue. We think we know what to expect, so we aren’t paying attention when a different outcome shows up—and we learn that new association more slowly than if the cue or outcome were unpredictable. Negative transfer is a good example of the uncertainty principle at work.

Participants were divided into three groups and administered the “allergist test.” They observed “Mrs. X” receiving a small piece of fruit—say, apple. Using a scroll bar, they predicted her allergic reaction, from none to critical. They then learned that her reaction to the apple was “mild.” Later, when Mrs. X ate the apple, she had a severe reaction, which participants also had to learn to predict.

The critical question was how quickly people learned about the severe reaction. Unsurprisingly, if apple was only ever paired with a severe reaction, learning was fast. But what if apple had previously been shown to be mildly dangerous (i.e., to produce a mild allergic reaction)? In this case, learning about the new severe reaction was slow. This is termed the “negative transfer” effect. The effect did not occur, however, when the initial relationship between apple and allergy was uncertain — if, say, apple was sometimes safe to eat. Under these circumstances, the later association between apple and a severe allergic reaction was learned rapidly.

Why? “They didn’t know what to expect from the cue, so they had to pay more attention to it,” says Griffiths. “That’s because of the uncertainty principle.”


June 22, 2011 | Posted in: brain, Cognition, Identity, research

Bullying: Casey Heynes Speaks Out

People all over the world have reacted to the video of Casey Heynes’ response to years of pent-up anger from bullying. The following article and video of Casey being interviewed on an Australian current affairs program are well worth a watch for any parent, teacher or concerned community member. Please leave any thoughts or comments below.

Source: autismkey.com

Last week, we wrote about a popular video clip involving Casey Heynes, an Australian student who retaliated after being bullied by 12-year-old Ritchard Gale. The video struck a chord with many across the globe and went viral, viewed by millions in the process. We covered the story on our site because of the inordinate number of children with autism who are bullied on a daily basis, and we felt the need to shed additional light on this growing epidemic in schools.
On Sunday, A Current Affair (ACA) Australia aired a fascinating in-depth interview with Casey Heynes (posted below) that gave the back-story leading up to the on-camera bullying episode and subsequent retaliation. In the ACA segment, Heynes describes a chronic pattern of abuse that occurred “practically every day.” His torment included being called “fatty,” taking slaps across the back of the head, and being tripped and bombarded with water bombs at school.
The bullying began all the way back in the second grade and continued until the day Heynes’ incident was caught on camera. The harassment was so severe that Heynes considered suicide as recently as last year. “Bullycide,” as it is called, has become a major problem among teens who are tormented to the point of taking their own lives.
As a parent of a child with autism, I find these bullying stories extremely upsetting, and much more needs to be done to address this seemingly out-of-control problem. If there is any silver lining to the Casey Heynes incident, it is that it has brought significant attention to bullying in schools and will give further ammunition to those seeking legislative changes to address the epidemic. In fact, as we reported the other day, California Congresswoman Jackie Speier will soon be introducing legislation that addresses bullying against special needs students. The video below is a great testament to how a single incident can change the course of how the public perceives a particular issue and the good that can come from it. Indeed, the Casey Heynes story may be the proverbial straw that breaks the camel’s back, providing a catalyst for significant change to help finally protect our children from bullies once and for all.


March 24, 2011 | Posted in: ADHD/ADD, Adolescence, Bullying, Child Behavior

The Human Hall Of Mirrors: Who Are You From Someone Else’s Perspective?


You likely see yourself very differently from the way others see you. A little self-awareness can prevent a lot of misunderstanding.

By Sam Gosling, published on September 01, 2009 – last reviewed on November 26, 2010

“I’ll be there at 2 p.m. sharp,” Kirsten assures me as we set up our next research meeting. I make note of it in my calendar—but I put it down as 3 p.m. It’s not that Kirsten is trying to fool me; she’s just deluded about her time-management skills. After a long history of meetings to which she shows up an hour late, I’ve realized I have to make allowances for her self-blinding optimism. I don’t have unique insight—any of her friends would make the same prediction. In the domain of punctuality, others know Kirsten better than she knows herself.

The difference between how you see yourself and how others see you is not just a matter of egocentrism. Like Kirsten, we all have blind spots. We change our self-conception when we see ourselves through others’ eyes. Part of the discrepancy arises because the outsider’s perspective affords information you yourself miss—like the fact that it looks like you’re scowling when you’re listening, or that you talk over other people.

How well we understand ourselves has a profound impact on our ability to navigate the social realm. In some areas, we know ourselves better than others do. But in other areas, we’re so biased by our need to see ourselves in a good light that we become strangers to ourselves. By soliciting feedback from other people, we can learn more about ourselves and how we’re coming off. Only by understanding how we’re seen can we make sure we’re sending the right signals. To be understood by others, in other words, the first step is understanding ourselves.

There Is No Perfect Point of View

How do you cut through the fog and learn to see yourself—and others—clearly? Different perspectives provide different information on the self. To bring some order to all the things that can be known about you, it helps to divide them into four categories.

First, there are “bright spots”—things known by both you and others, like the fact that you’re politically conservative or talkative. Studies show that traits like extroversion, talkativeness, and dominance are easily observable both to the self and to others. If everyone thinks you’re a chatterbox, you probably are.

Second are “dark spots”—things known by neither you nor others. These could include deep unconscious motives that drive your behaviors, like the fact that your relentless ambition is driven by the need to prove wrong your parents’ assumption that you’d never amount to much. Third are “personal spots”—things known only by you, like your tendency to get anxious in crowds or your contempt for your coworkers. And finally, there are “blind spots”—things known only by others, which can include such factors as your level of hostility and defensiveness, your attractiveness, and your intelligence.

The most interesting are the latter two—personal spots and blind spots—since they involve discrepancies between how we see ourselves and how others see us.

Why You’re Less Transparent Than You Think

We’re not entirely deluded about ourselves. We have pretty unrestricted access, for instance, to what we like and believe; if you think you’re in favor of tighter regulation for car emissions or that Bon Iver is your favorite band right now, who am I to argue? Even if you don’t know the mysterious unconscious motives underlying what you like and do, you’re still the best source of information about your attitudes, beliefs, and preferences.

We often think others are aware of our anxiety or our darkest feelings, but research shows they’re actually poor judges of our emotions, intentions, and thoughts. Thomas Gilovich, a psychologist at Cornell, has found that numerous obstacles and psychological biases stand in the way of knowing how you’re seen by others. We overestimate the extent to which our internal states are detectable to others—a bias known as the “illusion of transparency.” We also overestimate the extent to which our behavior and appearance are noticed and evaluated by others—a bias known as the “spotlight effect.”

We’re good at judging our own self-esteem, optimism and pessimism, and anything to do with how we feel, precisely the states that others have a hard time reading. So, for instance, others may think you’re very calm when in fact you’re so anxious in large groups that your palms sweat and your heart rate soars.

Personal spots exist because others know how you behave, but they don’t know your intentions or feelings, explains Simine Vazire, director of the Personality and Self-Knowledge Lab at Washington University. “If you’re quiet at a party, people don’t know if it’s because you’re arrogant and you think you’re better than everyone else or because you’re shy and don’t know how to talk to people,” she says. “But you know, because you know your thoughts and feelings. So things like anxiety, optimism and pessimism, your tendency to daydream, and your general level of happiness—what’s going on inside of you, rather than things you do—those are things other people have a hard time knowing.”


March 10, 2011 | Posted in: General

I Am Old But I Am Happy: Why Happiness And Emotional Stability Might Improve With Age

ScienceDaily (Oct. 28, 2010) — It’s a prediction often met with worry: In 20 years, there will be more Americans over 60 than under 15. Some fear that will mean an aging society with an increasing number of decrepit, impaired people and fewer youngsters to care for them while also keeping the country’s productivity going.

The concerns are valid, but a new Stanford study shows there’s a silver lining to the graying of our nation. As we grow older, we tend to become more emotionally stable. And that translates into longer, more productive lives that offer more benefits than problems, said Laura Carstensen, the study’s lead author.

“As people age, they’re more emotionally balanced and better able to solve highly emotional problems,” said Carstensen, a psychology professor and director of the Stanford Center on Longevity. “We may be seeing a larger group of people who can get along with a greater number of people. They care more and are more compassionate about problems, and that may lead to a more stable world.”

Between 1993 and 2005, Carstensen and her colleagues tracked about 180 Americans between the ages of 18 and 94. Over the years, some participants died and others aged out of the younger groups, so additional participants were included.

For one week every five years, the study participants carried pagers and were required to immediately respond to a series of questions whenever the devices buzzed. The periodic quizzes were intended to chart how happy, satisfied and comfortable they were at any given time.

Carstensen’s study — which was published online in the journal Psychology and Aging — was coauthored by postdoctoral fellows Bulent Turan and Susanne Scheibe as well as Stanford doctoral students and researchers at Pennsylvania State, Northwestern, the University of Virginia and the University of California’s campuses in San Francisco and Los Angeles.


While previous research has established a correlation between aging and happiness, Carstensen’s study is the first to track the same people over a long period of time to examine how they changed.

The undertaking was an effort to answer questions asked over and over again by social scientists: Are seniors today who say they’re happy simply part of a socioeconomic era that predisposed them to good cheer? Or do most people — whether born and reared in boom times or busts — have it within themselves to reach their golden years with a smile? The answer has important implications for future aging societies.

“Our findings suggest that it doesn’t matter when you were born,” Carstensen said. “In general, people get happier as they get older.”

Over the years, the older subjects reported having fewer negative emotions and more positive ones compared with their younger days. But even with the good outweighing the bad, older people were inclined to report a mix of positive and negative emotions more often than younger test subjects.

“As people get older, they’re more aware of mortality,” Carstensen said. “So when they see or experience moments of wonderful things, that often comes with the realization that life is fragile and will come to an end. But that’s a good thing. It’s a signal of strong emotional health and balance.”

Carstensen (who is 56 and says she’s happier now than she was a few decades ago) attributes the change in older people to her theory of “socio-emotional selectivity” — a scientific way of saying that people invest in what’s most important to them when time is limited.

While teenagers and young adults experience more frustration, anxiety and disappointment over things like test scores, career goals and finding a soul mate, older people typically have made their peace with life’s accomplishments and failures. In other words, they have less ambiguity to stress about.

“This all suggests that as our society is aging, we will have a greater resource,” Carstensen said. “If people become more even-keeled as they age, older societies could be wiser and kinder societies.”

So what, then, do we make of the “grumpy old man” stereotype?

“Most of the grumpy old men out there are grumpy young men who grew old,” Carstensen said. “Aging isn’t going to turn someone grumpy into someone who’s happy-go-lucky. But most people will gradually feel better as they grow older.”


November 4, 2010 | Posted in: Age & Ageing, Identity, mood, Resilience