Peter H Brown Clinical Psychologist

Psychology News & Resources

Why Soccer (Football) IS A “Real Man’s” Game :)


Via Medical News Today

Read The Original Research Article Here

Soccer fans’ testosterone and cortisol levels go up when watching a game, but don’t further increase after a victory, according to a study published in the open access journal PLoS ONE.

The study was conducted with 50 Spanish soccer fans watching the final between Spain and the Netherlands in the 2010 World Cup. The researchers, led by Leander van der Meij of the University of Valencia in Spain and VU University Amsterdam in the Netherlands, measured testosterone and cortisol levels for fans of different ages, genders, and degrees of interest in the game. They found that the increase in testosterone was independent of all these factors, but the increase in cortisol level was more pronounced for dedicated, young, male fans.

The authors write that the testosterone effect is in agreement with the “challenge hypothesis,” as testosterone levels increased to prepare for the game, and the cortisol effect is consistent with the “social self-preservation theory,” as higher cortisol secretion among younger and more dedicated fans suggests that they perceived a particularly strong threat to their own social esteem if their team didn’t win.

Read The Original Research Article Here


April 20, 2012 Posted by | Cognition, Exercise, mood, research, Sex & Sexuality, stress

There IS Hope: Effective Treatment For Borderline Personality Disorder


Source Credit: Mental Health Grace Alliance

The Good News about Borderline Personality Disorder
Date: 06 Feb 2012
Guest Blog: Amanda Smith, Founder of Hope For BPD
After being diagnosed with Borderline Personality Disorder in 2004, Amanda began her own path of recovery. In addition to overseeing the programs of Hope for BPD, she has served as Executive Director of a NAMI affiliate in Florida and currently serves on a local NAMI board of directors in Texas.

Harvard-based researcher Mary Zanarini, PhD has called borderline personality disorder (BPD) the “good prognosis diagnosis” and there are many reasons to be hopeful about the long-term outlook.

Borderline personality disorder—most frequently characterized by rapidly-changing mood swings, unstable relationships, identity disturbance, and chronic feelings of emptiness—is a mental illness with a lifetime prevalence rate of almost 6% among the general population.

Time and again, research has shown that individuals who have been diagnosed with borderline personality disorder can feel better about themselves and their world, are able to work towards academic and vocational goals, sustain healthy relationships, and experience a sense of purpose or meaning in their lives. We also know more now about the neuroplasticity of the brain and understand that our brains continue to change and adapt so that we can learn new behaviors and process information in healthier ways.

There are also many things that increase the likelihood of recovery. These include:

• taking part in an evidence-based treatment that was created specifically to treat BPD such as dialectical behavior therapy (DBT) and mentalization-based treatment (MBT)
• reading books and articles that actively promote recovery
• getting steady support and encouragement from family, friends, church leaders, and other people who have been diagnosed with BPD
• making a commitment to self-care that includes getting enough sleep, eating balanced meals, exercising, and treating physical illnesses
• being brave and asking for help before things become a crisis or an emergency

Family members who are in need of education and support can connect with organizations such as NEA-BPD and take part in their free Family Connections classes or NAMI’s Family-to-Family program.

Remember, the vast majority of people with BPD get better and go on to create lives worth living. If you’re someone who has been diagnosed with the disorder, that means you!

For more information about BPD, please visit Hope for BPD.

Amanda L. Smith
Treatment Consultation for Borderline Personality Disorder and Self-Injury
http://www.hopeforbpd.com

February 11, 2012 Posted by | Cognition, Dialectical Behavior Therapy, Identity, Personality Disorder, research, Resilience, Spirituality

Forget What You’ve Learnt About Learning


Source and authorship credit: “Everything you thought you knew about learning is wrong,” Psychology Today
http://www.psychologytoday.com/

Everything You Thought You Knew About Learning Is Wrong: How, and how NOT, to learn anything. Published on January 28, 2012 by Garth Sundem in Brain Candy.

Learning through osmosis didn’t make the strategies list

Taking notes during class? Topic-focused study? A consistent learning environment? All are exactly opposite the best strategies for learning. Really. I recently had the good fortune to interview Robert Bjork, director of the UCLA Learning and Forgetting Lab, distinguished professor of psychology, and massively renowned expert on packing things into your brain in a way that keeps them from leaking out. And it turns out that everything I thought I knew about learning is wrong. Here’s what he said.

First, think about how you attack a pile of study material.

“People tend to try to learn in blocks,” says Bjork, “mastering one thing before moving on to the next.” But instead he recommends interleaving, a strategy in which, for example, instead of spending an hour working on your tennis serve, you mix in a range of skills like backhands, volleys, overhead smashes, and footwork. “This creates a sense of difficulty,” says Bjork, “and people tend not to notice the immediate effects of learning.”

Instead of making an appreciable leap forward with your serving ability after a session of focused practice, interleaving forces you to make nearly imperceptible steps forward with many skills.

But over time, the sum of these small steps is much greater than the sum of the leaps you would have taken if you’d spent the same amount of time mastering each skill in its turn.

Bjork explains that successful interleaving allows you to “seat” each skill among the others: “If information is studied so that it can be interpreted in relation to other things in memory, learning is much more powerful,” he says.

There’s one caveat: Make sure the mini skills you interleave are related in some higher-order way. If you’re trying to learn tennis, you’d want to interleave serves, backhands, volleys, smashes, and footwork—not serves, synchronized swimming, European capitals, and programming in Java.

Similarly, studying in only one location is great as long as you’ll only be required to recall the information in the same location. If you want information to be accessible outside your dorm room, or office, or nook on the second floor of the library, Bjork recommends varying your study location.

And again, these tips generalize. Interleaving and varying your study location will help whether you’re mastering math skills, learning French, or trying to become a better ballroom dancer.

So too will a somewhat related phenomenon, the spacing effect, first described by Hermann Ebbinghaus in 1885. “If you study and then you wait, tests show that the longer you wait, the more you will have forgotten,” says Bjork. That’s obvious—over time, you forget. But here’s the cool part:

If you study, wait, and then study again, the longer the wait, the more you’ll have learned after this second study session.

Bjork explains it this way: “When we access things from our memory, we do more than reveal it’s there. It’s not like a playback. What we retrieve becomes more retrievable in the future. Provided the retrieval succeeds, the more difficult and involved the retrieval, the more beneficial it is.” Note that there’s a trick implied by “provided the retrieval succeeds”: You should space your study sessions so that the information you learned in the first session remains just barely retrievable. Then, the more you have to work to pull it from the soup of your mind, the more this second study session will reinforce your learning. If you study again too soon, it’s too easy.
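The timing rule implied by “provided the retrieval succeeds” is basically an expanding-interval review schedule: each successful, effortful recall earns a longer wait before the next review. The short Python sketch below is not from Bjork’s lab, and its numbers (a one-day starting gap, a 2.5x growth factor) are arbitrary assumptions; it is only meant to make the scheduling idea concrete.

```python
from datetime import date, timedelta

def review_schedule(start, first_gap_days=1, growth=2.5, sessions=6):
    """Toy expanding-interval schedule: each successful review
    roughly multiplies the gap before the next one.
    The parameters are illustrative, not empirically derived."""
    gap = float(first_gap_days)
    day = start
    plan = []
    for _ in range(sessions):
        day = day + timedelta(days=round(gap))
        plan.append(day)
        gap *= growth  # longer wait -> harder, more beneficial retrieval
    return plan

if __name__ == "__main__":
    for i, d in enumerate(review_schedule(date.today()), 1):
        print(f"Review {i}: {d.isoformat()}")
```

In practice the “right” gap is whatever leaves the material just barely retrievable for you; if a review feels effortless, the previous wait was probably too short.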

Along these lines, Bjork also recommends taking notes just after class, rather than during—forcing yourself to recall a lecture’s information is more effective than simply copying it from a blackboard. “Get out of court stenographer mode,” says Bjork. You have to work for it.

The more you work, the more you learn, and the more you learn, the more awesome you can become.

“Forget about forgetting,” says Robert Bjork.

“People tend to think that learning is building up something in your memory and that forgetting is losing the things you built.

But in some respects the opposite is true.” See, once you learn something, you never actually forget it. Do you remember your childhood best friend’s phone number? No? Well, Dr. Bjork showed that if you were reminded of it, you would relearn it much more quickly and retain it more strongly than if you were asked to memorize a fresh seven-digit number. So this old phone number is not forgotten—it lives somewhere in you—only, recall can be a bit tricky.

And while we count forgetting as the sworn enemy of learning, in some ways that’s wrong, too. Bjork showed that the two live in a kind of symbiosis in which forgetting actually aids recall.

“Because humans have unlimited storage capacity, having total recall would be a mess,” says Bjork. “Imagine you remembered all the phone numbers of all the houses you had ever lived in. When someone asks you your current phone number, you would have to sort it from this long list.” Instead, we forget the old phone numbers, or at least bury them far beneath the ease of recall we gift to our current number. What you thought were sworn enemies are more like distant collaborators.

* Excerpted from Brain Trust: 93 Top Scientists Dish the Lab-Tested Secrets of Surfing, Dating, Dieting, Gambling, Growing Man-Eating Plants and More (Three Rivers Press, March 2012)

@garthsundem
Garth Sundem is the bestselling author of Brain Candy, Geek Logik, and The Geeks’ Guide to World Domination.

January 29, 2012 Posted by | ADHD/ADD, brain, Cognition, Education, research, stress, Technology

Maslow’s Hierarchy Of Facebook

Author Credit: futurecomms.co.uk


The Psychology Behind Facebook
A new study from Boston University has looked at why people use Facebook. But not in the conventional ‘to keep in touch with friends’ or ‘to share photos’ sense. Oh no, this is FAR more interesting.

The study looks at human needs (think Maslow) and attempts to explain where Facebook fits within that context. The authors’ proposition is that Facebook (and other social networks) meets two primary human needs. The first is the need to belong to a sociodemographic group of like-minded people (linked to self-esteem and self-worth). Given this ‘need to belong’, it is hypothesised that there are differences in the way people use and share on Facebook according to cultural factors (individualistic v collectivist cultures). The thing is, some studies have suggested that being active on Facebook may not improve self-esteem, so we may be kidding ourselves if that’s (partly) why we use it!
The second need is the need for self-presentation. Further studies suggest that the person people portray on Facebook IS the real person, not an idealised version. BUT, it’s a person as seen through a socially-desirable filter. In other words, we present ourselves as highly sociable, lovable and popular even if we sit in our bedrooms in the dark playing World of Warcraft ten hours a day. There’s an aspirational element to our online selves. And hey, for me that’s certainly true – I’m a miserable sod in real life!

It’s a fascinating topic area, an understanding of which could really help marketers. Click the Source link below to read more about this study and lots of associated material. But in the meantime, stop showing off on Facebook and start just being yourself :o)

(Source: readwriteweb.com)

January 20, 2012 Posted by | Addiction, Cognition, Identity, Internet, Intimate Relationships, research, Social Psychology, Technology

iPhone Addiction: Does Smart Phone = Dumber You?


Source: Psychology Today, Susan Krauss Whitbourne, Ph.D.

The smartphone is quickly becoming an extension of the human brain. The latest entry into the market, the iPhone 4S, contains a feature named Siri that (according to the Apple website) “understands what you say, knows what you mean, and even talks back. Siri is so easy to use and does so much, you’ll keep finding more and more ways to use it.” Now, I love technology as much as anyone (at least when it’s working), but as a psychologist, I have to join in the voice of critics who yearn for a simpler, less technical age. It’s not that I wish to return to the pre-computer age of paper and pencil, however. Instead, I’m worried that we risk having our brains become vestigial organs. Research on technological tools suggests that offloading our mental functions to these electronic devices could cause our brains to go soft.

Consider the evidence from a study reported in late 2010 by researchers at McGill University. Neuroscientist Veronique Bohbot and her team reported that relying on a global positioning system (GPS) to get to known locations reduces the function of the hippocampus, the “seahorse”-shaped structure in the brain that controls memory and spatial orientation. Participants used to getting around on the basis of their own wits had higher activity and a greater volume in the hippocampus than the older adults using a GPS. What’s more, when it came to their actual performance, the non-GPS users performed better on a memory test. Bohbot recommends that you turn off the GPS when you’re navigating around your hometown and use it only for its actual purpose of finding locations you’ve never been to before. Your hippocampus will thank you, whether you’re 16 or 60.

It’s not much of a leap to extrapolate from the GPS to the smartphone. A normal cellphone can remember numbers for you so that you no longer have to do so. Confess: can you remember the actual cellphone number of the people you call most frequently? We used to rely on our neurons to hold onto these crucial bits of information. Now they reside somewhere out there in the ether. What’s worse is that most people don’t even take the time to write down a new phone number anymore. You call your new acquaintance and your new acquaintance calls you, and the information is automatically stored in your contacts. It’s great for efficiency’s sake, but you’ve now given your working memory one less important exercise. Memory benefits from practice, especially in the crucial stage of encoding.

Let’s move from phone numbers to information in general. People with smartphones no longer have to remember important facts because, when in doubt, they can just tap into Google. When was the last time St. Louis was in the World Series, you wonder? Easy! Just enter a few letters (not even the whole city name) into your “smart” search engine. Your fingers, much less your mind, don’t have to walk very far at all. Trying to give your brain a workout with a crossword puzzle? What’s to stop you from taking a few shortcuts when the answers are right there on your phone? No mental gymnastics necessary.

This leads us to Siri, that seductress of the smartphone. With your iPhone slave on constant standby, you don’t even have to key in your questions. Just say the question, and Siri conjures up the answer in an instant. With a robot at your fingertips, why even bother to look the information up yourself?

The irony is that smartphones have the potential to make our brains sharper, not dumber. Researchers are finding that videogame play involving rapid decision-making can hone your cognitive resources. Older adults, in particular, seem to be able to improve their performance on speeded attention and decision-making tasks when they play certain games. People with a form of amnesia in which they can’t learn new information can also be helped by smartphones, according to a study conducted by Canadian researchers (Svoboda & Richards, 2009).

The problem is not the use of the smartphone itself; the problem comes when the smartphone takes over a function that your brain is perfectly capable of performing. It’s like taking the elevator instead of the stairs; the ride may be quicker, but your muscles won’t get a workout. Smartphones are like mental elevators. Psychologists have known for years that the “use it or lose it” principle is key to keeping your brain functioning in peak condition throughout your life. As we become more and more drawn to these sleeker and sexier gadgets, the trick will be learning how to “use it.” So take advantage of these 5 tips to help your smartphone keep you smart:

1. Don’t substitute your smartphone for your brain. Force yourself to memorize a phone number before you store it, and dial your frequently called numbers from memory whenever possible. If there’s a fact or word definition you can infer, give your brain the job before consulting your electronic helper.

2. Turn off the GPS app when you’re going to familiar places. As the GPS-hippocampus study showed, you need to keep your spatial memory as active as possible by relying on your brain, not your phone, when you’re navigating well-known turf. If you are using the GPS to get around a new location, study a map first. Your GPS may not really know the best route to take (as any proper Bostonian can tell you!).

3. Use your smartphone to keep up with current events. Most people use their smartphones in their leisure time for entertainment. However, with just a few easy clicks, you can just as easily check the headlines, op-eds, and featured stories from respected news outlets around the world. This knowledge will build your mental storehouse of information and make you a better conversationalist as well.

4. Build your social skills with pro-social apps. Some videogames can actually make you a nicer person by strengthening your empathic tendencies. Twitter and Facebook can build social bonds. Staying connected is easier than ever, and keeping those social bonds active provides you with social support. Just make sure you avoid some of the social media traps of over-sharing and FOMO (fear of missing out) syndrome.

5. Turn off your smartphone while you’re driving. No matter how clever you are at multitasking under ordinary circumstances, all experts agree that you need to give your undivided attention to driving when behind the wheel. This is another reason to look at and memorize your route before going someplace new. Fiddling with your GPS can create a significant distraction if you find that it’s given you the wrong information.

Smartphones have their place, and they can make your life infinitely more productive as long as you use yours to supplement, not replace, your brain.

Reference: Svoboda, E., & Richards, B. (2009). Compensating for anterograde amnesia: A new training method that capitalizes on emerging smartphone technologies. Journal of the International Neuropsychological Society, 15(4), 629-638. doi:10.1017/S1355617709090791

Follow Susan Krauss Whitbourne, Ph.D. on Twitter @swhitbo for daily updates on psychology, health, and aging, and please check out my website, www.searchforfulfillment.com, where you can read this week’s Weekly Focus to get additional information, self-tests, and psychology-related links.


October 24, 2011 Posted by | Addiction, brain, Cognition, Health Psychology, Identity, Internet, research, Technology

Contentment: Is Spare Time > Spare Stuff?

Which is more desirable: too little or too much spare time on your hands? For happiness, the answer is somewhere in the middle, according to Chris Manolis and James Roberts from Xavier University in Cincinnati, OH, and Baylor University in Waco, TX. Their work shows that materialistic young people with compulsive buying issues need just the right amount of spare time to feel happier. The study is published online in Springer’s journal Applied Research in Quality of Life.

We now live in a society where time is of the essence. The perception of a shortage of time, or time pressure, is linked to lower levels of happiness. At the same time, our consumer culture, characterized by materialism and compulsive buying, also has an effect on people’s happiness: the desire for materialistic possessions leads to lower life satisfaction.

Given the importance of time in contemporary life, Manolis and Roberts investigate, for the first time, the effect of perceived time affluence (the amount of spare time one perceives he or she has) on the consequences of materialistic values and compulsive buying for adolescent well-being.

A total of 1,329 adolescents from a public high school in a large metropolitan area of the Midwestern United States took part in the study. The researchers measured how much spare time the young people thought they had; the extent to which they held materialistic values and had compulsive buying tendencies; and their subjective well-being, or self-rated happiness.

Manolis and Roberts’ findings confirm that both materialism and compulsive buying have a negative impact on teenagers’ happiness. The more materialistic they are and the more they engage in compulsive buying, the lower their happiness levels.

In addition, time affluence moderates the negative consequences of both materialism and compulsive buying in this group. Specifically, moderate time affluence, i.e. being neither too busy nor having too much spare time, is linked to higher levels of happiness in materialistic teenagers and those who are compulsive buyers.

Adolescents who suffer from time pressure and who think materialistically and/or buy compulsively feel less happy than their counterparts. Equally, having too much free time on their hands exacerbates the negative effects of material values and compulsive buying on adolescent happiness. The authors conclude: “Living with a sensible, balanced amount of free time promotes well-being not only directly, but also by helping to alleviate some of the negative side effects associated with living in our consumer-orientated society.”
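For readers curious what “moderates” means statistically, the standard translation is a regression in which well-being is predicted from materialism (or compulsive buying), time affluence, and their product, often with a squared term when the benefit peaks at a moderate level. The sketch below uses made-up data and the statsmodels formula API purely to illustrate that idea; it is not the authors’ actual analysis, and every coefficient in it is invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Simulated (not real) adolescent data
df = pd.DataFrame({
    "materialism": rng.normal(0, 1, n),
    "time_affluence": rng.normal(0, 1, n),
})

# Build in a pattern like the one described: materialism lowers well-being,
# extreme (very low or very high) time affluence hurts, and time affluence
# changes how steep the materialism effect is.
df["well_being"] = (
    -0.5 * df["materialism"]
    - 0.3 * df["time_affluence"] ** 2
    + 0.2 * df["materialism"] * df["time_affluence"]
    + rng.normal(0, 1, n)
)

# "Moderation" shows up as the materialism:time_affluence interaction term;
# the I(time_affluence ** 2) term captures the "moderate is best" curvature.
model = smf.ols(
    "well_being ~ materialism * time_affluence + I(time_affluence ** 2)",
    data=df,
).fit()
print(model.summary().tables[1])
```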

Manolis, C., & Roberts, J. A. (2011). Subjective well-being among adolescent consumers: The effects of materialism, compulsive buying, and time affluence. Applied Research in Quality of Life. doi:10.1007/s11482-011-9155-5


October 23, 2011 Posted by | Acceptance and Commitment Therapy, Age & Ageing, depression, Exercise, Health Psychology, Identity, mood, Positive Psychology, research, Resilience, stress

“They All Look Alike To Me”: Here’s Why “They” Do

Source: ScienceDaily (July 1, 2011)

Read The Original Article Here

Northwestern University researchers have provided new biological evidence suggesting that the brain works differently when memorizing the face of a person from one’s own race than when memorizing a face from another race.

Their study — which used EEG recordings to measure brain activity — sheds light on a well-documented phenomenon known as the “other-race effect.” One of the most replicated psychology findings, the other-race effect finds that people are less likely to remember a face from a racial group different from their own.

“Scientists have put forward numerous ideas about why people do not recognize other-race faces as well as same-race faces,” says Northwestern psychology professor Ken Paller, who with psychology professor Joan Chiao and Heather Lucas co-authored “Why some faces won’t be remembered: Brain potentials illuminate successful versus unsuccessful encoding for same-race and other-race faces.”

The discovery of a neural marker of successful encoding of other-race faces will help put these ideas to the test, according to Paller, who directs the Cognitive Neuroscience Laboratory in the Weinberg College of Arts and Sciences.

“The ability to accurately remember faces is an important social skill with potentially serious consequences,” says doctoral student Lucas, lead author of the recently published study in Frontiers in Human Neuroscience. “It’s merely embarrassing to forget your spouse’s boss, but when an eyewitness incorrectly remembers a face, the consequence can be a wrongful criminal conviction,” she adds.

The Northwestern team found that brain activity increases in the very first 200 to 250 milliseconds upon seeing both same-race and other-race faces. To their surprise, however, they found that the amplitude of that increased brain activity only predicts whether an other-race face (not a same-race face) is later remembered.

“There appears to be a critical phase shortly after an other-race face appears that determines whether or not that face will be remembered or forgotten,” Lucas says. “In other words, the process of laying down a memory begins almost immediately after one first sees the face.”

Previous research has associated this very early phase — what is known as the N200 brain potential — with the perceptual process of individuation. That process involves identifying personally unique facial features such as the shape of the eyes and nose and the spatial configuration of various facial features.

When the researchers asked the 18 white study participants to view same-race faces and to commit them to memory, the individuation process indexed by N200 appeared “almost automatic — so robust and reliable that it actually was irrelevant as to whether a face was remembered or not,” says Lucas.

Minutes later, the participants were given a recognition test that included new faces along with some that were previously viewed. The researchers analyzed brain activity during initial face viewing as a function of whether or not each face was ultimately remembered or forgotten on the recognition test.

The N200 waves were large for all same-race faces, regardless of whether or not they later were successfully remembered. In contrast, N200 waves were larger for other-race faces that were remembered than for other-race faces that were forgotten.
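The logic of this comparison, often called a “subsequent memory” analysis, is simple even if the EEG recording is not: average the voltage in the window of interest (roughly 200-250 ms for the N200) separately for trials the participant later remembered and trials they later forgot, then compare. The sketch below runs that comparison on synthetic single-channel data with NumPy and SciPy; the sampling rate, window, and amplitudes are placeholder assumptions, not the Northwestern team’s pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
sfreq = 500                               # samples per second (assumed)
times = np.arange(-0.2, 0.8, 1 / sfreq)   # epoch from -200 ms to +800 ms

def fake_epochs(n_trials, n200_amp):
    """Synthetic single-channel epochs with an N200-like negative deflection."""
    noise = rng.normal(0, 2.0, (n_trials, times.size))
    bump = n200_amp * np.exp(-((times - 0.225) ** 2) / (2 * 0.02 ** 2))
    return noise + bump

remembered = fake_epochs(60, n200_amp=-4.0)   # larger (more negative) N200
forgotten = fake_epochs(60, n200_amp=-2.0)

# Mean amplitude in the 200-250 ms window, one value per trial
win = (times >= 0.200) & (times <= 0.250)
rem_amp = remembered[:, win].mean(axis=1)
for_amp = forgotten[:, win].mean(axis=1)

t, p = stats.ttest_ind(rem_amp, for_amp)
print(f"remembered mean = {rem_amp.mean():.2f} uV, "
      f"forgotten mean = {for_amp.mean():.2f} uV, t = {t:.2f}, p = {p:.4f}")
```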

Of course, not all same-race faces were successfully recognized, the researchers say. Accordingly, their study also identified brain activity that predicted whether or not a same-race face would be remembered. A specific brain wave starting at about 300 milliseconds and lasting for several hundred milliseconds was associated with what the psychologists call “elaborative encoding.”

In contrast to individuation (which involves rapidly identifying unique physical attributes from faces), elaborative encoding is a more deliberate process of inferring attributes. For example, you might note that a face reminds you of someone you know, that its expression appears friendly or shy, or it looks like the face of a scientist or police officer.

Making these types of social inferences increases the likelihood that a face will be remembered.

“However, this strategy only works if the process of individuation also occurred successfully — that is, if the physical attributes unique to a particular face already have been committed to memory,” Lucas says. “And our study found that individuation is not always engaged with other-race faces.”

Why is individuation so fragile for other-race faces? One possibility, the researchers say, is that many people simply have less practice seeing and remembering other-race faces.

“People tend to have more frequent and extensive interactions with same-race than with other-race individuals, particularly racial majority members,” Lucas says. As a result, their brains may be less adept at finding the facial information that distinguishes other-race faces from one another compared to distinguishing among faces of their own racial group.

Another possible explanation involves “social categorization,” or the tendency to group others into social categories by race. “Prior research has found that when we label and group others according to race we end up focusing more on attributes that group members tend to have in common — such as skin color — and less on attributes that individuate one group member from others,” Lucas says.

As a result, smaller N200 brain potentials for other-race faces — particularly those that were not remembered later — could indicate that race-specifying features of these faces were given more attention.

The Northwestern researchers expect future research to build on their findings in the continuing effort to better understand the other-race effect. “That research also will need to focus more on face recognition in minorities, given that the bulk of research to date has examined majority-white populations,” Lucas says.

Read The Original Article Here

July 6, 2011 Posted by | Cognition, Identity, research, Social Psychology

The Great Pyramid: Did Maslow Get The Hierarchy Of Needs Right?

Maslow’s Pyramid of Human Needs Put to the Test

Source: University of Illinois at Urbana-Champaign

For more than 60 years, psychologist Abraham Maslow’s hierarchy of human needs has served as a model by which many judge life satisfaction. But his theory has never been subjected to scientific validation.

A new global study tested Maslow’s concepts and sequence, in a way that reflects life in the 21st century.

“Anyone who has ever completed a psychology class has heard of Abraham Maslow and his theory of needs,” said University of Illinois professor emeritus of psychology Dr. Ed Diener, who led the study. “But the nagging question has always been: Where is the proof? Students learn the theory, but scientific research backing this theory is rarely mentioned.”

Maslow’s pyramid of human needs begins with a base that signifies an individual’s basic needs (for food, sleep and sex). Safety and security come next, then love and belonging, then esteem and, finally, at the pyramid’s peak, a quality he called “self-actualization.”

Maslow proposed that people who have these needs fulfilled should be happier than those who don’t.

In the new study, U of I researchers put Maslow’s ideas to the test with data from 123 countries representing every major region of the world.

To determine current perceptions, the researchers turned to the Gallup World Poll, which conducted surveys in 155 countries from 2005 to 2010, and included questions about money, food, shelter, safety, social support, feeling respected, being self-directed, having a sense of mastery, and the experience of positive or negative emotions.

The researchers found that the fulfillment of the diverse needs Maslow defined does appear to be universal and important to individual happiness. But the order in which “higher” and “lower” needs are met has little bearing on how much they contribute to life satisfaction and enjoyment, Diener said.

They also found that individuals identified life satisfaction (the way an individual ranked his or her life on a scale from worst to best) with fulfillment of basic life needs.

The satisfaction of higher needs – for social support, respect, autonomy or mastery – was “more strongly related to enjoying life, having more positive feelings and less negative feelings,” Diener said.

 

An important finding, Diener said, is that the research indicated that people have higher life evaluations when others in society also have their needs fulfilled.

“Thus life satisfaction is not just an individual affair, but depends substantially also on the quality of life of one’s fellow citizens,” he said.

“Our findings suggest that Maslow’s theory is largely correct. In cultures all over the world the fulfillment of his proposed needs correlates with happiness,” Diener said.

“However, an important departure from Maslow’s theory is that we found that a person can report having good social relationships and self-actualization even if their basic needs and safety needs are not completely fulfilled.”


July 5, 2011 Posted by | Acceptance and Commitment Therapy, anxiety, Cognition, Identity, research, Resilience, Social Psychology, Spirituality

Shyness, Loneliness And Facebook: Is It Easier To Be Friends In Cyberspace?

Read The Original Study In Full Here

Source: psypost.org

Do shy individuals prefer socializing on the internet? And if so, do they become less shy while on the internet and have more friends?

In 2009, the journal CyberPsychology and Behavior published an article that investigated this issue. Specifically, the researchers investigated the relationship between shyness and Facebook use.

The study was conducted by Emily S. Orr and her colleagues from the University of Windsor.

To examine this relationship, 103 undergraduate students from a university in Ontario completed an online questionnaire that assessed self-reported shyness, time spent on Facebook, number of Facebook friends, and attitudes towards Facebook.

The results of this questionnaire indicated that shy individuals tended to have fewer Facebook friends and reported spending more time on Facebook. They were also more likely to have a more favorable attitude towards Facebook than those who were less shy.

Orr and her colleagues believe that the relative anonymity provided by Facebook may explain the increased use of and favorable attitude towards Facebook.

Shy individuals may find Facebook appealing because of “the anonymity afforded by online communication, specifically, the removal of many of the verbal and nonverbal cues associated with face-to-face interactions,” as Orr and her colleagues explain.

Those who find face-to-face communication uncomfortable may use Facebook as a way to remain connected to the social world while avoiding physical social interaction.

“These findings suggest that although shy individuals do not have as many contacts on their Facebook profiles, they still regard this tool as an appealing method of communication and spend more time on Facebook than do nonshy individuals.”

Reference:

Orr, E. S., Sisic, M., Ross, C., Simmering, M. G., Arsenault, J. M., & Orr, R. R. (2009). The influence of shyness on the use of Facebook in an undergraduate sample. CyberPsychology and Behavior, 12(3), 337-340.


Read The Original Study In Full Here


June 30, 2011 Posted by | anxiety, Bullying, Cognition, depression, Identity, Internet, research, Resilience

Weight Loss Goes Sci-Fi: Using Virtual Reality To Lose Weight?

The University of Houston’s Tracey Ledoux, assistant professor of health and human performance, is using an innovative approach to studying food addictions in hopes of finding strategies to assess and treat them.

“There is a growing body of research that shows that consumption of palatable food stimulates the same reward and motivation centers of the brain that recognized addictive drugs do,” Ledoux said. “These cravings are related to overeating, unsuccessful weight loss and obesity.”

Ledoux and Professor Patrick Bordnick, director of the UH Graduate College of Social Work’s Virtual Reality Lab, will use virtual environments to try to induce food cravings. Bordnick’s body of research has focused on addictive behaviors and phobias and has used virtual reality as a tool to assess and treat them.

In this new investigation, participants will wear a virtual reality helmet to enter a “real-world” restaurant, complete with all the sights, sounds and smells. A joystick will allow them to walk to a buffet and encounter waitstaff and other patrons.

“Virtual reality will allow us to identify food and food-related stimuli of the built, home, school and social environment that cue food cravings, which has public policy, public health and clinical treatment implications,” Ledoux said. “Our study is innovative because it provides a very effective, cost-efficient tool that can be used to increase our understanding of food cravings.”

Ledoux is recruiting normal-weight women who do not have dietary restrictions and who are not trying to lose weight. Participants will be invited to two appointments, which may last between 30 minutes and an hour, and will receive a small compensation plus a chance to win a Kindle e-reader. For more information contact Tracey Ledoux at 713-743-1870 or TALedoux@uh.edu.

“Obesity is a pervasive and intractable problem with significant public health and economic costs in our society,” she said. “Finding the elements that promote overeating is critical for reversing the dangerous obesity trend.”

Source: Medical News Today

June 27, 2011 Posted by | Addiction, anxiety, brain, Cognition, Eating Disorder, Exercise, Health Psychology, Identity, research