iPhone Addiction: Does Smart Phone = Dumber You?
It’s not much of a leap to extrapolate from the GPS to the smartphone. A normal cellphone can remember numbers for you so that you no longer have to do so. Confess: can you remember the actual cellphone numbers of the people you call most frequently? We used to rely on our neurons to hold onto these crucial bits of information. Now they reside somewhere out there in the ether. What’s worse, most people don’t even take the time to write down a new phone number anymore. You call your new acquaintance, your new acquaintance calls you, and the information is automatically stored in your contacts. It’s great for efficiency’s sake, but you’ve now given your working memory one less important exercise. Memory benefits from practice, especially at the crucial stage of encoding.
Let’s move from phone numbers to information in general. People with smartphones no longer have to remember important facts because, when in doubt, they can just tap into Google. When was the last time St. Louis was in the World Series, you wonder? Easy! Just enter a few letters (not even the whole city name) into your “smart” search engine. Neither your fingers nor your mind have to walk very far at all. Trying to give your brain a workout with a crossword puzzle? What’s to stop you from taking a few shortcuts when the answers are right there on your phone? No mental gymnastics necessary.
This leads us to Siri, that seductress of the smartphone. With your iPhone slave on constant standby, you don’t even have to key in your questions. Just say the question, and Siri conjures up the answer in an instant. With a robot at your fingertips, why even bother to look the information up yourself?
The irony is that smartphones have the potential to make our brains sharper, not dumber. Researchers are finding that videogame play involving rapid decision-making can hone your cognitive resources. Older adults, in particular, seem to improve their speed on attention and decision-making tasks when they play certain games. People with a form of amnesia that prevents them from learning new information can also be helped by smartphones, according to a study conducted by Canadian researchers (Svoboda & Richards, 2009).
The problem is not the use of the smartphone itself; the problem comes when the smartphone takes over a function that your brain is perfectly capable of performing. It’s like taking the elevator instead of the stairs: the ride may be quicker, but your muscles won’t get a workout. Smartphones are mental elevators. Psychologists have known for years that the “use it or lose it” principle is key to keeping your brain functioning at its peak throughout your life. As we become more and more drawn to these sleeker and sexier gadgets, the trick will be learning how to “use it.” So take advantage of these five tips to help your smartphone keep you smart:
1. Don’t substitute your smartphone for your brain. Force yourself to memorize a phone number before you store it, and dial your frequently called numbers from memory whenever possible. If there’s a fact or word definition you can infer, give your brain the job before consulting your electronic helper.
2. Turn off the GPS app when you’re going to familiar places. As the GPS-hippocampus study showed, you need to keep your spatial memory active by relying on your brain, not your phone, when you’re navigating well-known turf. If you are using the GPS to get around a new location, study a map first. Your GPS may not really know the best route to take (as any proper Bostonian can tell you!).
3. Use your smartphone to keep up with current events. Most people use their smartphones in their leisure time for entertainment. With just a few clicks, however, you can just as easily check the headlines, op-eds, and featured stories from respected news outlets around the world. This knowledge will build your mental storehouse of information and make you a better conversationalist as well.
4. Build your social skills with pro-social apps. Some videogames can actually make you a nicer person by strengthening your empathic tendencies. Twitter and Facebook can build social bonds, and staying connected is easier than ever; keeping those bonds active provides you with social support. Just make sure you avoid the social media traps of over-sharing and FOMO (fear of missing out).
5. Turn off your smartphone while you’re driving. No matter how clever you are at multitasking under ordinary circumstances, experts agree that you need to give your undivided attention to the road when you’re behind the wheel. This is another reason to look at and memorize your route before going someplace new. Fiddling with your GPS can create a significant distraction if you find it has given you the wrong information.
Smartphones have their place and can make your life infinitely more productive, as long as you use yours to supplement, not replace, your brain.
Reference: Svoboda, E., & Richards, B. (2009). Compensating for anterograde amnesia: A new training method that capitalizes on emerging smartphone technologies. Journal of the International Neuropsychological Society, 15(4), 629-638. doi:10.1017/S1355617709090791
Follow Susan Krauss Whitbourne, Ph.D. on Twitter @swhitbo for daily updates on psychology, health, and aging, and please check out my website, www.searchforfulfillment.com, where you can read this week’s Weekly Focus for additional information, self-tests, and psychology-related links.
Related articles
- Why Do So Many Robots Have A Woman’s Voice? [Technology] (jezebel.com)
- Siri lets strangers control some iPhone functions (redtape.msnbc.msn.com)
“They All Look Alike To Me”: Here’s Why “They” Do
Source: ScienceDaily (July 1, 2011)
Northwestern University researchers have provided new biological evidence suggesting that the brain works differently when memorizing the face of a person from one’s own race than when memorizing a face from another race.
Their study — which used EEG recordings to measure brain activity — sheds light on a well-documented phenomenon known as the “other-race effect.” One of the most replicated psychology findings, the other-race effect finds that people are less likely to remember a face from a racial group different from their own.
“Scientists have put forward numerous ideas about why people do not recognize other-race faces as well as same-race faces,” says Northwestern psychology professor Ken Paller, who with psychology professor Joan Chiao and Heather Lucas co-authored “Why some faces won’t be remembered: Brain potentials illuminate successful versus unsuccessful encoding for same-race and other-race faces.”
The discovery of a neural marker of successful encoding of other-race faces will help put these ideas to the test, according to Paller, who directs the Cognitive Neuroscience Laboratory in the Weinberg College of Arts and Sciences.
“The ability to accurately remember faces is an important social skill with potentially serious consequences,” says doctoral student Lucas, lead author of the recently published study in Frontiers in Human Neuroscience. “It’s merely embarrassing to forget your spouse’s boss, but when an eyewitness incorrectly remembers a face, the consequence can be a wrongful criminal conviction,” she adds.
The Northwestern team found that brain activity increases within the first 200 to 250 milliseconds after a person sees either a same-race or an other-race face. To their surprise, however, the amplitude of that early activity predicted later memory only for other-race faces, not for same-race faces.
“There appears to be a critical phase shortly after an other-race face appears that determines whether or not that face will be remembered or forgotten,” Lucas says. “In other words, the process of laying down a memory begins almost immediately after one first sees the face.”
Previous research has associated this very early phase — what is known as the N200 brain potential — with the perceptual process of individuation. That process involves identifying personally unique facial features, such as the shape of the eyes and nose, as well as their spatial configuration.
When the researchers asked the 18 white study participants to view same-race faces and to commit them to memory, the individuation process indexed by N200 appeared “almost automatic — so robust and reliable that it actually was irrelevant as to whether a face was remembered or not,” says Lucas.
Minutes later, the participants were given a recognition test that included new faces along with some that were previously viewed. The researchers analyzed brain activity during initial face viewing as a function of whether or not each face was ultimately remembered or forgotten on the recognition test.
The N200 waves were large for all same-race faces, regardless of whether or not they later were successfully remembered. In contrast, N200 waves were larger for other-race faces that were remembered than for other-race faces that were forgotten.
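The logic of this subsequent-memory analysis — sorting trials by whether the face was later remembered, then comparing early brain activity across the two bins — can be sketched in a few lines. The snippet below is a minimal illustration, not the study’s actual pipeline: the epoch array, sampling rate, and condition labels are hypothetical stand-ins for real single-electrode EEG data.

```python
import numpy as np

# Hypothetical stand-ins: single-electrode EEG epochs (one row per face
# trial, in microvolts, 500 Hz, stimulus onset at sample 0) plus labels
# for face type and later recognition-test outcome.
rng = np.random.default_rng(0)
fs = 500
n_trials, n_samples = 200, 400            # 800 ms of data per trial
epochs = rng.normal(size=(n_trials, n_samples))
remembered = rng.random(n_trials) < 0.6   # later recognized?
other_race = rng.random(n_trials) < 0.5   # other-race face?

# Mean amplitude in the N200 window (200-250 ms after face onset)
w0, w1 = int(0.200 * fs), int(0.250 * fs)
n200 = epochs[:, w0:w1].mean(axis=1)      # one value per trial

# Subsequent-memory contrast, computed separately for each face type
for label, mask in [("same-race", ~other_race), ("other-race", other_race)]:
    hit = n200[mask & remembered].mean()
    miss = n200[mask & ~remembered].mean()
    print(f"{label}: remembered {hit:+.2f} uV vs forgotten {miss:+.2f} uV")
```

On real data, the pattern the study reports would show up as a remembered-versus-forgotten amplitude gap for other-race faces only; on the random numbers above, both gaps hover near zero.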
Of course, not all same-race faces were successfully recognized, the researchers say. Accordingly, their study also identified brain activity that predicted whether or not a same-race face would be remembered. A specific brain wave starting at about 300 milliseconds and lasting for several hundred milliseconds was associated with what the psychologists call “elaborative encoding.”
In contrast to individuation (which involves rapidly identifying unique physical attributes of faces), elaborative encoding is a more deliberate process of inferring attributes. For example, you might note that a face reminds you of someone you know, that its expression appears friendly or shy, or that it looks like the face of a scientist or police officer.
Making these types of social inferences increases the likelihood that a face will be remembered.
“However, this strategy only works if the process of individuation also occurred successfully — that is, if the physical attributes unique to a particular face already have been committed to memory,” Lucas says. “And our study found that individuation is not always engaged with other-race faces.”
Why is individuation so fragile for other-race faces? One possibility, the researchers say, is that many people simply have less practice seeing and remembering other-race faces.
“People tend to have more frequent and extensive interactions with same-race than with other-race individuals, particularly racial majority members,” Lucas says. As a result, their brains may be less adept at finding the facial information that distinguishes other-race faces from one another compared to distinguishing among faces of their own racial group.
Another possible explanation involves “social categorization,” or the tendency to group others into social categories by race. “Prior research has found that when we label and group others according to race we end up focusing more on attributes that group members tend to have in common — such as skin color — and less on attributes that individuate one group member from others,” Lucas says.
As a result, smaller N200 brain potentials for other-race faces — particularly those that were not remembered later — could indicate that race-specifying features of these faces were given more attention.
The Northwestern researchers expect future research to build on their findings in the continuing effort to better understand the other-race effect. “That research also will need to focus more on face recognition in minorities, given that the bulk of research to date has examined majority-white populations,” Lucas says.
Related articles
- How social pressure can affect what we remember: Scientists track brain activity as false memories are formed (sciencedaily.com)
- Consciousness – correlation is not a cause (mindblog.dericbownds.net)
- Shyness, Loneliness And Facebook: Is It Easier To Be Friends In Cyberspace? (peterhbrown.wordpress.com)
The Great Pyramid: Did Maslow Get The Hierarchy Of Needs Right?
Maslow’s Pyramid of Human Needs Put to the Test
Source: University of Illinois at Urbana-Champaign
For more than 60 years, psychologist Abraham Maslow’s hierarchy of human needs has served as a model by which many judge life satisfaction. But his theory has never been subjected to scientific validation.
A new global study tested Maslow’s concepts and sequence in a way that reflects life in the 21st century.
“Anyone who has ever completed a psychology class has heard of Abraham Maslow and his theory of needs,” said University of Illinois professor emeritus of psychology Dr. Ed Diener, who led the study. “But the nagging question has always been: Where is the proof? Students learn the theory, but scientific research backing this theory is rarely mentioned.”
Maslow’s pyramid of human needs begins with a base that signifies an individual’s basic needs (for food, sleep and sex). Safety and security come next, then love and belonging, then esteem and, finally, at the pyramid’s peak, a quality he called “self-actualization.”
Maslow proposed that people who have these needs fulfilled should be happier than those who don’t.
In the new study, U of I researchers put Maslow’s ideas to the test with data from 123 countries representing every major region of the world.
To determine current perceptions, the researchers turned to the Gallup World Poll, which conducted surveys in 155 countries from 2005 to 2010, and included questions about money, food, shelter, safety, social support, feeling respected, being self-directed, having a sense of mastery, and the experience of positive or negative emotions.
The researchers found that the fulfillment of a diversity of needs, as defined by Maslow, does appear to be universal and important to individual happiness. But the order in which “higher” and “lower” needs are met has little bearing on how much they contribute to life satisfaction and enjoyment, Diener said.
They also found that life satisfaction (the way an individual ranked his or her life on a scale from worst to best) was most closely associated with the fulfillment of basic life needs.
The satisfaction of higher needs – for social support, respect, autonomy or mastery – was “more strongly related to enjoying life, having more positive feelings and less negative feelings,” Diener said.
An important finding, Diener said, is that the research indicated that people have higher life evaluations when others in society also have their needs fulfilled.
“Thus life satisfaction is not just an individual affair, but depends substantially also on the quality of life of one’s fellow citizens,” he said.
“Our findings suggest that Maslow’s theory is largely correct. In cultures all over the world the fulfillment of his proposed needs correlates with happiness,” Diener said.
“However, an important departure from Maslow’s theory is that we found that a person can report having good social relationships and self-actualization even if their basic needs and safety needs are not completely fulfilled.”
Related articles
- Researchers look for ingredients of happiness around the world (empressoftheglobaluniverse.wordpress.com)
- A Modified Maslow (Elizabeth Bryson Blog 9) (visualargument.wordpress.com)
Weight Loss Goes Sci-Fi: Using Virtual Reality To Lose Weight?
The University of Houston’s Tracey Ledoux, assistant professor of health and human performance, is using an innovative approach to studying food addictions in hopes of finding strategies to assess and treat them.
“There is a growing body of research that shows that consumption of palatable food stimulates the same reward and motivation centers of the brain that recognized addictive drugs do,” Ledoux said. “These cravings are related to overeating, unsuccessful weight loss and obesity.”
Ledoux and Professor Patrick Bordnick, director of the UH Graduate College of Social Work’s Virtual Reality Lab, will use virtual environments to try to induce food cravings. Bordnick’s body of research has focused on addictive behaviors and phobias and has used virtual reality as a tool to assess and treat them.
In this new investigation, participants will wear a virtual reality helmet to enter a “real-world” restaurant, complete with all the sights, sounds and smells. A joystick will allow them to walk to a buffet and encounter waitstaff and other patrons.
“Virtual reality will allow us to identify food and food-related stimuli of the built, home, school and social environment that cue food cravings, which has public policy, public health and clinical treatment implications,” Ledoux said. “Our study is innovative because it provides a very effective, cost-efficient tool that can be used to increase our understanding of food cravings.”
Ledoux is recruiting normal-weight women who do not have dietary restrictions and are not trying to lose weight. Participants will be invited to two appointments, which may last between 30 minutes and an hour, and will receive a small compensation plus a chance to win a Kindle e-reader. For more information contact Tracey Ledoux at 713-743-1870 or TALedoux@uh.edu.
“Obesity is a pervasive and intractable problem with significant public health and economic costs in our society,” she said. “Finding the elements that promote overeating is critical for reversing the dangerous obesity trend.”
Source: Medical News Today
Related articles
- How can you learn what it feels like to be pregnant? (fatherinastrangeland.wordpress.com)
- Virtual Reality Video Games Really Do Work As A Painkiller (businessinsider.com)
Now That I’ve Got Your Attention: What Gets Your Attention?
Source: Association for Psychological Science.
Once we learn the relationship between a cue and its consequences—say, the sound of a bell and the appearance of the white ice cream truck bearing our favorite chocolate cone—do we turn our attention to that bell whenever we hear it? Or do we tuck the information away and marshal our resources to learning other, novel cues—a recorded jingle, or a blue truck?
Psychologists observing “attentional allocation” now agree that the answer is both, and they have arrived at two principles to describe the phenomena. The “predictive” principle says we search for meaningful—important—cues amid the “noise” of our environments. The “uncertainty” principle says we pay most attention to unfamiliar or unpredictable cues, which may yield useful information or surprise us with pleasant or perilous consequences.
Animal studies have supplied evidence for both, and research on humans has shown how predictiveness operates, but not uncertainty. “There was a clear gap in the research,” says Oren Griffiths, a research fellow at the University of New South Wales, in Australia. So he, along with Ameika M. Johnson and Chris J. Mitchell, set out to demonstrate the uncertainty principle in humans.
“We showed that people will pay more attention to a stimulus or a cue if its status as a predictor is unreliable,” he says. The study will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science.
The researchers investigated what is called “negative transfer”—a cognitive process by which a learned association between cue and outcome inhibits any further learning about that cue. We think we know what to expect, so we aren’t paying attention when a different outcome shows up—and we learn that new association more slowly than if the cue or outcome were unpredictable. Negative transfer is a good example of the uncertainty principle at work.
Participants were divided into three groups and administered the “allergist test.” They observed “Mrs. X” receiving a small piece of fruit — say, apple. Using a scroll bar, they predicted her allergic reaction, from none to critical. They then learned that her reaction to the apple was “mild.” Later, when Mrs. X ate the apple, she had a severe reaction, which participants also had to learn to predict.
The critical question was how quickly people learned about the severe reaction. Unsurprisingly, if apple had only ever been paired with a severe reaction, learning was fast. But what if apple had previously been shown to be mildly dangerous (i.e., to produce a mild allergic reaction)? In this case, learning about the new severe reaction was slow. This is the “negative transfer” effect. The effect did not occur, however, when the initial relationship between apple and allergy was uncertain — if, say, apple was sometimes safe to eat. Under those circumstances, the later association between apple and a severe allergic reaction was learned rapidly.
Why? “They didn’t know what to expect from the cue, so they had to pay more attention to it,” says Griffiths. “That’s because of the uncertainty principle.”
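The uncertainty principle has a classic formal counterpart in the Pearce-Hall (1980) model of associative learning, in which the attention (“associability”) a cue commands tracks its recent absolute prediction error. The toy simulation below is my own sketch in those terms — the article specifies no model, and every parameter here is an illustrative assumption — but it reproduces the qualitative pattern: a cue reliably paired with a mild outcome loses attention and updates only sluggishly when the outcome turns severe (negative transfer), while a cue with an uncertain history keeps attention high.

```python
def pearce_hall(outcomes, s=0.5, alpha=1.0, gamma=0.5):
    """Simulate cue-outcome learning; return per-trial (V, alpha) pairs.

    V     -- associative strength: the outcome magnitude the cue predicts
    alpha -- associability: the attention the cue commands, pulled toward
             the recent absolute prediction error on every trial
    """
    v, history = 0.0, []
    for lam in outcomes:                  # lam = outcome magnitude this trial
        v += s * alpha * (lam - v)        # learning update (error x attention)
        alpha = gamma * abs(lam - v) + (1 - gamma) * alpha  # attention update
        history.append((v, alpha))
    return history

MILD, NONE, SEVERE = 0.3, 0.0, 1.0
stage2 = [SEVERE] * 5                     # apple now causes a severe reaction
conditions = {
    "reliable (always mild)": [MILD] * 10 + stage2,
    "uncertain (mild or nothing)": [MILD, NONE] * 5 + stage2,
}
for name, trials in conditions.items():
    hist = pearce_hall(trials)
    _, attention = hist[9]                # state at the end of stage 1
    updates = ", ".join(f"{v:.2f}" for v, _ in hist[10:])
    print(f"{name}: attention entering stage 2 = {attention:.2f}; "
          f"severe-outcome predictions = {updates}")
```

Run as written, the reliably trained cue enters stage 2 with roughly half the attention of the uncertain cue, so its first prediction updates toward the severe outcome are correspondingly smaller — the model’s analogue of participants being slower to learn the new severe reaction.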
Related articles
- iThink therefore iAm (if iPaid attention) (brobrubel.wordpress.com)
The Human Hall Of Mirrors: Who Are You From Someone Else’s Perspective?
You likely see yourself very differently from the way others see you. A little self-awareness can prevent a lot of misunderstanding.
“I’ll be there at 2 p.m. sharp,” Kirsten assures me as we set up our next research meeting. I make note of it in my calendar—but I put it down as 3 p.m. It’s not that Kirsten is trying to fool me; she’s just deluded about her time-management skills. After a long history of meetings to which she shows up an hour late, I’ve realized I have to make allowances for her self-blinding optimism. I don’t have unique insight—any of her friends would make the same prediction. In the domain of punctuality, others know Kirsten better than she knows herself.
The difference between how you see yourself and how others see you is not just a matter of egocentrism. Like Kirsten, we all have blind spots. We change our self-conception when we see ourselves through others’ eyes. Part of the discrepancy arises because the outsider’s perspective affords information you yourself miss—like the fact that it looks like you’re scowling when you’re listening, or that you talk over other people.
There Is No Perfect Point of View
How do you cut through the fog and learn to see yourself—and others—clearly? Different perspectives provide different information on the self. To bring some order to all the things that can be known about you, it helps to divide them into four categories.
First, there are “bright spots” — things known by both you and others, like the fact that you’re politically conservative or talkative. Studies show that traits like extroversion, talkativeness, and dominance are easily observable both to the self and to others. If everyone thinks you’re a chatterbox, you probably are.
Second are “dark spots” — things known by neither you nor others. These could include deep unconscious motives that drive your behaviors, like the fact that your relentless ambition is driven by the need to prove wrong your parents’ assumption that you’d never amount to much. Third are “personal spots” — things known only by you, like your tendency to get anxious in crowds or your contempt for your coworkers. And finally, there are “blind spots” — things known only by others, which can include such factors as your level of hostility and defensiveness, your attractiveness, and your intelligence.
The most interesting are the latter two—personal spots and blind spots—since they involve discrepancies between how we see ourselves and how others see us.
Why You’re Less Transparent Than You Think
We’re not entirely deluded about ourselves. We have pretty unrestricted access, for instance, to what we like and believe; if you think you’re in favor of tighter regulation for car emissions or that Bon Iver is your favorite band right now, who am I to argue? Even if you don’t know the mysterious unconscious motives underlying what you like and do, you’re still the best source of information about your attitudes, beliefs, and preferences.
We often think others are aware of our anxiety or our darkest feelings, but research shows they’re actually poor judges of our emotions, intentions, and thoughts. Thomas Gilovich, a psychologist at Cornell, has found that numerous obstacles and psychological biases stand in the way of knowing how you’re seen by others. We overestimate the extent to which our internal states are detectable to others—a bias known as the “illusion of transparency.” We also overestimate the extent to which our behavior and appearance are noticed and evaluated by others—a bias known as the “spotlight effect.”
We’re good at judging our own self-esteem, optimism and pessimism, and anything to do with how we feel — but those internal states are largely invisible to observers. So, for instance, others may think you’re very calm when in fact you’re so anxious in large groups that your palms sweat and your heart rate soars.
Personal spots exist because others know how you behave, but they don’t know your intentions or feelings, explains Simine Vazire, director of the Personality and Self-Knowledge Lab at Washington University. “If you’re quiet at a party, people don’t know if it’s because you’re arrogant and you think you’re better than everyone else or because you’re shy and don’t know how to talk to people,” she says. “But you know, because you know your thoughts and feelings. So things like anxiety, optimism and pessimism, your tendency to daydream, and your general level of happiness — what’s going on inside of you, rather than things you do — those are things other people have a hard time knowing.”
Related articles
- The Illusion of Transparency (youarenotsosmart.com)
- Why Everyone Always Thinks They’re Right (mappingthestars.wordpress.com)
I Am Old But I Am Happy: Why Happiness And Emotional Stability Might Improve With Age
ScienceDaily (Oct. 28, 2010) — It’s a prediction often met with worry: In 20 years, there will be more Americans over 60 than under 15. Some fear that will mean an aging society with an increasing number of decrepit, impaired people and fewer youngsters to care for them while also keeping the country’s productivity going.
The concerns are valid, but a new Stanford study shows there’s a silver lining to the graying of our nation. As we grow older, we tend to become more emotionally stable. And that translates into longer, more productive lives that offer more benefits than problems, said Laura Carstensen, the study’s lead author.
“As people age, they’re more emotionally balanced and better able to solve highly emotional problems,” said Carstensen, a psychology professor and director of the Stanford Center on Longevity. “We may be seeing a larger group of people who can get along with a greater number of people. They care more and are more compassionate about problems, and that may lead to a more stable world.”
Between 1993 and 2005, Carstensen and her colleagues tracked about 180 Americans between the ages of 18 and 94. Over the years, some participants died and others aged out of the younger groups, so additional participants were included.
For one week every five years, the study participants carried pagers and were required to immediately respond to a series of questions whenever the devices buzzed. The periodic quizzes were intended to chart how happy, satisfied and comfortable they were at any given time.
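The pager procedure is a textbook example of experience sampling. As a purely illustrative sketch — the study’s actual schedule, waking-hours window, and item wordings are not given here, so the specifics below are invented — a week of randomized prompts might be generated like this:

```python
import random
from datetime import datetime, timedelta

def week_of_prompts(start, per_day=5, day_start=9, day_end=21):
    """Randomly timed 'pager buzzes': per_day prompts on each of 7 days,
    restricted to waking hours (day_start to day_end, whole hours)."""
    prompts = []
    for day in range(7):
        for _ in range(per_day):
            minute = random.randint(day_start * 60, day_end * 60 - 1)
            prompts.append(start + timedelta(days=day, minutes=minute))
    return sorted(prompts)

ITEMS = ["How happy do you feel right now?",
         "How satisfied are you at this moment?",
         "How comfortable do you feel right now?"]

random.seed(1)
for i, buzz in enumerate(week_of_prompts(datetime(2005, 6, 6))[:6]):
    print(buzz.strftime("%a %H:%M"), "->", ITEMS[i % len(ITEMS)])
```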
Carstensen’s study — which was published online in the journal Psychology and Aging — was coauthored by postdoctoral fellows Bulent Turan and Susanne Scheibe as well as Stanford doctoral students and researchers at Pennsylvania State, Northwestern, the University of Virginia and the University of California’s campuses in San Francisco and Los Angeles.
While previous research has established a correlation between aging and happiness, Carstensen’s study is the first to track the same people over a long period of time to examine how they changed.
The undertaking was an effort to answer questions asked over and over again by social scientists: Are seniors today who say they’re happy simply part of a socioeconomic era that predisposed them to good cheer? Or do most people — whether born and reared in boom times or busts — have it within themselves to reach their golden years with a smile? The answer has important implications for future aging societies.
“Our findings suggest that it doesn’t matter when you were born,” Carstensen said. “In general, people get happier as they get older.”
Over the years, the older subjects reported having fewer negative emotions and more positive ones compared with their younger days. But even with the good outweighing the bad, older people were inclined to report a mix of positive and negative emotions more often than younger test subjects.
“As people get older, they’re more aware of mortality,” Carstensen said. “So when they see or experience moments of wonderful things, that often comes with the realization that life is fragile and will come to an end. But that’s a good thing. It’s a signal of strong emotional health and balance.”
Carstensen (who is 56 and says she’s happier now than she was a few decades ago) attributes the change in older people to her theory of “socio-emotional selectivity” — a scientific way of saying that people invest in what’s most important to them when time is limited.
While teenagers and young adults experience more frustration, anxiety and disappointment over things like test scores, career goals and finding a soul mate, older people typically have made their peace with life’s accomplishments and failures. In other words, they have less ambiguity to stress about.
“This all suggests that as our society is aging, we will have a greater resource,” Carstensen said. “If people become more even-keeled as they age, older societies could be wiser and kinder societies.”
So what, then, do we make of the “grumpy old man” stereotype?
“Most of the grumpy old men out there are grumpy young men who grew old,” Carstensen said. “Aging isn’t going to turn someone grumpy into someone who’s happy-go-lucky. But most people will gradually feel better as they grow older.”
Related articles
- Be glad you’re getting older (holykaw.alltop.com)
- As we age, shifting priorities can have surprising effect on emotional health (southcoasttoday.com)