Source Credit: 7 Myths About the Brain
Separating Fact From Fiction
By Kendra Cherry, About.com Guide
The human brain is amazing and sometimes mysterious. While researchers are still uncovering the secrets of how the brain works, they have discovered plenty of information about what goes on inside your noggin. Unfortunately, there are still a lot of brain myths out there.
The following are just a few of the many myths about the brain.
Myth 1: You only use 10 percent of your brain.
You’ve probably heard this oft-cited bit of information several times, but constant repetition does not make it any more accurate. People often use this popular urban legend to imply that the mind is capable of much greater things, such as dramatically increased intelligence, psychic abilities, or even telekinesis. After all, if we can do all the things we do using only 10 percent of our brains, just imagine what we could accomplish if we used the remaining 90 percent.
Reality check: Research suggests that all areas of the brain perform some type of function. If the 10 percent myth were true, brain damage would be far less likely – after all, we would really only have to worry about that tiny 10 percent of our brains being injured. The fact is that damage to even a small area of the brain can result in profound consequences to both cognition and functioning. Brain imaging technologies have also demonstrated that the entire brain shows levels of activity, even during sleep.
“It turns out though, that we use virtually every part of the brain, and that [most of] the brain is active almost all the time. Let’s put it this way: the brain represents three percent of the body’s weight and uses 20 percent of the body’s energy.” – Neurologist Barry Gordon of Johns Hopkins School of Medicine, Scientific American
Myth 2: Brain damage is permanent.
The brain is fragile and can be damaged by injury, stroke, or disease. This damage can result in a range of consequences, from mild disruptions in cognitive abilities to complete impairment. Brain damage can be devastating, but is it always permanent?
Reality check: While we often tend to think of brain injuries as lasting, a person’s ability to recover from such damage depends upon the severity and the location of the injury. For example, a blow to the head during a football game might lead to a concussion. While this can be quite serious, most people are able to recover when given time to heal. A severe stroke, on the other hand, can result in dire consequences to the brain that can very well be permanent.
However, it is important to remember that the human brain has an impressive amount of plasticity. Even following a serious brain event, such as a stroke, the brain can often heal itself over time and form new connections.
“Even after more serious brain injury, such as stroke, research indicates that — especially with the help of therapy — the brain may be capable of developing new connections and ‘reroute’ function through healthy areas.” – BrainFacts.org
Myth 3: People are either “right-brained” or “left-brained.”
Have you ever heard someone describe themselves as either left-brained or right-brained? This stems from the popular notion that people are either dominated by their right or left brain hemispheres. According to this idea, people who are “right-brained” tend to be more creative and expressive, while those who are “left-brained” tend to be more analytical and logical.
Reality Check: While experts do recognize that there is lateralization of brain function (that is, certain types of tasks and thinking tend to be more associated with a particular region of the brain), no one is fully right-brained or left-brained. In fact, we tend to do better at tasks when the entire brain is utilized, even for things that are typically associated with a certain area of the brain.
“No matter how lateralized the brain can get, though, the two sides still work together. The pop psychology notion of a left brain and a right brain doesn’t capture their intimate working relationship. The left hemisphere specializes in picking out the sounds that form words and working out the syntax of the words, for example, but it does not have a monopoly on language processing. The right hemisphere is actually more sensitive to the emotional features of language, tuning in to the slow rhythms of speech that carry intonation and stress.” – Carl Zimmer, Discover
Myth 4: Humans have the biggest brains.
The human brain is quite large in proportion to body size, but another common misconception is that humans have the largest brains of any organism. How big is the human brain? How does it compare to other species?
Reality Check: The average adult brain weighs about three pounds and measures up to about 15 centimeters in length. The largest animal brain belongs to the sperm whale, weighing in at a whopping 18 pounds! Another large-brained animal is the elephant, with an average brain size of around 11 pounds.
But what about relative brain size in proportion to body size? Humans must certainly have the largest brains in comparison to their body size, right? Once again, this notion is a myth. Surprisingly, one of the animals with the largest brain-to-body-size ratios is the shrew, whose brain makes up about 10 percent of its body mass.
“Our primate lineage had a head start in evolving large brains, however, because most primates have brains that are larger than expected for their body size. The Encephalization Quotient is a measure of brain size relative to body size. The cat has an EQ of about 1, which is what is expected for its body size, while chimps have an EQ of 2.5 and humans nearly 7.5. Dolphins, no slouches when it comes to cognitive powers and complex social groups, have an EQ of more than 5, but rats and rabbits are way down on the scale at below 0.4.” – Michael Balter, Slate.com
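To make the EQ idea concrete, here is a minimal sketch of how an encephalization quotient can be computed. It assumes Jerison's classic mammalian scaling rule, expected brain mass ≈ 0.12 × (body mass)^(2/3); the constant, the exponent, and the round-number species masses below are common textbook-style values I've supplied for illustration, not figures from Balter's article.

```python
# Minimal sketch: encephalization quotient (EQ) via Jerison's classic
# mammalian formula, expected_brain = 0.12 * body_mass ** (2/3).
# Masses are in grams; the constant and the example masses are
# illustrative textbook-style values, not data from the article.

def encephalization_quotient(brain_g: float, body_g: float) -> float:
    """Ratio of actual brain mass to the mass expected for body size."""
    expected_brain_g = 0.12 * body_g ** (2 / 3)
    return brain_g / expected_brain_g

# Approximate round-number masses, for demonstration only.
species = {
    "human": (1350, 65_000),       # ~1.35 kg brain, ~65 kg body
    "chimpanzee": (400, 45_000),
    "cat": (30, 4_000),
}

for name, (brain_g, body_g) in species.items():
    print(f"{name}: EQ ~ {encephalization_quotient(brain_g, body_g):.1f}")
```

Run with these inputs, the sketch lands close to the quoted values: an EQ near 1 for the cat, around 2.5 for the chimpanzee, and about 7 for humans.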
Myth 5: We are born with all the brain cells we ever have, and once they die, these cells are gone forever.
Traditional wisdom has long suggested that adults only have so many brain cells and that we never form new ones. Once these cells are lost, are they really gone for good?
Reality Check: In recent years, experts have discovered evidence that the human adult brain does indeed form new cells throughout life, even during old age. The process of forming new brain cells is known as neurogenesis and researchers have found that it happens in at least one important region of the brain called the hippocampus.
“Above-ground nuclear bomb tests carried out more than 50 years ago resulted in elevated atmospheric levels of the radioactive carbon-14 isotope (14C), which steadily declined over time. In a study published yesterday (June 7) in Cell, researchers used measurements of 14C concentration in the DNA of brain cells from deceased patients to determine the neurons’ age, and demonstrated that there is substantial adult neurogenesis in the human hippocampus.” – Dan Cossins, The Scientist
Myth 6: Drinking alcohol kills brain cells.
Partly related to the myth that we never grow new neurons is the idea that drinking alcohol can lead to cell death in the brain. Drink too much or too often, some people might warn, and you’ll lose precious brain cells that you can never get back. We’ve already learned that adults do indeed get new brain cells throughout life, but could drinking alcohol really kill brain cells?
Reality Check: While excessive or chronic alcohol abuse can certainly have dire health consequences, experts do not believe that drinking causes neurons to die. In fact, research has shown that even binge drinking doesn’t actually kill neurons.
“Scientific medical research has actually demonstrated that the moderate consumption of alcohol is associated with better cognitive (thinking and reasoning) skills and memory than is abstaining from alcohol. Moderate drinking doesn’t kill brain cells but helps the brain function better into old age. Studies around the world involving many thousands of people report this finding.” – PsychCentral.com
Myth 7: There are 100 billion neurons in the human brain.
If you’ve ever thumbed through a psychology or neuroscience textbook, you have probably read that the human brain contains approximately 100 billion neurons. How accurate is this oft-repeated figure? Just how many neurons are in the brain?
Reality Check: The estimate of 100 billion neurons has been repeated so often and so long that no one is completely sure where it originated. In 2009, however, one researcher decided to actually count neurons in adult brains and found that the number was just a bit off the mark. Based upon this research, it appears that the human brain contains closer to 86 billion neurons. So while the often-cited number is roughly 14 billion too high, 86 billion is still nothing to sneeze at.
“We found that on average the human brain has 86bn neurons. And not one [of the brains] that we looked at so far has the 100bn. Even though it may sound like a small difference the 14bn neurons amount to pretty much the number of neurons that a baboon brain has or almost half the number of neurons in the gorilla brain. So that’s a pretty large difference actually.” – Dr. Suzana Herculano-Houzel
More Psychology Facts and Myths:
Source Credit: What’s Love Got To Do With It?
By Andrea F. Polard, Psy.D. on September 5, 2013 – 11:56am Psychology Today
It really should not have taken academic psychology so long to determine the key factors in happiness, especially because the results weren’t that surprising.* Money, beauty, and success are not quintessential, while compassionate giving of our money, appreciation of our actual looks, and the pursuit of personally meaningful goals are. Waiting for our parents to finally love us perpetuates feelings of being a victim, while letting go of the past, forgiveness, and gratitude propagate joy in the present. This is old news for psych-savvy people such as you and me, right?
But here is another piece of the puzzle, based on the findings of the truly long, still ongoing longitudinal Harvard Grant Study, which began in 1938 and has followed 268 male students for some 75 years. The researchers falsely predicted that the students with masculine body types would become most successful. As it turns out, neither that, nor their socioeconomic circumstances, nor the students’ IQ correlated most strongly with success. It came as a surprise that something much more mundane mattered the most, something every Beatles fan and good parent has suspected all along: love. Yes, it is true and supported by data now: “All we need is love.” Those men who had a warm mother or good sibling relationships earned a significantly higher income than their less fortunate counterparts.
Now back to happiness. The director of the study from 1966 to 2004, George E. Vaillant, looked at eight further accomplishments that went beyond mere monetary success: four items pertaining to mental and physical health and four to social supports and relationships. They all correlated with love, that is, with a loving childhood, one’s empathic capacity, and warm relationships. As Vaillant put it:
“In short it was a history of warm intimate relationships – and the ability to foster them in maturity – that predicted flourishing…”
“This is not good news,” you may say if you were heavily unloved in your past. But there is another important lesson to be learned from this grand Grant study: people can change. (So there, pessimists of the world!) In fact, it is never too late to learn how to give and receive love. The study shows that those students who were not loved in childhood but learned to give and receive love later on in their lives could overcome their disadvantage.
This is where I can relate. I had to overcome a mountain of problems, cross the desert without a drop of hope, face and embrace my fears and come out of my turtle shell, step by step, kiss by kiss, and frog by frog. It was tough, but I made it. In the end, I dared to be with a man who had something to give and who wasn’t afraid of my love either. By now, our three lucky beloved tadpoles are slowly growing into frogs themselves.
Why, though, does love heal almost all wounds and drive us right into happiness? I think mostly for two reasons, something I hope to see supported by data some day. First, being loved reduces our fear of the uncertainty in life. Scarcity, loss, and pain will happen, but when we are loved, all those difficulties seem surmountable. In fact, with the right support, difficulties can be viewed as opportunities for growth instead of as terrible monsters lurking in the dark. Second, loving others focuses our mind on something greater than our little egos. Love brings out the best in us. Who has ever been known to rise to the occasion and act nobly while thinking only of themselves? We become creative inventors, noble knights, and heroines when we dare to care for someone other than ourselves.
So love is it. What’s left to do is nothing short of engaging in a life-long learning process about how to form and maintain relationships. And don’t forget that love comes in many colors. You might love a partner, gay or straight, your kids, your neighbors, your community, your dogs or your goldfish. Just love. And if you do not quite know how, there are ways to learn it still, step by step, kiss by kiss, and breath by breath.
We live in the most exciting and unsettling period in the history of psychiatry since Freud started talking about sex in public.
On the one hand, the American Psychiatric Association has introduced the fifth iteration of the psychiatric diagnostic manual, DSM-V, representing the current best effort of the brightest clinical minds in psychiatry to categorize the enormously complex pattern of human emotional, cognitive, and behavioral problems. On the other hand, in new and profound ways, neuroscience and genetics research in psychiatry are yielding insights that challenge the traditional diagnostic schema that have long been at the core of the field.
“Our current diagnostic system, DSM-V, represents a very reasonable attempt to classify patients by their symptoms. Symptoms are an extremely important part of all medical diagnoses, but imagine how limited we would be if we categorized all forms of pneumonia as ‘coughing disease,’” commented Dr. John Krystal, Editor of Biological Psychiatry.
A paper by Sabin Khadka and colleagues that appears in the September 15th issue of Biological Psychiatry advances the discussion of one of these roiling psychiatric diagnostic dilemmas.
One of the core hypotheses is that schizophrenia and bipolar disorder are distinct scientific entities. Emil Kraepelin, credited by many as the father of modern scientific psychiatry, was the first to draw a distinction between dementia praecox (schizophrenia) and manic depression (bipolar disorder) in the late 19th century based on the behavioral profiles of these syndromes. Yet, patients within each diagnosis can have a wide variation of symptoms, some symptoms appear to be in common across these diagnoses, and antipsychotic medications used to treat schizophrenia are very commonly prescribed to patients with bipolar disorder.
But at the level of brain circuit function, do schizophrenia and bipolar disorder differ primarily by degree, or are there clear categorical differences? To answer this question, researchers from a large collaborative project called BSNIP (the Bipolar-Schizophrenia Network on Intermediate Phenotypes) looked at a large sample of patients diagnosed with schizophrenia or bipolar disorder, their healthy relatives, and healthy people without a family history of psychiatric disorder.
They used a specialized analysis technique to evaluate the data from their multi-site study, which revealed abnormalities within seven different brain networks. Generally speaking, they found that schizophrenia and bipolar disorder showed similar disturbances in cortical circuit function. When differences emerged between these two disorders, it was usually because schizophrenia appeared to be a more severe disease. In other words, individuals with schizophrenia had abnormalities that were larger or affected more brain regions. Their healthy relatives showed subtle alterations that fell between the healthy comparison group and the patient groups.
The authors highlight the possibility that there is a continuous spectrum of circuit dysfunction, spanning from individuals without any familial association with schizophrenia or bipolar to patients carrying these diagnoses. “These findings might serve as useful biological markers of psychotic illnesses in general,” said Khadka.
Krystal agreed, adding, “It is evident that neither our genomes nor our brains have read DSM-V in that there are links across disorders that we had not previously imagined. These links suggest that new ways of organizing patients will emerge once we understand both the genetics and neural circuitry of psychiatric disorders sufficiently.”
We know a lot about the links between a pregnant mother’s health, behavior, and moods and her baby’s cognitive and psychological development once it is born. But how does pregnancy change a mother’s brain? “Pregnancy is a critical period for central nervous system development in mothers,” says psychologist Laura M. Glynn of Chapman University. “Yet we know virtually nothing about it.”
Glynn and her colleague Curt A. Sandman, of the University of California, Irvine, are doing something about that. Their review of the literature in Current Directions in Psychological Science, a journal published by the Association for Psychological Science, discusses the theories and findings that are starting to fill what Glynn calls “a significant gap in our understanding of this critical stage of most women’s lives.”
At no other time in a woman’s life does she experience such massive hormonal fluctuations as during pregnancy. Research suggests that the reproductive hormones may ready a woman’s brain for the demands of motherhood — helping her become less rattled by stress and more attuned to her baby’s needs. Although the hypothesis remains untested, Glynn surmises this might be why moms wake up when the baby stirs while dads snore on. Other studies confirm the truth in a common complaint of pregnant women: “Mommy Brain,” or impaired memory before and after birth. “There may be a cost” of these reproduction-related cognitive and emotional changes, says Glynn, “but the benefit is a more sensitive, effective mother.”
The article reviews research that refines earlier findings on the effects of the prenatal environment on the baby. For instance, evidence is accumulating to show that it’s not prenatal adversity on its own — say, maternal malnourishment or depression — that presents risks for a baby. Congruity between life in utero and life on the outside may matter more. A fetus whose mother is malnourished adapts to scarcity and will cope better with a dearth of food once it’s born — but could become obese if it eats normally. Timing is critical too: maternal anxiety early in gestation takes a toll on the baby’s cognitive development; the same high levels of stress hormones late in pregnancy enhance it.
Just as Mom permanently affects her fetus, new science suggests that the fetus does the same for Mom. Fetal movement, even when the mother is unaware of it, raises her heart rate and her skin conductivity, signals of emotion — and perhaps of pre-natal preparation for mother-child bonding. Fetal cells pass through the placenta into the mother’s bloodstream. “It’s exciting to think about whether those cells are attracted to certain regions in the brain” that may be involved in optimizing maternal behavior, says Glynn.
Glynn cautions that most research on the maternal brain has been conducted with rodents, whose pregnancies differ enormously from women’s; more research on human mothers is needed. But she is optimistic that a more comprehensive picture of the persisting brain changes wrought by pregnancy will yield interventions to help at-risk mothers do better by their babies and themselves.
Source Credit: Chris Woolston, Special to the Los Angeles Times April 28, 2012
Mitt Romney on the stump, singles at the bar, car salesmen on the lot: All sorts of people are practicing the art of persuasion, with varying degrees of success.
Humans have been testing their own trial-and-error persuasion techniques forever, says psychologist Robert Cialdini. Now, for better or worse, the professionals are moving in. Or, as he puts it, “the art of persuasion has turned into a science.”
Through experiments and real-world observations, researchers have unlocked some of the mysteries of persuasion: what works, what doesn’t work and why so many of us end up with candidates, dates and cars that we never really wanted.
People who learn these secrets can keep themselves from getting duped, Cialdini says. With practice, they can even reach the ultimate goal: getting others to do their bidding.
Strategic persuasion can pay huge dividends, adds Steve Martin (not the guy you’re thinking of, but Cialdini’s colleague and the British director of the consulting company Cialdini founded, Influenceatwork.com). For example, the British government recently asked him for advice on encouraging delinquent taxpayers to pay up. Martin suggested a simple tactic: instead of threatening people with fines, the government should send out a letter saying that the great majority of Brits pay their taxes on time.
That kind of peer pressure works. “So far, they’ve collected about $1 billion more than they would have otherwise,” Martin says.
Cialdini’s own research has identified six “weapons of persuasion” that can bring people to your side. Read and learn:
A rare find: Job seekers should do more than make the case that they’re right for a job; according to Cialdini, they should present themselves as a unique fit. As he explains, nobody wants to miss out on a scarce opportunity. The allure of scarcity explains why people line up at Best Buy at 4:30 a.m. on Black Friday and why inside info is valued more than common knowledge.
Count on payback: “Reciprocity is a part of every society,” Cialdini says. A classic experiment from the 1970s found that people bought twice as many raffle tickets from a stranger if he first gave them a can of Coke — proof that even tiny favors can work to your advantage. Likewise, your buddy is more likely to help you move that couch if you’ve ever given him a ride to the airport.
Be likable: A tough assignment for some, that’s for sure. But Cialdini’s research has found that a little easygoing pleasantness can be just as persuasive as talent or actual ability. Perhaps unfairly, looks count too: A study of Canadian elections, for example, found that attractive candidates received more votes than their less-blessed opponents, even though voters claimed they didn’t care about appearances.
Society’s seal of approval: Your friend is more likely to try something — recycle, eat at the new tapas place, watch “Glee” — if you mention that lots of other people are doing it. That’s why Martin’s letter to British taxpayers was a billion-dollar success. People may not want to follow the herd, Cialdini adds, but they do assume that other people make choices for a reason.
Play the consistency card: People will go to great lengths to avoid seeming flaky or wishy-washy. As Cialdini explains in his book, car salesmen exploit this trait by making fantastic “lowball” offers to potential customers. Once a customer decides to buy a car, he’s unlikely to want to flake out on the deal even if the price mysteriously balloons — Oops! There was a mistake! — before he gets the keys. Or, for a less slimy example, you’re more likely to get that raise or a promotion if you remind your boss that she has a long history of treating her employees well. (Surely she wouldn’t want to change her tune now.)
Speak from authority: Your suggestions will go a lot further if people think you’re pulling them from somewhere other than thin air. Martin has an example: In a recent study, a real estate company significantly increased home sales when the receptionist took a moment to inform potential customers of each agent’s credentials and experience. “The statements were true,” Martin says, “they didn’t cost anything — and they worked.”
A computerized self-help intervention may help adolescents who suffer from depression. The specialized computer therapy delivers much the same benefits as one-to-one therapy with a clinician, according to a study published in the BMJ.
Depression is common in adolescents, but many are reluctant to seek professional help. So researchers from the University of Auckland, New Zealand, set out to assess whether an innovative computerized cognitive behavioral therapy intervention called SPARX could reduce depressive symptoms as much as usual care can.
SPARX is an interactive 3D fantasy game where a single user undertakes a series of challenges to restore balance in a virtual world dominated by GNATs (Gloomy Negative Automatic Thoughts). It contains seven modules designed to be completed over a four to seven week period. Usual care mostly involved face-to-face counseling by trained clinicians.
The research team carried out a randomized controlled trial in 24 primary healthcare sites across New Zealand. All 187 adolescents were between the ages of 12 and 19, were seeking help for mild to moderate depression and were deemed in need of treatment by primary healthcare clinicians. One group underwent face-to-face treatment as usual and the other took part in SPARX.
Participants were followed up for three months and results were based on several widely used mental health and quality of life scales.
Results showed that SPARX was as effective as usual care in reducing symptoms of depression and anxiety by at least a third. In addition, significantly more people recovered completely in the SPARX group: 31/69 (44%) of those who completed at least four homework modules, compared with 19/83 (26%) in usual care.
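For readers who like to check the arithmetic, here is a small sketch that recomputes those remission proportions and runs a plain two-proportion z-test on them. The test choice is mine for illustration; it is not the statistical model reported in the BMJ paper itself.

```python
# Quick check of the remission figures quoted above, using a pooled
# two-proportion z-test. Illustrative re-analysis only.
from math import erf, sqrt

def two_proportion_ztest(x1: int, n1: int, x2: int, n2: int):
    """Compare two proportions x1/n1 and x2/n2 with a pooled z-test."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p1, p2, z, p_value

# 31/69 recovered with SPARX vs 19/83 with usual care, as quoted above.
# (19/83 works out to about 23%; the 26% in the text presumably
# reflects the paper's exact denominators.)
p1, p2, z, p = two_proportion_ztest(31, 69, 19, 83)
print(f"SPARX: {p1:.0%}  usual care: {p2:.0%}  z = {z:.2f}  p = {p:.3f}")
```

With these numbers the difference comes out clearly significant (z ≈ 2.9, p < 0.01), consistent with the authors’ claim.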
When questioned on satisfaction, 76/80 (95%) of SPARX users who replied said they believed it would appeal to other teenagers with 64/80 (81%) recommending it to friends. Satisfaction was, however, equally high in the group that had treatment as usual.
The authors conclude that SPARX is an “effective resource for help seeking adolescents with depression at primary healthcare sites. Use of the program resulted in a clinically significant reduction in depression, anxiety, and hopelessness and an improvement in quality of life.” They suggest that it is a potential alternative to usual care and could be used to address unmet demand for treatment. It may also be a cheaper alternative to usual care and be potentially more easily accessible to young people with depression in primary healthcare settings.
Soccer fans’ testosterone and cortisol levels go up when watching a game, but don’t further increase after a victory, according to a study published in the open access journal PLoS ONE.
The study was conducted with 50 Spanish soccer fans watching the finals between Spain and the Netherlands in the 2010 World Cup. The researchers, led by Leander van der Meij of the University of Valencia in Spain and VU University Amsterdam in the Netherlands, measured testosterone and cortisol levels for fans of different ages, genders, and degree of interest in the game. They found that the increase in testosterone was independent of all these factors, but the increase in cortisol level was more pronounced for dedicated, young, male fans.
The authors write that the testosterone effect is in agreement with the “challenge hypothesis,” as testosterone levels increased to prepare for the game, and the cortisol effect is consistent with the “social self-preservation theory,” as higher cortisol secretion among young and more dedicated soccer fans suggests that they perceived a particularly strong threat to their own social esteem if their team didn’t win.
The Good News about Borderline Personality Disorder
Date: 06 Feb 2012
Guest Blog:
Amanda Smith, Founder of Hope For BPD
Amanda was diagnosed with borderline personality disorder in 2004 and has since been on a path of recovery. In addition to overseeing the programs of Hope for BPD, she has served as Executive Director of a NAMI affiliate in Florida and currently serves on a local NAMI board of directors in Texas.
Harvard-based researcher Mary Zanarini, PhD has called borderline personality disorder (BPD) the “good prognosis diagnosis” and there are many reasons to be hopeful about the long-term outlook.
Borderline personality disorder—most frequently characterized by rapidly-changing mood swings, unstable relationships, identity disturbance, and chronic feelings of emptiness—is a mental illness with a lifetime prevalence rate of almost 6% among the general population.
Time and again, research has shown that individuals who have been diagnosed with borderline personality disorder can feel better about themselves and their world, are able to work towards academic and vocational goals, sustain healthy relationships, and experience a sense of purpose or meaning in their lives. We also know more now about the neuroplasticity of the brain and understand that our brains continue to change and adapt so that we can learn new behaviors and process information in healthier ways.
There are also many things that increase the likelihood of recovery. These include:
• taking part in an evidence-based treatment that was created specifically to treat BPD such as dialectical behavior therapy (DBT) and mentalization-based treatment (MBT)
• reading books and articles that actively promote recovery
• getting steady support and encouragement from family, friends, church leaders, and other people who have been diagnosed with BPD
• making a commitment to self-care that includes getting enough sleep, eating balanced meals, exercising, and treating physical illnesses
• being brave and asking for help before things become a crisis or an emergency
Family members who are in need of education and support can connect with organizations such as NEA-BPD and take part in their free Family Connections classes or NAMI’s Family-to-Family program.
Remember, the vast majority of people with BPD get better and go on to create lives worth living. If you’re someone who has been diagnosed with the disorder, that means you!
For more information about BPD, please visit Hope for BPD.
Amanda L. Smith
Treatment Consultation for Borderline Personality Disorder and Self-Injury http://www.hopeforbpd.com
Source and authorship credit: Everything You Thought You Knew About Learning Is Wrong, Psychology Today, http://www.psychologytoday.com/
Everything You Thought You Knew About Learning Is Wrong
How, and how NOT, to learn anything
Published on January 28, 2012 by Garth Sundem in Brain Candy
Learning through osmosis didn’t make the strategies list
Taking notes during class? Topic-focused study? A consistent learning environment? All are exactly opposite the best strategies for learning. Really. I recently had the good fortune to interview Robert Bjork, director of the UCLA Learning and Forgetting Lab, distinguished professor of psychology, and massively renowned expert on packing things in your brain in a way that keeps them from leaking out. And it turns out that everything I thought I knew about learning is wrong. Here’s what he said.
First, think about how you attack a pile of study material.
“People tend to try to learn in blocks,” says Bjork, “mastering one thing before moving on to the next.” But instead he recommends interleaving, a strategy in which, for example, instead of spending an hour working on your tennis serve, you mix in a range of skills like backhands, volleys, overhead smashes, and footwork. “This creates a sense of difficulty,” says Bjork, “and people tend not to notice the immediate effects of learning.”
Instead of making an appreciable leap forward with your serving ability after a session of focused practice, interleaving forces you to make nearly imperceptible steps forward with many skills.
But over time, the sum of these small steps is much greater than the sum of the leaps you would have taken if you’d spent the same amount of time mastering each skill in its turn.
Bjork explains that successful interleaving allows you to “seat” each skill among the others: “If information is studied so that it can be interpreted in relation to other things in memory, learning is much more powerful,” he says.
There’s one caveat: Make sure the mini skills you interleave are related in some higher-order way. If you’re trying to learn tennis, you’d want to interleave serves, backhands, volleys, smashes, and footwork—not serves, synchronized swimming, European capitals, and programming in Java.
Similarly, studying in only one location is great as long as you’ll only be required to recall the information in the same location. If you want information to be accessible outside your dorm room, or office, or nook on the second floor of the library, Bjork recommends varying your study location.
And again, these tips generalize. Interleaving and varying your study location will help whether you’re mastering math skills, learning French, or trying to become a better ballroom dancer.
So too will a somewhat related phenomenon, the spacing effect, first described by Hermann Ebbinghaus in 1885. “If you study and then you wait, tests show that the longer you wait, the more you will have forgotten,” says Bjork. That’s obvious—over time, you forget. But here’s the cool part:
If you study, wait, and then study again, the longer the wait, the more you’ll have learned after this second study session.
Bjork explains it this way: “When we access things from our memory, we do more than reveal it’s there. It’s not like a playback. What we retrieve becomes more retrievable in the future. Provided the retrieval succeeds, the more difficult and involved the retrieval, the more beneficial it is.” Note that there’s a trick implied by “provided the retrieval succeeds”: You should space your study sessions so that the information you learned in the first session remains just barely retrievable. Then, the more you have to work to pull it from the soup of your mind, the more this second study session will reinforce your learning. If you study again too soon, it’s too easy.
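Bjork’s “just barely retrievable” principle is easy to sketch as an algorithm, even though he doesn’t prescribe one here. Below is a toy expanding-interval scheduler of my own devising, in the spirit of spaced-repetition systems like SuperMemo or Anki rather than anything from Bjork’s lab: each successful recall stretches the next review interval, while a failed recall shrinks it so the item stays near the edge of forgetting.

```python
# Toy expanding-interval review scheduler illustrating the spacing
# effect described above. An illustrative sketch, not an algorithm
# from Bjork's lab; real systems (SuperMemo, Anki) are more refined.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    interval_days: float = 1.0  # wait this long before the next review

def review(item: Item, recalled: bool) -> Item:
    if recalled:
        # Successful, effortful retrieval: stretch the next interval so
        # the memory is once again "just barely retrievable".
        item.interval_days *= 2.5   # growth factor chosen arbitrarily
    else:
        # Failed retrieval: the wait was too long; shrink the interval.
        item.interval_days = max(1.0, item.interval_days * 0.5)
    return item

card = Item("la bibliothèque = the library")
for recalled in (True, True, False, True):
    card = review(card, recalled)
    print(f"next review of {card.name!r} in {card.interval_days:.1f} days")
```

The exact numbers don’t matter; the point is the shape of the policy: wait as long as retrieval still succeeds, because the harder the successful recall, the more it reinforces the memory.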
Along these lines, Bjork also recommends taking notes just after class, rather than during—forcing yourself to recall a lecture’s information is more effective than simply copying it from a blackboard. “Get out of court stenographer mode,” says Bjork. You have to work for it.
The more you work, the more you learn, and the more you learn, the more awesome you can become.
“Forget about forgetting,” says Robert Bjork.
“People tend to think that learning is building up something in your memory and that forgetting is losing the things you built.
But in some respects the opposite is true.” See, once you learn something, you never actually forget it. Do you remember your childhood best friend’s phone number? No? Well, Dr. Bjork showed that if you were reminded, you would relearn it much more quickly and retain it more strongly than if you were asked to memorize a fresh seven-digit number. So this old phone number is not forgotten—it lives somewhere in you—only, recall can be a bit tricky.
And while we count forgetting as the sworn enemy of learning, in some ways that’s wrong, too. Bjork showed that the two live in a kind of symbiosis in which forgetting actually aids recall.
“Because humans have unlimited storage capacity, having total recall would be a mess,” says Bjork. “Imagine you remembered all the phone numbers of all the houses you had ever lived in. When someone asks you your current phone number, you would have to sort it from this long list.” Instead, we forget the old phone numbers, or at least bury them far beneath the ease of recall we gift to our current number. What you thought were sworn enemies are more like distant collaborators.
* Excerpted from Brain Trust: 93 Top Scientists Dish the Lab-Tested Secrets of Surfing, Dating, Dieting, Gambling, Growing Man-Eating Plants and More (Three Rivers Press, March 2012)
@garthsundem
Garth Sundem is the bestselling author of Brain Candy, Geek Logik, and The Geeks’ Guide to World Domination.
The Psychology Behind Facebook
A new study from Boston University has looked at why people use Facebook. But not in the conventional ‘to keep in touch with friends’ or ‘to share photos’ sense. Oh no, this is FAR more interesting.
The study looks at human needs (think Maslow) and attempts to explain where Facebook fits within that context. The authors’ proposition is that Facebook (and other social networks) meets two primary human needs. The first is the need to belong to a sociodemographic group of like-minded people (linked to self-esteem and self-worth). Given this ‘need to belong’, it is hypothesised that there are differences in the way people use and share on Facebook according to cultural factors (individualistic v collectivist cultures). The thing is, some studies have suggested that being active on Facebook may not improve self-esteem, so we may be kidding ourselves if that’s (partly) why we use it!
The second need is the need for self-presentation. Further studies suggest that the person people portray on Facebook IS the real person, not an idealised version. BUT, it’s a person as seen through a socially-desirable filter. In other words, we present ourselves as highly sociable, lovable and popular even if we sit in our bedrooms in the dark playing World of Warcraft ten hours a day. There’s an aspirational element to our online selves. And hey, for me that’s certainly true – I’m a miserable sod in real life!
It’s a fascinating topic area, an understanding of which could really help marketers. Click the Source link below to read more about this study and lots of associated material. But in the meantime, stop showing off on Facebook and start just being yourself :o)
I’m a Clinical Psychologist and have a private practice and consultancy in Brisbane Australia. I have 24 years experience in child, adult and family clinical psychology. I have a wonderful wife and three kids.
I like researching issues of the brain & mind, reading and seeking out new books and resources for myself and my clients. I thought that others might be interested in some of what I have found also, hence this blog…