Peter H Brown Clinical Psychologist

Psychology News & Resources

Teaching To Students’ “Learning Styles”: A Myth Busted?

SOURCE CREDIT: Is Teaching to a Student’s “Learning Style” a Bogus Idea? Many researchers have suggested that differences in students’ learning styles may be as important as ability, but empirical evidence is thin. By Sophie Guterl, Scientific American, September 20, 2013

ARE THERE INDIVIDUAL LEARNING STYLES? Students are adamant they learn best visually, or by hearing a lesson, or by reading, and so forth. And while some educators advocate teaching methods that take advantage of differences in the way students learn, some psychologists take issue with the idea that learning style makes any significant difference in the classroom. Image: Alexander Iwan/Flickr

Ken Gibson was an advanced reader in elementary school and easily retained much of what he read. But when the teacher would stand him up in front of the class to read a report out loud, he floundered. His classmates, he noticed, also had their inconsistencies. Some relished oral presentations but took forever to read a passage on their own; others had a hard time following lectures. Gibson now explains these discrepancies as “learning styles” that differ from one student to the next. He founded a company, LearningRx, on the premise that these styles make a difference in how students learn.

The idea that learning styles vary among students has taken off in recent years. Many teachers, parents and students are adamant that they learn best visually or by hearing a lesson or by reading, and so forth. And some educators have advocated teaching methods that take advantage of differences in the way students learn. But some psychologists take issue with the idea that learning style makes any significant difference in the classroom.

There is no shortage of ideas in the professional literature. David Kolb of Case Western Reserve University posits that personality divides learners into categories based on how actively or observationally they learn and whether they thrive on abstract concepts or concrete ones. Another conjecture holds that sequential learners understand information best when it is presented one step at a time whereas holistic learners benefit more from seeing the big picture. Psychologists have published at least 71 different hypotheses on learning styles.

Frank Coffield, professor of education at the University of London, set out to find commonalities among the many disparate ideas about learning style, using a sample of 13 models. The review, published in 2004, found that only three tests for learning styles met the team’s criteria for both validity and reliability, meaning that the tests both measured what they intended to measure and yielded consistent results. Among the many competing ideas, Coffield and his colleagues found no sign pointing to an overarching model of learning styles.

In 2002 Gibson, after a brief career as a pediatric optometrist, started LearningRx, a nontraditional tutoring organization, based on the idea that different people rely on the particular cognitive skills that are their strongest. For instance, visual learners understand lessons best when they are presented via images or a slide show; auditory learners benefit more from lectures; kinesthetic learners prefer something concrete, such as building a diorama. “We have a natural tendency to use the skills that are strongest,” Gibson says. “That becomes our learning style.”

LearningRx trainers use cognitive skill assessments similar to IQ tests to identify a student’s areas of cognitive strengths and weaknesses—some people might be strong at memorizing written words or weak at doing mathematical computations in their heads. Then they administer “brain training” exercises designed to improve students’ weakest skills. Such exercises might involve a trainer asking a student to quickly answer a series of math problems in his head.


Daniel Willingham, a professor of cognitive psychology at the University of Virginia and an outspoken skeptic of learning styles, argues that Gibson and other proponents are mistaken to equate cognitive strengths with learning styles. The two, Willingham says, are different: whereas cognitive ability clearly affects the ability to learn, an individual’s style doesn’t. “You can have two basketball players, for example, with a different style. One is very conservative whereas the other is a real risk-taker and likes to take crazy shots and so forth, but they might be equivalent in ability.”

As Willingham points out, the idea that ability affects performance in the classroom is not particularly surprising. The more interesting question is whether learning styles, as opposed to abilities, make a difference in the classroom. Would educators be more effective if they identified their students’ individual styles and catered their lessons to them?

The premise should be testable, Willingham says. “The prediction is really straightforward: If you appeal to a person’s style versus going against his preferred style, that should make a difference for learning outcomes,” he says.

Harold Pashler of the University of California, San Diego, and his colleagues searched the research literature for exactly this kind of empirical evidence. They couldn’t find any. One study they reviewed compared participants’ scores on the Verbalizer–Visualizer Questionnaire, a fifteen-item true-or-false survey evaluating whether someone prefers verbal or visual information, with their scores on memory tests after words were presented either as pictures or read aloud. On average, participants performed better on the free-recall test when they were shown images, regardless of their stated preferences.

Some studies claimed to have demonstrated the effectiveness of teaching to learning styles, although they had small sample sizes, selectively reported data or were methodologically flawed. Those that were methodologically sound found no relationship between learning styles and performance on assessments. Willingham and Pashler believe that learning styles is a myth perpetuated merely by sloppy research and confirmation bias.

Despite the lack of empirical evidence for learning styles, Gibson continues to think of ability and preference as being one and the same. Trainers at LearningRx ask their clients to describe their weaknesses, then measure their cognitive abilities using the Woodcock–Johnson Test. “Just by someone telling us what’s easy and hard for them, we can pretty well know where the deficiencies are,” he says. “Eighty-five to 90 percent of the time the symptoms and the test results are right on.”

When teachers wonder how to present a lesson to kids with a range of abilities, they may not find the answer in established learning style approaches. Instead, Willingham suggests keeping it simple. “It’s the material, not the differences among the students, that ought to be the determinant of how the teacher is going to present a lesson,” he says. For example, if the goal is to teach students the geography of South America, the most effective way to do so across the board would be by looking at a map instead of verbally describing the shape and relative location of each country. “If there’s one terrific way that captures a concept for almost everybody, then you’re done.”


September 20, 2013 Posted by | brain, Children, Cognition, Education, research | 2 Comments

Bad Habits Broken Here! 4 Steps To Breaking The ‘Habit Loop’

SOURCE AND AUTHOR CREDIT: Put Your Habits to Work for You: Your worst habits can become your best friends. Published on June 26, 2012 by Susan Krauss Whitbourne, Ph.D. in Fulfillment at Any Age at Psychology Today

Victimized by your bad habits? Believe that you’re stuck with them for life? According to Charles Duhigg, a well-respected author and New York Times columnist, you may want to hone rather than scrap your well-polished sequences of behavior. Habits can be put to either good or bad use, depending on the situation. At their worst, they cause destructive addictions and dependence. At their best, habits allow you to boost your efficiency and effectiveness.

The term “habit” has acquired a bad reputation because it is associated either with addiction or mindlessness. However, because habits can routinize the boring and mundane aspects of your life, they are among the most efficient and effective of all the behaviors in your repertoire.  They allow you to offload your mental energy from routine daily tasks so you can devote more resources to the tasks that require real thought and creativity. They can also, despite what you may have been told, be controlled.


In his book, The Power of Habit: Why We Do What We Do in Life and Business, Duhigg takes us through a compelling personal and scientific narrative that, dare I say it, proves to be habit-forming on its own. The doubter in you will be convinced by the many examples he provides of successful habit use, including corporate marketing, the promotion of pop songs, overhauls of big business, and resounding victories of sports teams. He also shows us the downside of habits when they lead to inflexibility and an inability to respond to changing circumstances among everyone from hospital workers to London Tube employees. As is true for the most tormented drug addict, it may take a crisis to break through the dysfunctional habits built into a large organization when its employees become too locked into “business as usual.”

The key to unlocking the power of habit for you is to understand that habits are formed and maintained through a cyclical process called a “habit loop.” Your habit loop begins to form when a behavior you perform leads to an outcome that you desire. If you want to get that reward again, you’ll repeat the behavior. Then, for the loop to be complete, you also need a cue to trigger your craving for the desired outcome.

There are plenty of examples throughout the book of habit loops for everything from winning the Super Bowl to improving safety records at an aluminum plant. Of the many compelling examples, perhaps the easiest to summarize here is the habit loop involving the mall-based Cinnabon stores, those ubiquitous purveyors of some of the most dangerous food on the planet. The habit, in this case, is eating the delicious calorie-packed morsel. The reward is the pleasure that comes from eating it. The cue—and this is the main thing—is the smell. As you’re making your way from Foot Locker to the Gap, that cinnamon scent hits you with full force. You may have no interest whatsoever at the moment in having a snack, and in fact have sworn off all sweets, but then that unmistakable scent overwhelms you and puts you under its spell. Your brain wants that treat and nothing else in the world can shut down that need (or so you feel at the time). You stop everything and grab that Cinnabon, knowing that you’ve just made it that much harder to fit into those chinos you were planning to buy at the Gap.

This diagram shows the general habit loop. The Cinnabon loop works like this: Cue (Cinnabon scent) –> Routine (eating a Cinnabon) –> Reward (sweet taste) –> Cue (the next time you smell the scent). In a true addiction, you may not even wait until the scent hits you between the nostrils. You only have to imagine the scent, and it’s off to the next Cinnabon store you can find. The book is sprinkled with simple diagrams such as the basic habit loop shown here that cleverly and clearly illustrate the main points.

Marketers capitalize on the habit-producing nature of powerful product cues. Duhigg chronicles case after case of advertising campaigns designed to create cravings that make you want something you didn’t even know you needed. For example, the rewarding feeling of being “clean” leads us to purchase products from toothpaste to air fresheners to produce that sensation over and over again, even if we’re not particularly dirty. The product labels become the cues that trigger the craving, which in turn leads to the habit, which in turn leads to the reward, and so on. Advertisers have convinced us that we not only have to be clean, but we need to feel clean, preferably with their trademarked products.
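For the programmers among us, the loop is simple enough to sketch in a few lines of Python. This is my own illustration of Duhigg’s diagram, not something from the book; the class and names are invented:

```python
from dataclasses import dataclass

@dataclass
class HabitLoop:
    """One cue -> routine -> reward cycle, after Duhigg's habit-loop diagram."""
    cue: str
    routine: str
    reward: str

    def run(self) -> str:
        # Each pass through the loop strengthens the craving the cue triggers.
        return f"{self.cue} -> {self.routine} -> {self.reward}"

# The Cinnabon loop from the example above.
cinnabon = HabitLoop(cue="cinnamon scent",
                     routine="eat a Cinnabon",
                     reward="sweet taste")
print(cinnabon.run())
```

The point of modeling it this way is that the three parts are separable, which is exactly what makes habit change possible later on.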


Businesses can also capitalize on the habitual behaviors of their customers to market strategically to people who show predictable patterns of behavior.  Ever wonder why certain ads pop up on certain websites or why you keep getting coupons in the mail perfectly targeted to your needs? As Duhigg shows, marketers have figured out how to read not only your keystrokes but your habitual browsing and shopping habits. It may creep you out to think that Target knows a woman is pregnant by her shopping habits, but by using the psychology of habits, that is just what Target (and many other businesses) do to pinpoint your buying needs.  Fair warning: if you’re convinced that corporate conspiracies reside everywhere, you may want to skip some of these chapters or you’ll never set foot in a store again. Of course, that would be unreasonable, so you should read these chapters and at least you’ll understand both the how and the why of the many ways that advertisers manipulate us.

Lest you wonder, at this point, whether my blog’s title was false advertising, rest assured that there is plenty of good that our habits can do for us. Duhigg shows, again in fascinating detail, that the very habit loops that lead us to buy products we don’t need can also lead us to remarkable successes. For example, it’s valuable to have an automatic response to an emergency situation when there’s no time to think. You want to get your hand out of the way of a closing car door before you have time to ponder your course of action. It’s also helpful to relegate repeated and mundane behaviors to habit status so that you can think about other things. For example, while folding all that laundry advertisers have programmed you to wash, you don’t have to think in detail about your actions. You can think about something else. If you had to stop and ponder every habitual action you complete over the course of your daily routines, you’d never be able to reflect on how to solve the real problems that face you at work or in your close relationships. When you’re riding the elevator or walking to work, it’s helpful that you can put your brain on autopilot while you figure out how you’re going to settle the argument you’re having with your best friend.

Your habit loops can give your brain a break when you need it for real work, then, but they can also pave the way for you to get rid of the habits you want to change or eliminate. This is particularly true of addictions. Let’s take the case of problem gambling. According to Duhigg, one of the most significant contributors to problem gambling is not that gamblers win (otherwise casinos wouldn’t make a profit!). No, the main contributor to problem gambling is the near win. In a near win, you get, for example, 2 out of 3 matches on a slot machine. You’re not actually being rewarded, then, for your gambling habit, but because you see yourself as so close to winning you become convinced that you’ll certainly win the next time around. Small wins can actually set you up for big habits that you’ll find almost impossible to break. However, habit change is not completely impossible.

Using the psychology of the habit loop, change becomes possible when you use the cue to trigger a new behavior that itself leads to a reward, perhaps different than the original reward, but a reward nevertheless.  Problem gamblers see the near win as a reason to keep gambling. Non-problem gamblers reward themselves for the near win (which they correctly interpret as a loss) by leaving the casino without losing more of their money. Understand the cues that trigger the behavior, substitute a new routine, and make sure that the new routine reaps its own reward. You’ll soon be craving the reward produced by the new routine, according to this logic.

On that note, Duhigg provides a 4-step plan for breaking a bad habit loop and substituting it with one designed to produce new habits that will benefit your mental and physical health. Unfortunately, habits are harder to break than to build, but this 4-step program can get you going in the right direction.

  1. Identify the cue, routine, and reward. Draw your own habit loop for the behavior you’re trying to change. As is true for mindful eating, just thinking about what you’re doing can often stimulate habit change right then and there. Your habit changes to eating less, or more healthily, when you realize what’s triggering your bad snacking habits.
  2. Find alternative rewards. Winning is clearly a reward for gambling, but for problem gamblers, near wins begin to take on highly rewarding value. To stop the gambling you need to find an outcome that will be even more rewarding for you. Because everyone’s reward structure is slightly different, you need to determine which reward will lead to the new craving that triggers the new, non-gambling, behavior.
  3. Figure out the actual cue. You may think that your constant online shoe shopping is due to a desire to look stylish, but perhaps there’s something else that triggers this habit of overspending. Using a tried and true method in behavior analysis, isolate the actual cue among the many possible stimuli operating on you when the habit kicks in. Duhigg suggests that you go 5 for 5 on this and look at the 5 possible categories of cues: location, time, emotional state, other people, and the immediately preceding action. Your desire to fill your closet may have nothing to do with wanting to dress to impress; instead, it may kick in because you feel lonely or anxious, or because you spend time with friends who are themselves overly preoccupied with appearance.
  4. Make a plan to change. You may think that you can’t control your habits, but if you anticipate your characteristic response to a situation, you can change that response. Let’s say that you’re most likely to drink too much when you’re watching your favorite sports on TV, perhaps just because you needed something to do instead of just sitting there. Make a plan so that when the game is on you’ve got another activity you can engage in that would also give you something to do, particularly during the lulls in the action when your habitual response was to take another swig of beer. It could be playing an online (non-gambling) game, doing a crossword puzzle, or reading a magazine. By building a reward into the new behavior (doing something enjoyable while bored) you are increasing the chances that, over time, you can instill a new and healthier habit.
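The golden rule running through these four steps is: keep the cue and the reward, swap only the routine. Here is a minimal Python sketch of that substitution, using the beer-and-sports example from step 4 (the dictionary keys and the example habit are my own framing, not Duhigg’s):

```python
# A habit as its three parts. The old loop: boredom cue -> beer -> relief.
old_habit = {"cue": "lull in the game",
             "routine": "take a swig of beer",
             "reward": "something to do while bored"}

def substitute_routine(habit: dict, new_routine: str) -> dict:
    """Keep the cue and the reward; replace only the routine."""
    return {**habit, "routine": new_routine}

# Same cue, same reward, new behavior.
new_habit = substitute_routine(old_habit, "do a crossword puzzle")
print(new_habit["cue"])      # unchanged
print(new_habit["routine"])  # the new behavior
```

Because the cue and reward are untouched, the craving the cue triggers still gets satisfied, which is why the substituted routine has a chance to stick.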

There’s no reason to let your habits dominate and possibly ruin your life. Instead, you can use them to build large gains on small wins, redirect the behavioral sequences that lead to addiction, and improve your mental productivity. Old habits die hard, but they can die.

Follow Susan Krauss Whitbourne on Twitter @swhitbo for daily updates on psychology, health, and aging. Feel free to join her Facebook group, “Fulfillment at Any Age.”

Copyright Susan Krauss Whitbourne 2012


September 17, 2013 Posted by | Addiction, Books, brain, Cognition | Leave a comment

Procrastinate Much? Focus On Starting Not Finishing

SOURCE CREDIT: Donald Latumahina, Lifeoptimizer.org
How to Achieve Goals Through Persistent Starting

Have you ever felt overwhelmed while trying to achieve a goal? I have, and I guess you have too. That’s why it’s important to have a good strategy. Otherwise you might not achieve your goals, or will achieve them only with unnecessary stress and frustration.

One good strategy I found is persistent starting, from The Now Habit by Neil Fiore. Here is what the book says about it:

“…essentially, all large tasks are completed in a series of starts… Keep on starting, and finishing will take care of itself.”

In essence, persistent starting means that you shouldn’t fill your mind with how big a project is. That will only make you feel overwhelmed. Instead, just focus on starting on it every day. By doing that, you will eventually finish the project and achieve your goal.


Why Persistent Starting Is Powerful

There are three reasons why persistent starting is powerful:

1. It helps you reduce stress. Instead of filling your mind with how big a project is, you fill it with the simple task that you need to do today. That makes the burden much lighter.

2. It helps you overcome procrastination. One big reason why we procrastinate is that we feel overwhelmed by what we face. As a result, we hesitate to take action. This principle makes the task feel manageable.

3. It allows you to overcome seemingly insurmountable challenges. By just continually starting, you will eventually achieve a big goal. The whole journey might seem daunting, but by going through it one step at a time, you will eventually reach your destination.

A simple example in my life is when I tried to finish reading the Bible. It seemed like a huge task. If I focused on how hard it would be, it’s unlikely that I would ever finish it. But I focused instead on reading four chapters a day without thinking about how far I still had to go. With this attitude, I eventually finished reading it within a year.
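The arithmetic behind that daily habit is easy to check. A quick sketch, assuming the commonly cited figure of 1,189 chapters for the Protestant Bible:

```python
import math

chapters = 1189   # commonly cited chapter count for the Protestant Bible
per_day = 4       # the small daily "start" described above
days = math.ceil(chapters / per_day)
print(days)       # 298 days -- comfortably within a year
```

The whole task never has to be held in mind; only the four-chapter start does, and the total takes care of itself.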

How to Apply Persistent Starting

Here are four steps to apply persistent starting:


1. Know your destination.

First of all, you need to know where you are going. If you don’t, you will only wander aimlessly. So set a clear goal. What is it that you are trying to achieve? What will success look like?

2. Plan the route.

Now that you know your destination, you need to plan how to get there. A good way to do that is to set some milestones. These milestones serve two purposes:

They help you stay on track. You will know if you deviate from the right path.
They give you small victories along the way. A sense of accomplishment is important for staying motivated, and milestones let you earn it along the way, not just at the end.
3. Keep doing the next simple task.

After planning the route, you should figure out the next simple task to do. What can you do today that will move you toward your destination? After you find it, then allocate time to do it.

4. Adjust your course as necessary.

You need to be careful not to go off course. So regularly check where you are (for example, by comparing your position with your next milestone) and adjust your course as necessary.
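Taken together, the four steps form a simple feedback loop: pick a destination, break it into milestones, do today’s small task, and check your position. A minimal Python sketch of the idea (the goal and milestone numbers are invented for illustration):

```python
# Persistent starting as a loop: a destination broken into milestones,
# with a regular check of where you are against the next one.
milestones = [300, 600, 900, 1189]   # e.g. chapters read at each checkpoint

def next_milestone(done: int):
    """Return the next unmet milestone, or None once the destination is reached."""
    for m in milestones:
        if done < m:
            return m
    return None

print(next_milestone(250))   # still working toward the first checkpoint
print(next_milestone(1189))  # destination reached
```

The check is the course correction from step 4: if today’s position is drifting away from the next milestone, you adjust tomorrow’s small task, nothing more.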

***

Persistent starting is a simple strategy, but it can help you achieve your goals with minimum stress and frustration. It works for me, and I hope it will work for you too.


September 13, 2013 Posted by | anxiety, Books, brain, Cognition, Cognitive Behavior Therapy, depression, research, Resilience, stress | 2 Comments

MythBusters: The Human Brain Edition


Source Credit: 7 Myths About the Brain
Separating Fact From Fiction
By Kendra Cherry, About.com Guide

The human brain is amazing and sometimes mysterious. While researchers are still uncovering the secrets of how the brain works, they have discovered plenty of information about what goes on inside your noggin. Unfortunately, there are still a lot of brain myths out there.

The following are just a few of the many myths about the brain.

Myth 1: You only use 10 percent of your brain.

You’ve probably heard this oft-cited bit of information several times, but constant repetition does not make it any more accurate. People often use this popular urban legend to imply that the mind is capable of much greater things, such as dramatically increased intelligence, psychic abilities, or even telekinesis. After all, if we can do all the things we do using only 10 percent of our brains, just imagine what we could accomplish if we used the remaining 90 percent.

Reality check: Research suggests that all areas of the brain perform some type of function. If the 10 percent myth were true, brain damage would be far less of a concern – after all, we would only have to worry about injuring that critical 10 percent. The fact is that damage to even a small area of the brain can have profound consequences for both cognition and functioning. Brain imaging technologies have also demonstrated that the entire brain shows levels of activity, even during sleep.

“It turns out though, that we use virtually every part of the brain, and that [most of] the brain is active almost all the time. Let’s put it this way: the brain represents three percent of the body’s weight and uses 20 percent of the body’s energy.” – Neurologist Barry Gordon of Johns Hopkins School of Medicine, Scientific American

Myth 2: Brain damage is permanent.

The brain is a fragile thing and can be damaged by things such as injury, stroke, or disease. This damage can result in a range of consequences, from mild disruptions in cognitive abilities to complete impairment. Brain damage can be devastating, but is it always permanent?

Reality check: While we often tend to think of brain injuries as lasting, a person’s ability to recover from such damage depends upon the severity and the location of the injury. For example, a blow to the head during a football game might lead to a concussion. While this can be quite serious, most people are able to recover when given time to heal. A severe stroke, on the other hand, can result in dire consequences to the brain that can very well be permanent.

However, it is important to remember that the human brain has an impressive amount of plasticity. Even following a serious brain event, such as a stroke, the brain can often heal itself over time and form new connections within the brain.

“Even after more serious brain injury, such as stroke, research indicates that — especially with the help of therapy — the brain may be capable of developing new connections and “reroute” function through healthy areas.” – BrainFacts.org

Myth 3: People are either “right-brained” or “left-brained.”

Have you ever heard someone describe themselves as either left-brained or right-brained? This stems from the popular notion that people are dominated by either their right or left brain hemisphere. According to this idea, people who are “right-brained” tend to be more creative and expressive, while those who are “left-brained” tend to be more analytical and logical.

Reality Check: While experts do recognize that there is lateralization of brain function (that is, certain types of tasks and thinking tend to be more associated with a particular region of the brain), no one is fully right-brained or left-brained. In fact, we tend to do better at tasks when the entire brain is utilized, even for things that are typically associated with a certain area of the brain.

“No matter how lateralized the brain can get, though, the two sides still work together. The pop psychology notion of a left brain and a right brain doesn’t capture their intimate working relationship. The left hemisphere specializes in picking out the sounds that form words and working out the syntax of the words, for example, but it does not have a monopoly on language processing. The right hemisphere is actually more sensitive to the emotional features of language, tuning in to the slow rhythms of speech that carry intonation and stress.” – Carl Zimmer, Discover

Myth 4: Humans have the biggest brains.

The human brain is quite large in proportion to body size, but another common misconception is that humans have the largest brains of any organism. How big is the human brain? How does it compare to other species?

Reality Check: The average adult brain weighs in at about three pounds and measures up to about 15 centimeters in length. The largest animal brain belongs to the sperm whale, weighing in at a whopping 18 pounds! Another large-brained animal is the elephant, with an average brain size of around 11 pounds.

But what about relative brain size in proportion to body size? Humans must certainly have the largest brains in comparison to their body size, right? Once again, this notion is a myth. Surprisingly, one of the animals with the largest brain-to-body-size ratios is the shrew, whose brain makes up about 10 percent of its body mass.

“Our primate lineage had a head start in evolving large brains, however, because most primates have brains that are larger than expected for their body size. The Encephalization Quotient is a measure of brain size relative to body size. The cat has an EQ of about 1, which is what is expected for its body size, while chimps have an EQ of 2.5 and humans nearly 7.5. Dolphins, no slouches when it comes to cognitive powers and complex social groups, have an EQ of more than 5, but rats and rabbits are way down on the scale at below 0.4.” – Michael Balter, Slate.com
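The Encephalization Quotient in the quote above is usually computed with Jerison’s formula: EQ = brain mass ÷ (0.12 × body mass^(2/3)), with both masses in grams. A quick sketch, where the specimen masses below are rough ballpark figures I have assumed for illustration:

```python
def eq(brain_g: float, body_g: float) -> float:
    """Jerison's Encephalization Quotient: observed brain mass
    divided by the mass expected for the animal's body size."""
    expected = 0.12 * body_g ** (2 / 3)
    return brain_g / expected

# Ballpark masses: ~1,350 g brain / 65 kg body for a human,
# ~30 g brain / 3.3 kg body for a cat.
print(round(eq(1350, 65000), 1))  # close to the human EQ of ~7.5 quoted above
print(round(eq(30, 3300), 1))     # about 1, as quoted for the cat
```

The 2/3 exponent reflects the empirical scaling of brain mass with body mass across mammals, which is why a shrew can devote 10 percent of its body to brain and still score low on EQ.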


Myth 5: We are born with all the brain cells we ever have, and once they die, these cells are gone forever.

Traditional wisdom has long suggested that adults only have so many brain cells and that we never form new ones. Once these cells are lost, are they really gone for good?

Reality Check: In recent years, experts have discovered evidence that the adult human brain does indeed form new cells throughout life, even into old age. The process of forming new brain cells is known as neurogenesis, and researchers have found that it happens in at least one important region of the brain: the hippocampus.

“Above-ground nuclear bomb tests carried out more than 50 years ago resulted in elevated atmospheric levels of the radioactive carbon-14 isotope (14C), which steadily declined over time. In a study published yesterday (June 7) in Cell, researchers used measurements of 14C concentration in the DNA of brain cells from deceased patients to determine the neurons’ age, and demonstrated that there is substantial adult neurogenesis in the human hippocampus.” – Dan Cossins, The Scientist

Myth 6: Drinking alcohol kills brain cells.

Partly related to the myth that we never grow new neurons is the idea that drinking alcohol can lead to cell death in the brain. Drink too much or too often, some people might warn, and you’ll lose precious brain cells that you can never get back. We’ve already learned that adults do indeed get new brain cells throughout life, but could drinking alcohol really kill brain cells?

Reality Check: While excessive or chronic alcohol abuse can certainly have dire health consequences, experts do not believe that drinking causes neurons to die. In fact, research has shown that even binge drinking doesn’t actually kill neurons.

“Scientific medical research has actually demonstrated that the moderate consumption of alcohol is associated with better cognitive (thinking and reasoning) skills and memory than is abstaining from alcohol. Moderate drinking doesn’t kill brain cells but helps the brain function better into old age. Studies around the world involving many thousands of people report this finding.” – PsychCentral.com

Myth 7: There are 100 billion neurons in the human brain.

If you’ve ever thumbed through a psychology or neuroscience textbook, you have probably read that the human brain contains approximately 100 billion neurons. How accurate is this oft-repeated figure? Just how many neurons are in the brain?

Reality Check: The estimate of 100 billion neurons has been repeated so often and for so long that no one is completely sure where it originated. In 2009, however, one researcher decided to actually count neurons in adult brains and found that the number was a bit off the mark. Based upon this research, it appears that the human brain contains closer to 86 billion neurons. So while the often-cited number is a few billion too high, 86 billion is still nothing to sneeze at.

“We found that on average the human brain has 86bn neurons. And not one [of the brains] that we looked at so far has the 100bn. Even though it may sound like a small difference the 14bn neurons amount to pretty much the number of neurons that a baboon brain has or almost half the number of neurons in the gorilla brain. So that’s a pretty large difference actually.” – Dr. Suzana Herculano-Houzel

References

Balter, M. (2012, Oct. 26). Why are our brains so ridiculously big? Slate. Retrieved from http://www.slate.com/articles/health_and_science/human_evolution/2012/10/human_brain_size_social_groups_led_to_the_evolution_of_large_brains.html
Boyd, R. (2008, Feb 7). Do people only use 10 percent of their brains? Scientific American. Retrieved from http://www.scientificamerican.com/article.cfm?id=people-only-use-10-percent-of-brain
BrainFacts.org. (2012). Myth: Brain damage is always permanent. Retrieved from http://www.brainfacts.org/diseases-disorders/injury/articles/2011/brain-damage-is-always-permanent
Cossins, D. (2013, June 7). Human adult neurogenesis revealed. The Scientist. Retrieved from http://www.the-scientist.com/?articles.view/articleNo/35902/title/Human-Adult-Neurogenesis-Revealed/
Hanson, D. J. (n.d.). Does drinking alcohol kill brain cells? PsychCentral.com. Retrieved from http://www2.potsdam.edu/hansondj/HealthIssues/1103162109.html
Herculano-Houzel, S. (2009). The human brain in numbers: A linearly scaled-up primate brain. Frontiers in Human Neuroscience, 3(31). doi:10.3389/neuro.09.031.2009
Randerson, J. (2012, Feb 28). How many neurons make a human brain? Billions fewer than we thought. The Guardian. Retrieved from http://www.guardian.co.uk/science/blog/2012/feb/28/how-many-neurons-human-brain
The Technium. (2004). Brains of white matter. Retrieved from http://www.kk.org/thetechnium/archives/2004/11/brains_of_white.php
Zimmer, C. (2009, April 15). The Big Similarities & Quirky Differences Between Our Left and Right Brains. Discover Magazine. Retrieved from http://discovermagazine.com/2009/may/15-big-similarities-and-quirky-differences-between-our-left-and-right-brains

September 9, 2013 Posted by | Alcohol, brain, Cognition, Education, Health Psychology, research, Technology | 2 Comments

Forget What You’ve Learnt About Learning


Source and authorship credit: Everything you thought you knew about learning is wrong Psychology Today
http://www.psychologytoday.com/

Everything You Thought You Knew About Learning Is Wrong: How, and how NOT, to learn anything. Published on January 28, 2012 by Garth Sundem in Brain Candy

Learning through osmosis didn’t make the strategies list

Taking notes during class? Topic-focused study? A consistent learning environment? All are exactly opposite the best strategies for learning. Really. I recently had the good fortune to interview Robert Bjork, director of the UCLA Learning and Forgetting Lab, distinguished professor of psychology, and massively renowned expert on packing things in your brain in a way that keeps them from leaking out. And it turns out that everything I thought I knew about learning is wrong. Here's what he said.

First, think about how you attack a pile of study material.

“People tend to try to learn in blocks,” says Bjork, “mastering one thing before moving on to the next.” But instead he recommends interleaving, a strategy in which, for example, instead of spending an hour working on your tennis serve, you mix in a range of skills like backhands, volleys, overhead smashes, and footwork. “This creates a sense of difficulty,” says Bjork, “and people tend not to notice the immediate effects of learning.”

Instead of making an appreciable leap forward with your serving ability after a session of focused practice, interleaving forces you to make nearly imperceptible steps forward with many skills.

But over time, the sum of these small steps is much greater than the sum of the leaps you would have taken if you’d spent the same amount of time mastering each skill in its turn.

Bjork explains that successful interleaving allows you to “seat” each skill among the others: “If information is studied so that it can be interpreted in relation to other things in memory, learning is much more powerful,” he says.

There’s one caveat: Make sure the mini skills you interleave are related in some higher-order way. If you’re trying to learn tennis, you’d want to interleave serves, backhands, volleys, smashes, and footwork—not serves, synchronized swimming, European capitals, and programming in Java.

Similarly, studying in only one location is great as long as you’ll only be required to recall the information in the same location. If you want information to be accessible outside your dorm room, or office, or nook on the second floor of the library, Bjork recommends varying your study location.

And again, these tips generalize. Interleaving and varying your study location will help whether you’re mastering math skills, learning French, or trying to become a better ballroom dancer.
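The contrast between blocked and interleaved practice is easy to operationalize. The sketch below is my own illustration, not Bjork's protocol; the skill names and rep counts are arbitrary stand-ins for any set of related sub-skills.

```python
import random

SKILLS = ["serve", "backhand", "volley", "smash", "footwork"]

def blocked(skills, reps):
    # Massed practice: finish all reps of one skill, then move on.
    return [s for s in skills for _ in range(reps)]

def interleaved(skills, reps, seed=0):
    # Round-robin through a freshly shuffled skill order each cycle,
    # so every pass through the session mixes all the skills.
    rng = random.Random(seed)
    schedule = []
    for _ in range(reps):
        order = skills[:]
        rng.shuffle(order)
        schedule.extend(order)
    return schedule
```

Both schedules contain the same total practice; only the ordering differs, which is exactly the manipulation the interleaving studies describe.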

So too will a somewhat related phenomenon, the spacing effect, first described by Hermann Ebbinghaus in 1885. “If you study and then you wait, tests show that the longer you wait, the more you will have forgotten,” says Bjork. That’s obvious: over time, you forget. But here’s the cool part:

If you study, wait, and then study again, the longer the wait, the more you’ll have learned after this second study session.

Bjork explains it this way: “When we access things from our memory, we do more than reveal it’s there. It’s not like a playback. What we retrieve becomes more retrievable in the future. Provided the retrieval succeeds, the more difficult and involved the retrieval, the more beneficial it is.” Note that there’s a trick implied by “provided the retrieval succeeds”: You should space your study sessions so that the information you learned in the first session remains just barely retrievable. Then, the more you have to work to pull it from the soup of your mind, the more this second study session will reinforce your learning. If you study again too soon, it’s too easy.
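One common way to apply this is an expanding-interval schedule: each successful retrieval earns a longer gap before the next session, keeping the material "just barely retrievable." The starting gap and growth factor below are arbitrary illustrative choices, not figures from Bjork's research.

```python
from datetime import date, timedelta

def review_schedule(start, sessions=6, first_gap_days=1, factor=2.5):
    """Expanding-interval review dates: the gap between sessions grows
    by `factor` each time, on the assumption that each retrieval
    succeeded and strengthened the memory."""
    dates, gap = [start], float(first_gap_days)
    for _ in range(sessions - 1):
        dates.append(dates[-1] + timedelta(days=int(gap)))
        gap *= factor
    return dates
```

With these example parameters, reviews fall roughly 1, 2, 6, 15, and 39 days apart; in practice you would shorten the next gap whenever a retrieval fails, since the benefit depends on the retrieval succeeding.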

Along these lines, Bjork also recommends taking notes just after class, rather than during it: forcing yourself to recall a lecture’s information is more effective than simply copying it from a blackboard. “Get out of court stenographer mode,” says Bjork. You have to work for it.

The more you work, the more you learn, and the more you learn, the more awesome you can become.

“Forget about forgetting,” says Robert Bjork.

“People tend to think that learning is building up something in your memory and that forgetting is losing the things you built.

But in some respects the opposite is true.” See, once you learn something, you never actually forget it. Do you remember your childhood best friend’s phone number? No? Well, Dr. Bjork showed that if you were reminded, you would retain it much more quickly and strongly than if you were asked to memorize a fresh seven-digit number. So this old phone number is not forgotten; it lives somewhere in you, only recall can be a bit tricky.

And while we count forgetting as the sworn enemy of learning, in some ways that’s wrong, too. Bjork showed that the two live in a kind of symbiosis in which forgetting actually aids recall.

“Because humans have unlimited storage capacity, having total recall would be a mess,” says Bjork. “Imagine you remembered all the phone numbers of all the houses you had ever lived in. When someone asks you your current phone number, you would have to sort it from this long list.” Instead, we forget the old phone numbers, or at least bury them far beneath the ease of recall we gift to our current number. What you thought were sworn enemies are more like distant collaborators.

* Excerpted from Brain Trust: 93 Top Scientists Dish the Lab-Tested Secrets of Surfing, Dating, Dieting, Gambling, Growing Man-Eating Plants and More (Three Rivers Press, March 2012)

@garthsundem
Garth Sundem is the bestselling author of Brain Candy, Geek Logik, and The Geeks’ Guide to World Domination.

January 29, 2012 Posted by | ADHD/ADD, brain, Cognition, Education, research, stress, Technology | 6 Comments

iPhone Addiction: Does Smart Phone = Dumber You?


Source: Psychology Today, Susan Krauss Whitbourne, Ph.D.

The smartphone is quickly becoming an extension of the human brain. The latest entry into the market, the iPhone 4S, contains a feature named Siri that (according to the Apple website) “understands what you say, knows what you mean, and even talks back. Siri is so easy to use and does so much, you’ll keep finding more and more ways to use it.” Now, I love technology as much as anyone (at least when it’s working), but as a psychologist, I have to join in the voice of critics who yearn for a simpler, less technical age. It’s not that I wish to return to the pre-computer age of paper and pencil, however. Instead, I’m worried that we risk having our brains become vestigial organs. Research on technological tools suggests that offloading our mental functions to these electronic devices could cause our brains to go soft.

Consider the evidence from a study reported in late 2010 by researchers at McGill University. Neuroscientist Veronique Bohbot and her team reported that relying on a global positioning system (GPS) to get to known locations reduces the function of the hippocampus, the “seahorse”-shaped structure in the brain that controls memory and spatial orientation. Participants used to getting around on the basis of their own wits had higher activity and a greater volume in the hippocampus than the older adults using a GPS. What’s more, when it came to their actual performance, the non-GPS users performed better on a memory test. Bohbot recommends that you turn off the GPS when you’re navigating around your hometown and use it only for its actual purpose of finding locations you’ve never been to before. Your hippocampus will thank you, whether you’re 16 or 60.

It’s not much of a leap to extrapolate from the GPS to the smartphone. A normal cellphone can remember numbers for you so that you no longer have to do so. Confess: can you remember the actual cellphone numbers of the people you call most frequently? We used to rely on our neurons to hold onto these crucial bits of information. Now they reside somewhere out there in the ether. What’s worse is that most people don’t even take the time to write down a new phone number anymore. You call your new acquaintance and your new acquaintance calls you, and the information is automatically stored in your contacts. It’s great for efficiency’s sake, but you’ve now given your working memory one less important exercise. Memory benefits from practice, especially in the crucial stage of encoding.

Let’s move from phone numbers to information in general. People with smartphones no longer have to remember important facts because, when in doubt, they can just tap into Google. When was the last time St. Louis was in the World Series, you wonder? Easy! Just enter a few letters (not even the whole city name) into your “smart” search engine. Your fingers, much less your mind, don’t have to walk very far at all. Trying to give your brain a workout with a crossword puzzle? What’s to stop you from taking a few shortcuts when the answers are right there on your phone? No mental gymnastics necessary.

This leads us to Siri, that seductress of the smartphone. With your iPhone slave on constant standby, you don’t even have to key in your questions. Just say the question, and Siri conjures up the answer in an instant. With a robot at your fingertips, why even bother to look the information up yourself?

The irony is that smartphones have the potential to make our brains sharper, not dumber. Researchers are finding that videogame play involving rapid decision-making can hone your cognitive resources. Older adults, in particular, seem to be able to improve their speed on attentional and decision-making tasks when they play certain games. People with a form of amnesia in which they can’t learn new information can also be helped by smartphones, according to a study conducted by Canadian researchers (Svoboda & Richards, 2009).

The problem is not the use of the smartphone itself; the problem comes when the smartphone takes over a function that your brain is perfectly capable of performing. It’s like taking the elevator instead of the stairs: the ride may be quicker, but your muscles won’t get a workout. Smartphones are like mental elevators. Psychologists have known for years that the “use it or lose it” principle is key to keeping your brain functioning in its peak condition throughout your life. As we become more and more drawn to these sleeker and sexier gadgets, the trick will be learning how to “use it.” So take advantage of these 5 tips to help your smartphone keep you smart:

1. Don’t substitute your smartphone for your brain. Force yourself to memorize a phone number before you store it, and dial your frequently called numbers from memory whenever possible. If there’s a fact or word definition you can infer, give your brain the job before consulting your electronic helper.

2. Turn off the GPS app when you’re going to familiar places. As the GPS-hippocampus study showed, you need to keep your spatial memory as active as possible by relying on your brain, not your phone, when you’re navigating well-known turf. If you are using the GPS to get around a new location, study a map first. Your GPS may not really know the best route to take (as any proper Bostonian can tell you!).

3. Use your smartphone to keep up with current events. Most people use their smartphones in their leisure time for entertainment. However, with just a few easy clicks, you can just as easily check the headlines, op-eds, and featured stories from respected news outlets around the world. This knowledge will build your mental storehouse of information and make you a better conversationalist as well.

4. Build your social skills with pro-social apps. Some videogames can actually make you a nicer person by strengthening your empathic tendencies. Twitter and Facebook can build social bonds. Staying connected is easier than ever, and keeping those social bonds active provides you with social support. Just make sure you avoid some of the social media traps of over-sharing and FOMO (fear of missing out) syndrome.

5. Turn off your smartphone while you’re driving. No matter how clever you are at multitasking under ordinary circumstances, all experts agree that you need to give your undivided attention to driving when behind the wheel. This is another reason to look at and memorize your route before going someplace new. Fiddling with your GPS can create a significant distraction if you find that it’s given you the wrong information.

Smartphones have their place, and they can make your life infinitely more productive as long as you use yours to supplement, not replace, your brain.

Reference: Svoboda, E., & Richards, B. (2009). Compensating for anterograde amnesia: A new training method that capitalizes on emerging smartphone technologies. Journal of the International Neuropsychological Society, 15(4), 629-638. doi:10.1017/S1355617709090791

Follow Susan Krauss Whitbourne, Ph.D. on Twitter @swhitbo for daily updates on psychology, health, and aging, and please check out my website, www.searchforfulfillment.com, where you can read this week’s Weekly Focus to get additional information, self-tests, and psychology-related links.



October 24, 2011 Posted by | Addiction, brain, Cognition, Health Psychology, Identity, Internet, research, Technology | 2 Comments

Emotional Intelligence: Learning To Roll With The Punches

It’s a hot-buzz topic that covers everything from improving workplace performance and successfully climbing the corporate ladder to building the happiest of marriages to ending school bullying. But what exactly is Emotional Intelligence (EI)? If we lack it, can we learn it? And how do we know if our EI is high or low? Is it only high if we’re really, really nice?

Three scholarly researchers – including University of Cincinnati Psychology Professor Gerry Matthews – delved into the science of EI and published “What We Know About Emotional Intelligence: How it Affects Learning, Work, Relationships, and Our Mental Health.”

Published by MIT Press (2009), the book was recently awarded the American Publishers Award for Professional and Scholarly Excellence – the PROSE Awards – in the biological and life sciences category of biomedicine and neuroscience. The book, co-authored by Matthews, Moshe Zeidner (University of Haifa) and Richard D. Roberts (Center for New Constructs, Educational Testing Service, Princeton, N.J.), was also on display at the UC Libraries’ Authors, Editors and Composers Reception and Program from 3:30-5 p.m., Thursday, April 22, in the Russell C. Myers Alumni Center.

MIT Press promotions describe EI as the “ability to perceive, regulate and communicate emotions – to understand emotions in ourselves and others.” Workplaces want to test for it to find the most EI-talented employees, and consultants are touting training and EI tests to improve productivity. “In the popular writings, EI tends to be defined very broadly and one can’t proceed with scientific research with such a vague and broad definition,” Matthews says.

Matthews’ research interests have explored how stress, mood and coping ability can affect performance on tests, in the workplace and on the highway. He adds that amid the grim economy, even the people who have jobs are feeling high levels of stress in the workplace and are feeling more challenged by workplace demands and concerns about job security. In general terms, those who can roll with the punches – with a shrug and a smile – may have higher Emotional Intelligence.


Then again, “The intimate association of personality and emotion sets a trap for researchers interested in Emotional Intelligence,” writes Matthews. “It might seem that happy, calm states of mind should be seen as the person imbued with high Emotional Intelligence. However, such emotional tendencies may be no more than a consequence of biases in brain functioning or information-processing routines operating without insight or ‘intelligence.’ Some individuals – in part because of their DNA – are simply fortunate in being prone to pleasant moods, so it follows that emotional states do not alone provide an index of Emotional Intelligence,” Matthews states in the book.

In fact, Matthews says he’s skeptical that people who are better at managing stress hold higher Emotional Intelligence, but as the researchers found as they tried to narrow down the science of Emotional Intelligence, more research is needed. For instance, is someone with higher EI in the workplace more productive, or are they just better at self-promotion and forming positive relationships with co-workers? Matthews says he believes EI appears to be very modestly related to workplace performance, and could turn out to be nothing more than a business fad.

He adds the researchers are also skeptical about all of those EI tests, particularly those self-assessments. After all, people could be rating themselves the way they see themselves or the way they would like to be seen, and not like they actually are.

Currently, authors Matthews and Roberts are researching the testing of EI through video scenarios. The situation judgment test involves watching the videos unfold a challenging situation, and then the video comes to a stop and offers different options for resolving the problem. Matthews is building on his earlier research which explored whether negative moods affected good decision making abilities. “Through the video project, the idea is to see if emotionally intelligent people are better able to make rational decisions under stress,” he says.

The researchers are also examining the link between EI and school social and emotional learning programs.

Source:
Dawn Fuller
University of Cincinnati



May 2, 2010 Posted by | anxiety, Books, Cognition, Health Psychology, Identity, Positive Psychology, research, Resilience | 4 Comments

And They All Lived Together In a Little Row Boat…Clap! Clap!: How Clapping Games Improve Cognition And Motor Skills In Children

BEER-SHEVA, ISRAEL, April 28, 2010 – A researcher at Ben-Gurion University of the Negev (BGU) conducted the first study of hand-clapping songs, revealing a direct link between those activities and the development of important skills in children and young adults, including university students.

“We found that children in the first, second and third grades who sing these songs demonstrate skills absent in children who don’t take part in similar activities,” explains Dr. Idit Sulkin, a member of BGU’s Music Science Lab in the Department of the Arts.

“We also found that children who spontaneously perform hand-clapping songs in the yard during recess have neater handwriting, write better and make fewer spelling errors.”

Dr. Warren Brodsky, the music psychologist who supervised her doctoral dissertation, said Sulkin’s findings lead to the presumption that “children who don’t participate in such games may be more at risk for developmental learning problems like dyslexia and dyscalculia.

“There’s no doubt such activities train the brain and influence development in other areas.  The children’s teachers also believe that social integration is better for these children than those who don’t take part in these songs.”

As part of the study, Sulkin went to several elementary school classrooms and engaged the children in either a board-of-education-sanctioned music appreciation program or training in hand-clapping songs, each lasting 10 weeks.

“Within a very short period of time, the children who until then hadn’t taken part in such activities caught up in their cognitive abilities to those who did,” she said.  But this finding only surfaced for the group of children undergoing hand-clapping songs training. The result led Sulkin to conclude that hand-clapping songs should be made an integral part of education for children aged six to 10, for the purpose of motor and cognitive training.

During the study, “Impact of Hand-clapping Songs on Cognitive and Motor Tasks,” Dr. Sulkin interviewed school and kindergarten teachers, visited their classrooms and joined the children in singing. Her original goal, as part of her thesis, was to figure out why children are fascinated by singing and clapping up until the end of third grade, when these pastimes are abruptly abandoned and replaced with sports.

“This fact explains a developmental process the children are going through,” Dr. Sulkin observes.  “The hand-clapping songs appear naturally in children’s lives around the age of seven, and disappear around the age of 10.  In this narrow window, these activities serve as a developmental platform to enhance children’s needs — emotional, sociological, physiological and cognitive. It’s a transition stage that leads them to the next phases of growing up.”

Sulkin says that no in-depth, long-term study has been conducted on the effects that hand-clapping songs have on children’s motor and cognitive skills.  However, the relationship between music and intellectual development in children has been studied extensively, prompting countless parents to obtain a “Baby Mozart” CD for their children.


Nevertheless, the BGU study demonstrates that listening to 10 minutes of Mozart music (i.e., the ‘Mozart Effect’) does not improve spatial task performance compared to 10 minutes of hand-clapping songs training or 10 minutes of exposure to silence.

Lastly, Sulkin discovered that hand-clapping song activity has a positive effect on adults: University students who filled out her questionnaires reported that after taking up such games, they became more focused and less tense.

“These techniques are associated with childhood, and many adults treat them as a joke,” she said.  “But once they start clapping, they report feeling more alert and in a better mood.”

Sulkin grew up in a musical home.  Her father, Dr. Adi Sulkin, is a well-known music educator who, in the 1970s and 1980s, recorded and published over 50 cassettes and videos depicting Israeli children’s play-songs, street-songs, holiday and seasonal songs, and singing games targeting academic skills.

“So quite apart from the research experience, working on this was like a second childhood,” she noted.

Source: American Associates, Ben-Gurion University of the Negev



May 1, 2010 Posted by | ADHD/ADD, brain, Child Behavior, Cognition, Education, Exercise, Parenting, research | 1 Comment

Winners Are Grinners: Even If There’s Nothing to Win!

Whether it’s for money, marbles or chalk, the brains of reward-driven people keep their game faces on, helping them win at every step of the way. Surprisingly, they win most often when there is no reward.

Read Abstract Here

That’s the finding of neuroscientists at Washington University in St. Louis, who tested 31 randomly selected subjects with word games, some of which had monetary rewards of either 25 or 75 cents per correct answer, others of which had no money attached.

Subjects were given a short list of five words to memorize in a matter of seconds, then a 3.5-second pause, then a few seconds to respond to a solitary word that either had been on the list or had not. Test performance had no consequence in some trials, but in others a computer graded the responses, providing an opportunity to win either 25 or 75 cents for quick and accurate answers. Even during these periods, subjects were sometimes alerted that their performance would not be rewarded on that trial.

Prior to testing, subjects were submitted to a battery of personality tests that rated their degree of competitiveness and their sensitivity to monetary rewards.

Designed to test the hypothesis that excitement in the brains of the most monetary-reward-sensitive subjects would slacken during trials that did not pay, the study is co-authored by Koji Jimura, PhD, a post-doctoral researcher, and Todd Braver, PhD, a professor, both based in psychology in Arts & Sciences. Braver is also a member of the neuroscience program and radiology department in the university’s School of Medicine.

But the researchers found a paradoxical result: the performance of the most reward-driven individuals was actually most improved – relative to the less reward-driven – in the trials that paid nothing, not the ones in which there was money at stake.

Even more striking was that the brain scans taken using functional Magnetic Resonance Imaging (fMRI) showed a change in the pattern of activity during the non-rewarded trials within the lateral prefrontal cortex (PFC), located right behind the outer corner of the eyebrow, an area that is strongly linked to intelligence, goal-driven behavior and cognitive strategies. The change in lateral PFC activity was statistically linked to the extra behavioral benefits  observed in the reward-driven individuals.

The researchers suggest that this change in lateral PFC activity patterns represents a flexible shift in response to the motivational importance of the task, translating this into a superior task strategy that the researchers term “proactive cognitive control.” In other words, once the rewarding motivational context is established in the brain indicating there is a goal-driven contest at hand, the brain actually rallies its neuronal troops and readies itself for the next trial, whether it’s for money or not.

“It sounds reasonable now, but when I happened upon this result, I couldn’t believe it because we expected the opposite results,” says Jimura, first author of the paper. “I had to analyze the data thoroughly to persuade myself. The important finding of our study is that the brains of these reward-sensitive individuals do not respond to the reward information on individual trials. Instead, it shows that they have persistent motivation, even in the absence of a reward. You’d think you’d have to reward them on every trial to do well. But it seems that their brains recognized the rewarding motivational context that carried over across all the trials.”


The finding sheds more light on the workings of the lateral PFC and provides potential behavioral clues about personality, motivation, goals and cognitive strategies. The research has important implications for understanding the nature of persistent motivation, how the brain creates such states, and why some people seem to be able to use motivation more effectively than others. By understanding the brain circuitry involved, it might be possible to create motivational situations that are more effective for all individuals, not just the most reward-driven ones, or to develop drug therapies for individuals who suffer from chronic motivational problems. Their results were published April 26 in the early online edition of the Proceedings of the National Academy of Sciences.

Everyone knows of competitive people who have to win, whether in a game of HORSE, golf or the office NCAA basketball tournament pool. The findings might tell researchers something about the competitive drive.

The researchers are interested in the signaling chain that ignites the prefrontal cortex when it acts on reward-driven impulses, and they speculate that the brain chemical dopamine could be involved. That could be a potential direction of future studies. Dopamine neurons, once thought to be involved in a host of pleasurable situations but now considered more of a learning or predictive signal, might respond to cues that let the lateral PFC know that it’s in for something good. This signal might help to keep information about the goals, rules or best strategies for the task active in mind to increase the chances of obtaining the desired outcome.

In the context of this study, when a 75-cent reward is available for a trial, the dopamine-releasing neurons could be sending signals to the lateral PFC that “jump start” it to do the right procedures to get a reward.

“It would be like the dopamine neurons recognize a cup of Ben and Jerry’s ice cream, and tell the lateral PFC the right action strategy to get the reward – to grab a spoon and bring the ice cream to your mouth,” says Braver. “We think that the dopamine neurons fires to the cue rather than the reward itself, especially after the brain learns the relationship between the two. We’d like to explore that some more.”

They also are interested in the “reward carryover state,” or the proactive cognitive strategy that keeps the brain excited even in gaps, such as pauses between trials or trials without rewards. They might consider a study in which rewards are far fewer.

“It’s possible we’d see more slackers with less rewards,” Braver says. “That might have an effect on the reward carryover state. There are a host of interesting further questions that this work brings up which we plan to pursue.”

Source: Washington University in St. Louis

April 28, 2010 Posted by | Addiction, brain, Cognition, research, Social Psychology | 2 Comments

Brain Training Or Just Brain Straining?: The Benefits Of Brain Exercise Software Are Unclear

You’ve probably heard it before: the brain is a muscle that can be strengthened. It’s an assumption that has spawned a multimillion-dollar computer game industry of electronic brain-teasers and memory games. But in the largest study of such brain games to date, a team of British researchers has found that healthy adults who undertake computer-based “brain-training” do not improve their mental fitness in any significant way.

Read The Original Research Paper (Draft PDF)

The study, published online Tuesday by the journal Nature, tracked 11,430 participants through a six-week online study. The participants were divided into three groups: the first group undertook basic reasoning, planning and problem-solving activities (such as choosing the “odd one out” of a group of four objects); the second completed more complex exercises of memory, attention, math and visual-spatial processing, which were designed to mimic popular “brain-training” computer games and programs; and the control group was asked to use the Internet to research answers to trivia questions.

All participants were given a battery of unrelated “benchmark” cognitive-assessment tests before and after the six-week program. These tests, designed to measure overall mental fitness, were adapted from reasoning and memory tests that are commonly used to gauge brain function in patients with brain injury or dementia. All three study groups showed marginal — and identical — improvement on these benchmark exams.

But the improvement had nothing to do with the interim brain-training, says study co-author Jessica Grahn of the Cognition and Brain Sciences Unit in Cambridge. Grahn says the results confirm what she and other neuroscientists have long suspected: people who practice a certain mental task — for instance, remembering a series of numbers in sequence, a popular brain-teaser used by many video games — improve dramatically on that task, but the improvement does not carry over to cognitive function in general. (Indeed, all the study participants improved in the tasks they were given; even the control group got better at looking up answers to obscure questions.) The “practice makes perfect” phenomenon probably explains why the study participants improved on the benchmark exams, says Grahn — they had all taken them once before. “People who practiced a certain test improved at that test, but improvement does not translate beyond anything other than that specific test,” she says.

The authors believe the study, which was run in conjunction with a BBC television program called “Bang Goes the Theory,” undermines the sometimes outlandish claims of many brain-boosting websites and digital games. According to a past TIME.com article by Anita Hamilton, HAPPYneuron, an example not cited by Grahn, is a $100 Web-based brain-training site that invites visitors to “give the gift of brain fitness” and claims its users saw “16%+ improvement” through exercises such as learning to associate a bird’s song with its species and shooting basketballs through virtual hoops. Hamilton also notes Nintendo’s best-selling Brain Age game, which promises to “give your brain the workout it needs” through exercises like solving math problems and playing rock, paper, scissors on the handheld DS. “The widely held belief that commercially available computerized brain-training programs improve general cognitive function in the wider population lacks empirical support,” the paper concludes.


Not all neuroscientists agree with that conclusion, however. In 2005, Torkel Klingberg, a professor of cognitive neuroscience at the Karolinska Institute in Sweden, used brain imaging to show that brain-training can alter the number of dopamine receptors in the brain — dopamine is a neurotransmitter involved in learning and other important cognitive functions. Other studies have suggested that brain-training can help improve cognitive function in elderly patients and those in the early stages of Alzheimer’s disease, but the literature is contradictory.

Klingberg has developed a brain-training program called Cogmed Working Memory Training, and owns shares in the company that distributes it. He tells TIME that the Nature study “draws a large conclusion from a single negative finding” and that it is “incorrect to generalize from one specific training study to cognitive training in general.” He also criticizes the design of the study and points to two factors that may have skewed the results.

On average the study volunteers completed 24 training sessions, each about 10 minutes long — for a total of three hours spent on different tasks over six weeks. “The amount of training was low,” says Klingberg. “Ours and others’ research suggests that 8 to 12 hours of training on one specific test is needed to get a [general improvement in cognition].”

Second, he notes that the participants were asked to complete their training by logging onto the BBC Lab UK website from home. “There was no quality control. Asking subjects to sit at home and do tests online, perhaps with the TV on or other distractions around, is likely to result in bad quality of the training and unreliable outcome measures. Noisy data often gives negative findings,” Klingberg says.

Brain-training research has received generous funding in recent years — and not just from computer game companies — as a result of the proven effect of neuroplasticity, the brain’s ability to remodel its nerve connections after experience. The stakes are high. If humans could control that process and bolster cognition, it could have a transformative effect on society, says Nick Bostrom of Oxford University’s Future of Humanity Institute. “Even a small enhancement in human cognition could have a profound effect,” he says. “There are approximately 10 million scientists in the world. If you could improve their cognition by 1%, the gain would hardly be noticeable in a single individual. But it could be equivalent to instantly creating 100,000 new scientists.”
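Bostrom’s estimate is simple aggregate arithmetic, and a minimal sketch using the figures quoted above makes the reasoning explicit (the numbers are his illustrative assumptions, not measured data):

```python
# Bostrom's back-of-the-envelope estimate, as quoted above:
# a 1% cognitive gain spread across ~10 million scientists is roughly
# equivalent, in aggregate research capacity, to adding new scientists.
scientists = 10_000_000   # approximate number of scientists worldwide
gain = 0.01               # 1% improvement per individual

equivalent_new_scientists = int(scientists * gain)
print(equivalent_new_scientists)  # 100000
```

The point of the calculation is that a per-person effect too small to notice can still be large in aggregate.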

For now, there is no nifty computer game that will turn you into Einstein, Grahn says. But there are other proven ways to improve cognition, albeit only by small margins. Consistently getting a good night’s sleep, exercising vigorously, eating right and maintaining healthy social activity have all been shown to help maximize a brain’s potential over the long term.

What’s more, says Grahn, neuroscientists and psychologists have yet to even agree on what constitutes high mental aptitude. Some experts argue that physical skill, which stems from neural pathways, should be considered a form of intelligence — so, masterful ballet dancers and basketball players would be considered geniuses.

Jason Allaire, co-director of the Games through Gaming lab at North Carolina State University, says the Nature study makes sense; rather than finding a silver bullet for brain enhancement, he says, “it’s really time for researchers to think about a broad or holistic approach that exercises or trains the mind in general in order to start to improve cognition more broadly.”

Or, as Grahn puts it, when it comes to mental fitness, “there are no shortcuts.”

Credit: Time.com


April 23, 2010 Posted by | Age & Ageing, Books, brain, Cognition, Education, Health Psychology, Internet, research | 6 Comments