Showing posts with label Working-memory. Show all posts

Wednesday, October 20, 2021

The Struggle Is Real: Productive Struggle


Learning By Doing


Let's get this party started with a fun little puzzle. You may have seen this on your favorite social media platform [1]. 

See if you can solve it.

Struggling Productively

That was a difficult problem, right? Would you say you "struggled" while attempting to solve it? I know I did! This experience raises a few questions:

1. What, exactly, causes us to struggle?
2. When does struggling assist learning, and when does it harm learning? 
3. Under what conditions does struggling lead to long-term learning and transfer? 

Before we attempt to answer these questions, let's acknowledge the emotional components of struggling. I can only speak for myself, but phenomenologically, it doesn't always feel great. I get hot. I feel dumb. Intrusive thoughts distract me from the task-at-hand. Of course, if your working memory is loaded with these intrusive thoughts, then you have fewer resources to dedicate to the current task...which will ultimately cause you to fail!

Sources of Struggle

What causes us to struggle? 

Prior Knowledge: First, we may not have all of the prerequisite knowledge to solve the problem. Once we recognize this fact (hopefully sooner rather than later!), we can treat it as a quest to find the missing information. 

Incorrect Assumptions: Another reason we might struggle is that we've made an incorrect assumption. If you assume that a fox and a wolf weigh the same, then you will eventually run into a roadblock when solving the above problem. 

Unnecessary Problem Constraints: We might struggle because we imposed an unnecessary constraint on the problem. This often happens while solving an insight problem. For example, in the classic nine-dot problem, problem solvers unnecessarily add the constraint that they are not allowed to go beyond the edge of the box. 

Flawed Representation: Finally, we may have chosen a flawed or limited representation. We've seen time and again how important representations are for solving problems. For example, students often fail to calculate the area of a parallelogram when it is presented outside of its canonical orientation (i.e., lying flat along its long side).

What makes struggling "productive"?

According to James Hiebert and Douglas Grouws's book chapter [2], productive struggle happens when a student is working on a problem just outside of their current ability level. This is related to Lev Vygotsky's idea of the "zone of proximal development" (see Fig. 1). 

There are things that you can do autonomously. These are well-practiced skills or declarative knowledge that you've mastered previously. There are also plenty of things you can't do (at least not yet!). But in between those two spheres are things that you can do with some assistance.

Struggle is most productive when done under the watchful eye of a more knowledgeable partner. They are there to step in and nudge the novice in the right direction. 

Figure 1. Vygotsky's three zones, with the middle as the "zone of proximal development."


Struggle, then, is maximally helpful for several reasons. 

It allows students to appreciate the context of the lesson. If you just give a lecture on solving systems of equations, then the student may not have any appreciation for why systems of equations are a powerful problem-solving technique. However, if you first let them try to figure out how to calculate the weight of the chicken, fox, and wolf, then they might see the utility of systems of equations.

If a student is lacking a key piece of information, then struggling to solve a problem may expose a gap in their knowledge. An impasse in problem solving might force a student to confront the possibility that there is something wrong with their understanding [3]. 

There may be some small amount of discovery involved. 
For example, there are (at least) three key insights when solving this animal weight problem: using variables (x, y, & z) instead of animals, isolating a variable for each of the three known weights, and finally substituting the isolated variables into the other equations. Having a key insight or making a discovery is highly motivating.
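Those insights can be sketched in a few lines of Python. The pair weights below are invented for illustration (the original puzzle's numbers aren't reproduced here), and this sketch uses a summing shortcut rather than step-by-step substitution:

```python
# Toy version of the animal-weight puzzle: we know the combined weight
# of each PAIR of animals and want each individual weight.
# NOTE: these pair weights are hypothetical, not from the actual puzzle.
chicken_fox = 10   # x + y
fox_wolf = 20      # y + z
chicken_wolf = 24  # x + z

# Summing all three equations counts each animal exactly twice,
# so half of the grand total equals x + y + z.
total = (chicken_fox + fox_wolf + chicken_wolf) / 2

# Subtracting a pair's weight from the total isolates the third animal.
chicken = total - fox_wolf      # x = (x + y + z) - (y + z)
fox = total - chicken_wolf      # y = (x + y + z) - (x + z)
wolf = total - chicken_fox      # z = (x + y + z) - (x + y)

print(chicken, fox, wolf)  # 7.0 3.0 17.0
```

Each isolated variable "checks out" when substituted back into the original pair equations, which is exactly the confirmation step a student would do by hand.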

That brings us to the final reason. Struggle is useful because it necessarily engages a student's conceptual understanding. For any of the three key insights, there is a rich conversation that can connect back to knowledge that a student already possesses (e.g., the concept of a "variable," isolating a variable, variable substitution, and mathematical equivalence). 

The Classroom Connection

Struggling doesn't have to be fraught with negative emotions. In some contexts, struggling is actually kind of fun. Think about the last video game you played. Games are specifically designed to cause you to struggle. In fact, there is some research to suggest that players actually enjoy dying (the ultimate failure!) in first-person shooter games more than shooting other players [4]. Another example is a well-written mystery. You may struggle to figure out "who dun it," but it is an entirely enjoyable experience. It would be beneficial to everyone if academic tasks that cause us to struggle were structured in a way that is more like a game, puzzle, or mystery. 

Perhaps our struggle, as educators and instructional designers, is to figure out how to make struggling an enjoyable educational experience! 🧩


Share and Enjoy!

Dr. Bob

Going Beyond the Information Given

[1] I adapted this problem from Sara Van Der Werf's blog, and you can follow her on Twitter @saravdwerf. This might also be a good time to link back to our prior conversation about problem isomorphs.

[2] Hiebert, J., & Grouws, D. A. (2007). The effects of classroom mathematics teaching on students’ learning. Second handbook of research on mathematics teaching and learning, 371-404.

[3] VanLehn, K. (1988). Toward a theory of impasse-driven learning. In Learning issues for intelligent tutoring systems (pp. 19-41). Springer, New York, NY.

[4] Ravaja, N., Turpeinen, M., Saari, T., Puttonen, S., & Keltikangas-Järvinen, L. (2008). The psychophysiology of James Bond: Phasic emotional responses to violent video game events. Emotion, 8(1), 114-120. https://doi.org/10.1037/1528-3542.8.1.114

Wednesday, March 24, 2021

The Mind’s CEO: Executive Function


Executive Function

Learning By Doing

Let's play a game. It's super fun...I promise! Download and print this file. Your goal is to cross out all of the lowercase d's with two dots above them. Try to be as fast and accurate as possible. Don't forget to time yourself. Ready? Go! [1]
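To get a feel for what the game is doing computationally, here is a toy text version of it. The stimuli below are invented for illustration (the real sheet is visual, of course):

```python
# Each stimulus is (letter, dots_above, dots_below).
# Target: a "d" with exactly two dots above it and none below.
stimuli = [
    ("d", 2, 0),  # target
    ("d", 1, 0),  # distractor: only one dot above
    ("b", 2, 0),  # distractor: wrong letter
    ("d", 0, 2),  # distractor: dots below, not above
    ("d", 2, 0),  # target
]

# "Crossing out" is just filtering for exact matches; the hard part for
# a human is suppressing responses to the near-miss distractors.
targets = [s for s in stimuli if s == ("d", 2, 0)]
print(len(targets))  # 2
```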

Back to the Front

Stop me if you've heard this one. The left hemisphere of your brain is responsible for logical processing; the right hemisphere is designed for creative and holistic thinking. While there may be a tiny grain of truth to these over-generalizations, there is a much less talked about difference in brain functioning. As you go from the back of the brain to the front, thinking goes from extremely concrete to highly abstract. 

That's right! The very back of your brain is reserved for visual processing and low-level muscle control. But as you move forward, toward your forehead and eyes, thinking becomes much more complex. This is the location of higher-order thinking skills such as planning, organizing, and problem solving. This area of the brain is called the prefrontal cortex. This is where you will find executive functioning.

Executive function includes several different cognitive processes. They include, but are not limited to, working memory, response suppression, and attentional focus. 

Working Memory

Baddeley's model of working memory features three components (see Fig. 1). There are two slave systems — the visuospatial sketchpad and the articulatory loop — and a central executive. The central executive controls the operation of the slave systems. It can store and retrieve information from each slave system, and it can also re-represent the same piece of information in different forms (e.g., translating a piece of an image into a word, or vice versa). 

In other words, the central executive must make decisions about the relevance of information and how best to represent it. It must also decide which information needs to be refreshed and maintained in working memory and which information can be safely discarded.

Figure 1. Baddeley's model of working memory.

Response Suppression

We all know how difficult it is for some people to suppress the urge to respond in certain situations. Below are several examples of response suppression, categorized by the domains in which they were found:

Popular Culture: Response suppression failure has made its way into movies (e.g., Roger Rabbit cannot contain himself when faced with the old "shave and a haircut" trick) and games. In a previous post on Ironic Processing, we talked about the frustration inherent in the game of Taboo.

Psychology: We also see examples of response suppression in the materials used in cognitive psychology. The Stroop task is a classic example because, in one variant of the task (i.e., when the color of ink and word conflict), you must suppress the urge to read the word and name the color of ink. 

Neuroscience: The frontal lobe (i.e., the seat of executive functioning) is responsible for response suppression. There is a really interesting example from neuroscience where Phineas Gage had his frontal lobe damaged. After his accident, he became something of a jerk. His behavior strongly suggested that he could not suppress his urges.

Classroom: Response suppression in the classroom is very real, and it can take on many different forms. Behaviorally, little kids (eventually) learn that they must raise their hand before blurting out the answer to the teacher's question. 

A more cognitive example can be found in the world's shortest IQ test:
  1. A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost? _____ cents
When confronted with a question like this, you may feel the need to suppress your intuitive answer (10 cents) and apply your knowledge of algebra to determine the answer ((b+100) + b = 110). In this case, the fast answer isn't the right answer [2].
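The algebraic route can be worked out in a few lines. This sketch just formalizes the equation above (all amounts in cents):

```python
# ball = b, bat = b + 100, and together they cost 110:
#   (b + 100) + b = 110  ->  2b = 10  ->  b = 5
total = 110       # total cost in cents
difference = 100  # the bat costs this much more than the ball

ball = (total - difference) // 2
bat = ball + difference
print(ball, bat)  # 5 105

# The intuitive answer (10 cents) fails the check:
intuitive_ball = 10
print(intuitive_ball + (intuitive_ball + difference))  # 120, not 110
```

Running the check on the intuitive answer is exactly the "suppress and verify" step that the fast response skips.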

Attentional Focus 

In a previous post, we talked about the "myth of multitasking." Most people think they can do multiple things at once, but there are severe limitations. You might be able to walk, talk, and chew gum, but you won't be able to listen to a lecture, deeply process the contents, and simultaneously take notes. People are serial processors (as opposed to "parallel processors"). A useful metaphor for attention is that it is a spotlight, and it can only shine on one thing at a time. 

As serial processors, we need to make decisions about which stimuli to pay attention to. This is where executive functioning comes into play. When an alert goes off on our phone, we have to decide to pay attention to it (or not). Unfortunately, that "decision" isn't really a decision anymore. Over time, we become conditioned to immediately abandon what we were thinking about and look at our phone. In other words, we have trained our attentional system to give our phone primary status. Ideally, we would structure our learning environment so that it removes unnecessary distractions. Keeping cellphones on "Do not disturb" mode and out of view is the best way to prevent our attention from being captured.

Attentional distraction can also be internally generated. For example, if you are a minority, and you are reminded of your minority status, perhaps because of an off-hand comment or some other feature of the environment (e.g., you are the only member of your group present), then those distracting thoughts can pull attention away from the task at hand. This phenomenon is called stereotype threat, and it has pernicious effects on performance [1].

The final example of attentional focus is on information within a task. Suppose you are asked to solve the following problem: 
Derek has 4 action-adventure video games and 9 board games. Desi has 3 role-playing video games and 2 lawn games. If they combine their games, how many video games do Derek and Desi have? 
Notice that this problem has some very tempting, but completely irrelevant, information. One skill that students need to learn is to ignore the distracting information as they solve the problem. Some teachers might recommend highlighting the relevant information (or crossing out the irrelevant information). The goal is to help the attentional system stay focused on the relevant bits.

The Classroom Connection

How can we structure the classroom environment to support the development of executive functioning? Here are a few recommendations:  

  1. We should strive to limit the number of distractions in the classroom. Put smartphones away and out of sight. 
  2. If the mode of instruction is primarily a lecture, then tell your students not to take notes during the lecture [3]. Instead, ask that they listen to what you are saying. After class is over, students should then be given a chance to write down everything they remember. I assume this is a controversial recommendation, so expect students to push back.
  3. Response suppression and attentional focus are both skills that can be learned. One way to develop these skills is mindfulness training, which is starting to gain some empirical support [4]. 

In summary, executive function is a critical component to higher-order thinking and reasoning. In other words, it definitely deserves the corner office! 


Share and Enjoy!

Dr. Bob

Going Beyond the Information Given

[1] This "game" is a test of executive function because you have to hold in working memory the symbol-to-match. The stimuli were created to be highly confusable; therefore, you must suppress certain responses (e.g., the "d" with only a single dot above it, or a "d" with two dots below it). I heavily borrowed the design from the "d2" test of executive function: Lyons, E. M., Simms, N., Begolli, K. N., & Richland, L. E. (2018). Stereotype threat effects on learning from a cognitively demanding mathematics lesson. Cognitive science, 42(2), 678-690.

[2] The "intuitive" versus "algebraic" answer is a good example of the distinction Daniel Kahneman makes in his book Thinking, Fast and Slow. 

[3] I was fortunate enough to take a course from Herbert A. Simon. He didn't let us take notes during his lectures precisely because we are serial processors. In other words, he applied the findings from cognitive science (a field he helped start!) to his own class. 

[4] Bellinger, D. B., DeCaro, M. S., & Ralston, P. A. (2015). Mindfulness, anxiety, and high-stakes mathematics performance in the laboratory and classroom. Consciousness and cognition, 37, 123-132.

Friday, November 20, 2020

What a Load: Cognitive Load

 

Learning By Doing

Before we dive in, let's do a couple of math problems. Take a moment to compute the sum of the following number sentence: 

34 + 66 = ?

Ok, not too bad, right? I intentionally picked some numbers that are fairly "nice." Let's try another one: 

34 * 66 = ?

Same numbers, different operator. Also, much harder, right? Why is the second problem more difficult than the first? If you were an instructional designer, what would you do to help support a student who is learning multicolumn addition and multiplication for the first time?

"I'm carrying quite a load here." —Marge Gunderson, Fargo (1996)

The obvious answer to the question, Why is the second problem more difficult than the first?, is that the cognitive load is higher for the multiplication problem. Let's take a moment to model the cognitive operations as they are applied to each digit, and track the numbers as they enter (or leave) working memory. 

For the addition problem, the first thing we should ask ourselves is, Is this the right representation? The problem is stated as a linear number sentence: 34 + 66 = ?. But is that the easiest way to represent the problem? Perhaps it is easier to mentally transpose the numbers so they are stacked, with the place values aligned, like this: 

  66
+ 34
  ??

Now I can mentally run the addition algorithm.
  1. To start, I have two items in working memory (WM: 66, 34). 
  2. I focus my attention on the ones place and recall the sum of 6+4. Now I have to add a new item to working memory, which is "10." Unfortunately, I can't think of it as a single item because I need to put zero in the ones place value and carry the "1" to the tens column. That brings our working memory total to 4 items (WM: 66, 34, 0, 1). 
  3. Now I focus my attention on the tens column. I need to compute the sum of 1+6+3, which is "10." Now I have a new item, which brings my total up to five items (WM: 66, 34, 0, 1, 10). 
  4. I can probably drop the "1" from working memory because I already processed it; however, I do need to assemble the sum by putting 10 in front of my zero in the ones column (WM: 66, 34, 0, 10). 
  5. Now I have my answer, 100. All of the items can now be expunged from working memory. 
Suppose, instead, I decompose the digits so they are "60 + 6" and "30 + 4." The trade-off is that I now have 4 items in working memory to start; however, maybe that trade-off is worth it. 
  1. I start by decomposing the digits (WM: 60, 6, 30, 4).
  2. If I add from left to right: 60 + 30 is 90 (WM: 60, 6, 30, 4, 90). 
  3. Since I computed the sum, I can drop 60 and 30 from working memory (WM: 6, 4, 90). 
  4. Once I add 6 + 4, and get 10, I can drop 6 and 4  (WM: 90, 10). 
  5. Now I am down to two items. I add 90 + 10 and get my final answer. 
I modeled the addition problem twice to demonstrate that cognitive load depends on how you represent the problem. Both methods hit a peak of 5 items. However, the second method dropped down to 3 and 2 items very quickly, whereas the first method had to carry 4 or more items for a longer duration.
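The two traces can be compared directly by counting the items held at each step. The snapshots below simply re-encode the step-by-step lists above (a rough model; "items" are treated as interchangeable tokens):

```python
# Working-memory contents after each step of the two strategies.
standard = [
    {"66", "34"},                  # step 1
    {"66", "34", "0", "1"},        # step 2
    {"66", "34", "0", "1", "10"},  # step 3
    {"66", "34", "0", "10"},       # step 4
]
decomposed = [
    {"60", "6", "30", "4"},        # step 1
    {"60", "6", "30", "4", "90"},  # step 2
    {"6", "4", "90"},              # step 3
    {"90", "10"},                  # step 4
]

def peak(trace):
    """Largest number of items held at any step."""
    return max(len(step) for step in trace)

def average(trace):
    """Mean number of items held across steps."""
    return sum(len(step) for step in trace) / len(trace)

print(peak(standard), peak(decomposed))        # 5 5
print(average(standard), average(decomposed))  # 3.75 3.5
```

Both strategies peak at 5 items, but the decomposition strategy carries a lighter load on average, which is the point of the comparison.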

If we conduct the same cognitive task analysis for the multiplication problem, we will find that the number of digits in working memory spikes at 14 or 15 items (depending on how you solve it). Since the limit of working memory is only 7±2 items, we are well beyond what most of us can carry around in our heads. 

You can almost feel the weight of the extra digits as you try to track all of the partial products. That extra weight you feel is the very essence of cognitive load.

Trading Cognition for Perception

This might be difficult, but imagine the point in your life when you did not know how to add. Your teacher had to help you at first, and then slowly withdrew their support as you progressed. It's likely your first experience with addition involved working with objects and/or your fingers. An adult might ask, "What is 2 plus 3?" To answer that, you hold up two fingers, and then count out three more. Once you've counted out three fingers, you start over and count the total number of fingers. This is a very early strategy that kids use.

Over the course of your childhood, you may encounter the problem "2 + 3" hundreds, maybe even thousands, of times. With that much practice, you soon discard your counting strategy and commit the chunk "2 + 3 = 5" to long-term memory. Now, when you encounter the stimulus "2 + 3," you don't need to compute anything. Instead, it becomes a recognition task. 

In other words, you trade cognition (i.e., computation) for perception (i.e., recognition). Repeatedly solving the same problem, until it becomes routine, also goes by the name automaticity.
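That trade can be sketched with a simple cache standing in for long-term memory (a toy model for illustration, not a claim about how memory is actually implemented):

```python
fact_table = {}  # stands in for long-term memory

def add(a, b):
    """First encounter: compute by counting. Afterward: recognize."""
    key = (a, b)
    if key in fact_table:
        return fact_table[key], "retrieved"  # perception: recognition
    total = a
    for _ in range(b):                       # cognition: counting up
        total += 1
    fact_table[key] = total                  # commit the chunk
    return total, "computed"

print(add(2, 3))  # (5, 'computed')  -- the slow counting strategy
print(add(2, 3))  # (5, 'retrieved') -- automaticity: pure recognition
```

After enough encounters, every lookup takes the fast path, freeing the "workspace" for harder parts of a problem.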

The S.T.E.M. Connection

What does this mean for education? We want our students to convert extremely basic symbols into larger and more complex chunks of information. For example, we want our geometry students to see the formula C = 2πr, not as an equation with five separate symbols. Instead, we want them to see the whole formula as a single chunk. 

Why is that important? It's important because larger, more complex chunks mean that working memory has more space for processing and computation. When various symbols, such as "2 + 3," are encoded as a single item, then working memory load decreases. If your student sees "5" instead of taking the time to work out the sum, then that student has the mental space to process more complex ideas. 

Cognitive load is not limited to instructional materials. For instance, if students are thinking about their Instagram feeds, or are worried that they are going to fail an exam, then all of these intrusive thoughts occupy working memory. Those thoughts add to the students' cognitive load. The space in working memory is limited, which means intrusive thoughts compete for the space needed to actually solve problems, follow a logical progression of ideas, or recall items from long-term memory [1].

Our goal, as educators, is threefold. We want to: 

1) supply our students with representations that are conducive to the task at hand; 
2) help our students create higher-order chunks that are stored in long-term memory; 
3) and, reduce unwanted, negative, or intrusive thoughts that compete for space in working memory. 

We each carry a different load. Let's ensure it is a manageable cognitive load!


Share and Enjoy!

Dr. Bob

Going Beyond the Information Given

[1] Spencer, S. J., Steele, C. M., & Quinn, D. M. (1999). Stereotype threat and women's math performance. Journal of Experimental Social Psychology, 35(1), 4-28.

Thursday, January 18, 2018

The Stay Puft Marshmallow Man Paradox: Ironic Processing

Learning By Doing

Let's play a simple game. There's only one rule: Don't think about white bears. I will give you a minute.

Okay, so how did you do? Did you think about white bears? If you didn't think about white bears, what did you think about instead? What was your strategy? Maybe you should teach your strategy to Dr. Ray Stantz from the movie Ghostbusters (1984).

"I couldn't help it. It just popped in there." --Dr. Ray Stantz

Why, oh why, did Dr. Raymond Stantz conjure up the Stay Puft Marshmallow Man? All he had to do was clear his mind! WHY!? It's easy. He fell victim to a rather pernicious feature of the human mind. Sometimes, when you actively try to suppress thinking about something, your mind goes ahead and thinks about it. If you've ever had the experience of not being able to stop laughing in church, then you've experienced ironic processing. Ironic processing is the cognitive phenomenon of your mind betraying you and doing exactly the opposite of what you tell it. 

The game company Hasbro cleverly figured out a way to monetize ironic processing. They designed a board game aptly called Taboo. If you haven't had the frustrating experience of playing this game, the rules are as follows. You are given a word, and your goal is to get your partner to say that word. But here's the catch. You aren't allowed to use certain words as clues. For example, suppose I want you to say the word "Sweet." I am not allowed to use the words: Sugary, Tea, Nice, Sour, Sixteen. How evil is that? I'm terrible at this game because as soon as I read the list of verboten words, I immediately want to say them. Why? Ironic processing.


"Isn't it ironic...dontcha think?" --Alanis Morissette 

So what is going on? Why doesn't your brain do what it's told? According to one theory, the mind draws upon two separate processes to direct our behavior [1]. The first is an action-oriented process. It has a goal, and it motivates us to take steps toward that goal. Let's call this the "Operate" process. The second process needs to evaluate whether the goal has been achieved. Let's call this the "Test" process. The Operate and Test processes work in tandem to achieve a goal.

The problem arises when the Test process is checking Operate's progress before the Operate process has completely finished. In other words, Test is the annoying kid in the back seat asking, "Are we there yet?" Thus, if you are actively trying to suppress a thought, and the Test process kicks in to evaluate, then it ends up causing a violation of the thought suppression. By testing if you aren't thinking about white bears, you are now in violation of the rule. The Test process puts the "irony" in ironic processing.
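Here is a toy simulation of those two processes (my own sketch for illustration, not a model taken from Wegner's paper):

```python
forbidden = "white bears"
mind = []  # the current contents of consciousness

def operate(distractor):
    """The Operate process: effortfully think about something else."""
    mind.append(distractor)

def check():
    """The Test process: 'Am I thinking about white bears?'
    Representing the forbidden thought IS thinking it -- the irony."""
    mind.append(forbidden)
    return forbidden in mind

operate("pink flamingos")
operate("my grocery list")
violated = check()
print(violated)  # True: the check itself caused the violation
```

Suppression works only as long as the Test process never runs; the moment it asks the question, the forbidden content is back in the workspace.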


The S.T.E.M. Connection

What is the connection to education? Put yourself in the shoes of a student who has test anxiety. It might be tempting to advise that student to not think about his or her anxiety. You could tell him or her to avoid negative thoughts about failure or the implications of failing. I'm sure you can see the problem with that advice. It would be analogous to telling the student not to think about "white bears." The problem is, if the student is thinking about failing, then they clearly aren't thinking about the material on the test. As we have seen in previous posts, working memory and attention are severely limited resources. If they are focused on the wrong information, then there will be fewer resources available to do well on the test.

What advice should we give instead? We might take a cue from high-pressure sports, where athletes face negative thoughts (e.g., "Don't screw up. Don't screw up."). The advice for them is to focus on something (e.g., a word or concept) that is related to the task at hand [2]. In other words, if the student is worried about failing, then give that student something to think about instead.

It's not easy to do, obviously. But knowing how ironic processing works might help students understand how their mind betrays them. More importantly, knowing about the Operate and Test processes might also help students formulate their own strategies for handling situations when processing turns ironic. Once they have those strategies in hand, perhaps they can teach them to Dr. Stantz so he doesn't "accidentally" destroy downtown New York.


Share and Enjoy! 

Dr. Bob


Going Beyond the Information Given

[1] Wegner, D. M. (1994). Ironic processes of mental control. Psychological Review, 101(1), 34.

[2] Dugdale, J. R., & Eklund, R. C. (2002). Do not pay any attention to the umpires: Thought suppression and task-relevant focusing strategies. Journal of Sport and Exercise Psychology, 24(3), 306-319.

Thursday, September 17, 2015

I Work Out!: Brain Training

Editorial Note: I'm really excited about this week's topic because we are going to hear from a very good friend of mine, Dr. Jason Chein. In today's post, our guest writer is going to discuss a highly controversial and extremely interesting topic: brain training. Dr. Chein has conducted research in this area [1-3], which is why I'm so excited that he agreed to write this week's post. Take it away, Jason!

Is it time to hit the gym…for your brain? 

With so many advertisements and pop-culture books claiming that you can achieve a “smarter you” in just a few minutes a day of “brain training,” you might be thinking about hitting the cognitive gym. But don’t start strapping on your brain workout gear just yet. In today’s post I’ll take you through a brief history of some brain training research, and tell you about where the field stands today. (Hint: it’s not ready for primetime.)

If you were going by what had been the conventional wisdom in experimental psychology for the last several decades, then brain training — engaging in regular mental exercises that are intended to enhance your general cognitive functioning — would seem like a pretty silly idea. Just about all of the research from the mid-1960’s up through the turn of the millennium indicated that, while you could get incredibly good at just about any task (even really demanding ones) with enough practice, the benefits would be observed only for that specific task, and wouldn't transfer to other mentally challenging activities. Take for example the seminal work of Chase and Simon (1973) exploring the amazing memory of chess experts [4]. With just a few seconds to glance at the arrangement of pieces on the chessboard, advanced chess players can reconstruct the position of nearly every piece. Pretty impressive stuff! But, that’s true only if the pieces are in positions that “make sense” in the course of actual game play. If you change things up so that the pieces are placed randomly on the board (not in positions that would occur in a real game), then the experts’ memory drops to near novice levels (see for yourself in this video posted by psychologist Daniel Simons). And, it turns out that playing all that chess doesn't make someone generally smarter than others, or any better at problem solving in other situations. All those thousands of hours of practice and all it’s good for is beating someone at chess? Yep.


Core Strength

In the ensuing years, many psychologists have tried to find a mentally engaging activity that would leave a bigger footprint on the landscape of cognitive functioning, but time and time again the results suggested that practice with a given skill just doesn't transfer to other skills. So, you'd think everyone would have given up on the idea of brain training long ago (and many had). But in the early 2000s a new(ish) idea started to gain some traction. What if, just as performance with many physical activities can be enhanced by focusing exercises on "core" musculature, intellectual functioning could be generally improved by focusing mental exercises on "core" cognitive systems? Makes sense, right? And, based on a large body of prior behavioral experiments, and corroborating neuroimaging studies, researchers had a pretty good idea what some of those “core” cognitive abilities might be. One that seemed especially promising was working memory, the topic of an earlier post from Dr. Bob. Working memory is supposed to serve as a general workspace for the mind, and a slew of studies show that individual differences in working memory capacity can explain why some people excel while others lag behind on a very wide range of cognitively demanding tasks. If the capacity of this general mental workspace could somehow be expanded, perhaps through repeated exercises that target working memory, this could have a profound impact on overall intellectual functioning!

With this basic idea in mind, a few pioneering researchers decided to throw caution to the wind and to try their hand once again at the brain training enterprise. And, to many scientists' great surprise (especially those who were pretty settled on the conclusion that practice just doesn’t transfer), the early results looked really promising. First came a pair of studies showing that training focused on working memory and other executive processes was effective in improving cognitive performance among kids diagnosed with ADHD, and, it turned out, even among the healthy kids and college students who had been included as the comparison groups in those studies [5, 6]. Those exciting early results inspired another study [7] that really captured the imagination of the field, showing that scores on a test of general fluid intelligence (the closest thing we have to an index of someone’s general intellectual ability) were improved by working memory training, and in a dose-dependent fashion (more training = more improvement). At the time that paper was published, I was myself already engaged in another working memory training study [1], which ultimately showed that a month of training could enhance both attention control and reading comprehension in college students (we looked, but didn’t find any evidence of improved fluid intelligence in this group).


Drinking From the Firehose

What started as a trickle of papers on working memory training soon turned into a deluge. Study after study seemed to be finding the same basic thing: that mental exercises targeting core functions of the mind (not just working memory, but also other “executive” and attentional functions) could produce meaningful transfer to important intellectual abilities. Yay, brain training works!…right?

Well, that depends on who you ask and what you mean by “works.” This is where the story gets interesting (and complicated, but don’t worry, I’ll keep it simple). After some of the initial excitement wore off, reports of failed replication attempts and null results (studies showing no benefits of training) started to come in. Others trying to reproduce the most impressive findings, like the gains in fluid intelligence and improvements in ADHD symptoms, weren’t always meeting with as much success. It seemed like the field was dividing into camps: let’s call them the ‘believers’ and the ‘doubters’. The doubters were understandably worried about failed replications, and raised some really important concerns about the methods used in earlier studies (like whether the groups that completed training and those that didn’t just had different expectations about how they should perform, similar to the placebo effect that can arise in drug studies). The believers kept at it, improved their studies to address the doubters’ concerns, and, even with these more careful measures in place (e.g., better control groups), many of their studies continued to produce exciting results.

So, which camp is right, the believers or the doubters? In situations like this we need to take a step back and look at the overall pattern and weight of the evidence. One way to do that is through meta-analysis – pull all of the relevant studies together, account for the size of each study's sample (how many people participated) by giving more weight to larger studies, and then look to see where the "truth" lies. But here too the doubters and believers come to different conclusions. That's because the answer you get depends on which specific studies you think should count, which methods you use to gauge the size of the training effect produced by each study, and most importantly, which behavioral outcomes you decide to focus on. There isn't much debate about the benefits of training on tasks that are really similar to those that made up the training regime (we call these "near transfer" measures). In general, training does seem to improve performance on closely related tasks. So, if by "works" you mean "makes you better able to remember lists of things" (and indeed, that might be an important skill in some scenarios), then yes, it looks like training works. But does it boost your IQ, sharpen your attention, and improve your overall cognitive acumen (does it lead to "far transfer")? I'd say we just don't know yet. While there is some evidence that it can do these things, the overall body of evidence isn't unequivocally favorable. But on the flip side, there also isn't enough evidence that it doesn't work (getting a little technical here, a Bayes factor analysis suggests that there is neither enough evidence to accept the claim nor to reject it). So pick your favorite metaphor – the jury is still out, the dust hasn't settled, the waters are still too muddy – and maybe wait until the next New Year before you make a brain training resolution.
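The weighting idea described above – give bigger studies more say – is the heart of a fixed-effect meta-analysis. Here's a minimal sketch in Python; the effect sizes and variances are made-up numbers for illustration, not values from the actual training literature.

```python
# Minimal sketch of a fixed-effect meta-analysis using inverse-variance
# weights. The study numbers below are hypothetical, for illustration only.

def fixed_effect_summary(effects, variances):
    """Combine per-study effect sizes, weighting each by the inverse of
    its sampling variance (larger studies -> smaller variance -> more weight)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled

# Three hypothetical studies: a small one with a big effect,
# and two larger ones with small effects.
effects = [0.45, 0.10, 0.05]
variances = [0.20, 0.05, 0.02]   # smaller variance = bigger sample

print(round(fixed_effect_summary(effects, variances), 3))
```

Notice how the pooled estimate lands much closer to the large studies' small effects than to the small study's impressive one – which is exactly why the choice of which studies "count" matters so much.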


About the Author

Dr. Jason M. Chein is currently a faculty member at Temple University, where he is the principal investigator of the Neurocognition Lab. I met Jason in 1998 when we were both graduate students at the Learning Research and Development Center. While in grad school, Jason became an expert in cognitive neuroscience, which included learning cool methodologies like conducting studies using fMRI. While in grad school, Jason also became quite proficient at frisbee golf.


For More Information

[1] Chein, J., & Morrison, A. (2010). Expanding the mind's workspace: Training and transfer effects with a complex working memory span task. Psychonomic Bulletin & Review, 17(2), 193–199.

[2] Morrison, A. B., & Chein, J. M. (2011). Does working memory training work? The promise and challenges of enhancing cognition by training working memory. Psychonomic Bulletin & Review, 18(1), 46–60.

[3] Morrison, A. B., & Chein, J. M. (2012). The controversy over CogmedJournal of Applied Research in memory and Cognition, 1(3), 208-210.

[4] Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4(1), 55–81.

[5] Klingberg, T., Forssberg, H., & Westerberg, H. (2002). Training of working memory in children with ADHD. Journal of Clinical and Experimental Neuropsychology, 24(6), 781–791.

[6] Klingberg, T., Fernell, E., Olesen, P. J., Johnson, M., Gustafsson, P., Dahlström, K., Gillberg, C. G., Forssberg, H., & Westerberg, H. (2005). Computerized training of working memory in children with ADHD: A randomized, controlled trial. Journal of the American Academy of Child & Adolescent Psychiatry, 44(2), 177–186.

[7] Jaeggi, S. M., Buschkuehl, M., Jonides, J., & Perrig, W. J. (2008). Improving fluid intelligence with training on working memory. Proceedings of the National Academy of Sciences, 105(19), 6829–6833.

Thursday, September 3, 2015

They Call Me the Working Man: Working Memory (Part 2)

Editorial Note: This is a continuation of a previous post where we discussed a model of a short-term memory buffer we called "working memory." In this post, we explore the underlying mechanisms for how working-memory capacity can change over the course of a lifetime.
Here is a fun game called the dual n-back task. For the first task, you have to remember the spatial location of a series of squares. For the second task, you need to hold a few numbers in working memory. It becomes a dual task when you combine the tasks and do them at the same time. Sounds hard, right? It is. Try it for yourself.
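For the curious, the core logic of the n-back task is easy to state in code. This is just an illustrative sketch (the helper name and the example stream are my own inventions): it marks the trials where a player should respond because the current item matches the one from n trials back.

```python
# Sketch of n-back scoring: a "target" trial is one where the current
# item repeats the item presented n trials earlier.

def n_back_targets(stream, n=2):
    """Return the indices of target trials (the moments a player
    should respond) in a 2-back (or general n-back) stream."""
    return [i for i in range(n, len(stream))
            if stream[i] == stream[i - n]]

# Example stream of square positions (cells 1-9 on a 3x3 grid).
positions = [3, 1, 3, 7, 3, 7, 2]
print(n_back_targets(positions, n=2))   # trials 2, 4, and 5 are targets
```

Doing this check continuously in your head – while simultaneously running a second check on a stream of numbers – is what makes the dual version so taxing.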

Working Memory Expansion Pack: Adding Capacity


How did you do? Did you feel like your working memory was being taxed as you added the second task? What if you tried the n-back task when you were a kid? Would your adult self beat your younger self? In other words, does working-memory capacity change as we grow older?

There are roughly three different influences that can change working-memory capacity: neural development, knowledge, and recall strategies.
  

Neural Development. At age five, children can remember about four random digits or letters. By the time they are 20, they can remember upwards of seven or eight arbitrary digits or letters. It seems, therefore, that our brain adds capacity as it naturally develops.

Knowledge. Age, however, is hopelessly confounded with knowledge and experience. As we grow older, our brain undergoes massive changes and we file away volumes of new experiences and skills. So what would happen if we could somehow dissociate age and knowledge/experience? To do so, we would need to find areas in which kids, despite their young age, know more or have more experience than adults.

As it turns out, there are indeed areas in which kids know more than adults. Chess is just such a domain, as it is easy to find kids who have vastly more chess-related knowledge than adults. What if we pitted kids who are young, but highly knowledgeable about chess, against adults who are older, but less experienced when it comes to chess? Who would be able to remember more positions of chess pieces on a chess board? Who would be able to remember more numbers from a list of random digits? It turns out that researchers have investigated this and the results are plotted below [1].





As you can see, children were able to recall about nine chess positions, but only about six numbers. The results for adults were completely flipped. Namely, they remembered fewer chess positions than the children, but recalled more numbers. This suggests that children don't necessarily have a lower working-memory capacity. Instead, it indicates that they have less experience to help structure their recall.

Recall Strategies. You might be skeptical of that last statement. Why would experience increase your capacity to recall a list of digits? A perfect example of how experience can enhance the recall of random digits was demonstrated in a previous post in which we used the years of significant historical events in American history to form four-digit chunks. Also, we encountered a person who underwent deliberate training to increase his working memory capacity to a startling 79 items. Finally, we learned strategies for memorizing arbitrary lists of words, numbers, and phrases. Because adults have more experience temporarily storing information, we have come to develop our own strategies. Kids, on the other hand, have had fewer opportunities to figure out ways to hack their own memories.
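The four-digit "historical years" trick above is easy to see in code. This is just an illustrative sketch (the helper name and the dates are mine): twelve individual digits collapse into three familiar chunks, each occupying a single working-memory "slot."

```python
# Sketch of chunking: re-coding a digit string into larger, meaningful
# units reduces the number of items working memory must hold.

def chunk_digits(digits, size=4):
    """Group a digit string into fixed-size chunks (e.g., 4-digit years)."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

digits = "177618631969"        # 12 separate digits to remember...
print(chunk_digits(digits))    # ...or just 3 familiar years
```

The memory saving comes entirely from prior knowledge: the chunks only help if "1776," "1863," and "1969" already mean something to you.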


The STEM Connection

Working-memory capacity is obviously important for education. Research on this topic suggests a few conclusions. First, individuals have different working-memory capacities. We refer to this as an individual difference (like height or eye color). We also learned that working memory can change as a function of normal development. Some estimate that kids between the ages of 5-7 can remember four words and about the same number of digits. College-age students, on the other hand, can store upwards of six words and approximately eight digits.

We can't do anything to expedite normal development, but we can help students gain familiarity with the symbols and structures of information from a particular domain. That is where the real growth can happen. As we saw in the chess example, if students put in the time, they can expand their working-memory capacity to a point where they can outperform adults.

In conclusion, working-memory capacity seems to change over the course of a lifetime, and there are three potential explanations. First, we add capacity naturally as a consequence of natural brain maturation. Second, we learn new techniques and strategies for remembering, like repeating the same items over and over. Finally, we acquire new knowledge, which we can then use to help structure the to-be-remembered information. Kids who are chess experts can form larger chunks than adults who are novice chess players. Hopefully, this will supply you with the tools you need to go out and get a mental upgrade!


Share and Enjoy!

Dr. Bob

For More Information

[1] Chi, M. T. H. (1978). Knowledge structures and memory development. In R. Siegler (Ed.), Children's thinking: What develops? (pp. 73-96). Hillsdale, NJ: Erlbaum.

Thursday, August 27, 2015

They Call Me the Working Man: Working Memory (Part 1)


Editorial Note: 
For the next two weeks, I want to discuss the distinction between short-term memory and working memory. Once we've sorted out the differences, then we will dive into the connection between working memory and intelligence. First, let's talk about how to model what's going through your mind...right now.


Short-term vs. Long-term Memory

In a previous post, we talked about the distinction between short-term and long-term memory. The evidence for proposing that there are two distinct systems came from a study that demonstrated enhanced memory for items that were early in a list of words, as well as superior recall for items later in the list. To make sense of this type of U-shaped curve, the authors theorized that the items early in the list made it into a permanent memory buffer, whereas the items that occurred later in the list were still hanging around in short-term memory.

In addition to behavioral evidence, there is also neuroscientific evidence for the two memory systems. Using a methodology called a double dissociation, neuroscientists demonstrated that some patients have damage to their long-term memory, but their short-term memory works just fine. The double dissociation was established when they also found patients with the opposite problem. Namely, their long-term memory was intact; however, they had difficulty remembering information for a short period of time.


Working Memory & The Three Sub-components

Although short- versus long-term memory was successful in explaining some of the empirical findings, it became clear that it couldn't explain all of the behavioral results. Here is an example. Consider the following list of words: pit, day, cow, pen, rig. According to the research on the limitations of short-term memory, these five items should fit comfortably in short-term memory. But consider a different list of words: man, cap, can, map, mad. Does it seem harder to remember these words? According to the model of short-term memory, this list should be neither easier nor harder than the previous list of words because, again, there are only five items. How do we reconcile these observations?

Because the concept of "short-term memory" was unable to explain these findings, the concept of a temporary memory buffer had to be extended. To do so, a cognitive scientist named Alan Baddeley proposed a revision to short-term memory that he called working memory [1, 2]. It is similar to short-term memory in the sense that it is a temporary storage facility, but it had to be elaborated to help explain why phonetically similar words, such as cap/map and man/mad, are easy to confuse when trying to remember them. The new model of memory included three distinct sub-components: the central executive, the phonological loop, and the visuo-spatial sketch-pad. To see how these components interact, Baddeley provided the following diagram (see Fig. 1).


Figure 1. A schematic representation of the working memory components.


Central Executive

The first component is called the central executive. It is responsible for focusing your attention on relevant information and for switching attentional focus when needed. In other words, it is the central executive's job to coordinate the flow of information to and from the subsystems to accomplish a task. An example of coordinating information occurs when you are attempting to navigate with a map. You have to hold spatial information from the map in mind while looking up at the real world. The central executive has to synthesize the spatial information from the map with the verbal information located on the street signs.

Phonological Loop

The next component is the articulatory or phonological loop. The best way to visualize the phonological loop is to imagine an extremely short cassette tape. When I say "extremely short," I mean it can only hold about two seconds of audio or phonological information. It's called an "articulatory" mechanism because it replays the audio over and over. This makes intuitive sense because when people have a list of numbers or words to remember for a short period of time, they repeat it to themselves over and over. The purpose of rehearsing the list is to hold that information until it can be recalled, after which time it can be dumped from the phonological loop.

Visuo-Spatial Sketchpad

Finally, the visuo-spatial sketch-pad is meant to track and momentarily retain spatial information. For example, when driving on the highway, it is necessary to keep track of the arrangement of cars behind you so that you don't unintentionally cut someone off when changing lanes. A quick glance in your rearview mirror quickly updates the spatial information found in the visuo-spatial sketch-pad.

"Are you sure we're not getting some interference?"

Occam's razor posits that the simplest explanation is best. Do we really need three different sub-components? In the case of a momentary memory storage, I think it is completely warranted [3]. The concept of working memory, which includes a central executive aided by two sub-systems, can explain behavioral findings that a unitary concept of short-term memory could not. Probably the best example of a finding that working memory can explain, but short-term memory cannot, is the concept of interference

Suppose we play a game similar to the old electronic game Simon. We will play two rounds. In the first round, just play as usual. For the second round, however, you have to repeat the word the. How did you do? If you're like most people, repeating the doesn't really interfere with your ability to play the game because the information is held in a spatial buffer.

However, suppose I ask you to memorize the following list of words, but after you read through the list, you have to repeat the.
  • Butterfly
  • Airport
  • Kitchen
  • Church
  • School
  • Knife
  • Solid
Now how did you do? If you're like me, it is impossibly hard. Why? Because the articulatory loop can't do its job refreshing the contents of the list that you want to remember.
That concludes Part 1 of our discussion of working memory. Check back next week for the link between working memory and intelligence, plus the connection to education!

Share and Enjoy!

Dr. Bob

For More Information

[1] Baddeley, A. D., & Hitch, G. J. (1974). Working memory. The psychology of learning and motivation, 8, 47-89.

[2] Baddeley, A. (2000). The episodic buffer: a new component of working memory? Trends in cognitive sciences, 4(11), 417-423.

[3] There have been further refinements to the model of working memory. For example, Baddeley proposed that an additional component, the episodic buffer, is needed to bind episodic information held in long-term memory to the contents of working memory. Here is a schematic of those components (see Fig. 2).


Figure 2. A further elaboration of the working memory model.

Baddeley, A. (2003). Working memory: Looking back and looking forward. Nature Reviews Neuroscience, 4(10), 829–839.

Thursday, December 18, 2014

Better Than Soup: Chunking

"Sloth love Chunk!" --Sloth


In a previous post, we talked about the severe constraints on working memory. Early estimates of the capacity of working memory started out around seven (plus or minus two) items. That translates into looking up a phone number in the phonebook (remember those?), walking over to the phone, and dialing the number. Seven already seems like a very low number; unfortunately, later estimates put working-memory capacity at around four items. Four items?! That seems crazy low. Fortunately, there is a way to expand your working-memory capacity through a process called chunking.

Does chunking really work? If it does work, what are the limits? How far can we stretch this strategy? 


Does Chunking Work?

How do we know that the brain is able to aggregate or "chunk" information? What is the evidence? To generate some evidence, an interesting study asked three "volunteers" to memorize the positions of chess pieces on a chessboard [1]. The first was a world-renowned chess master. The second was an intermediate player who wasn't anywhere near the ability of the first. The third knew how to play chess, but was not ranked in any official capacity. The scientists showed them configurations of chessboards that were in mid-game. The twist was that some boards came from actual games, while the others had the same number of pieces, but placed randomly across the board. Before I tell you the outcome, what do you think they found?

As you probably guessed, the chess master's memory for the positions of the chess pieces was vastly superior to the intermediate and novice players' memory. What wasn't totally obvious, however, was how well they did relative to each other on the random boards. It turns out that they all performed about equally. This suggests that the chess master wasn't looking at individual pieces on the actual mid-game boards. Instead, he was aggregating the pieces into groups (e.g., a "castling" position). I love this study because it's an elegant demonstration of the process of chunking.


"Take It To the Limit" --The Eagles

The best answer to the question of limits comes from a study that attempted to train someone to expand his working-memory capacity [2]. Going into the experiment, the person selected to endure the rigorous training regimen was a runner. That meant he was well versed in thinking about numbers in terms of running times, so he was able to chunk digits into running times. For example, 4:32:8 is an average time for a men's marathon. The runner worked through many training sessions, adding more and more complex retrieval structures. By the conclusion of the study, the participant was able to correctly recall 79 digits. Impossible!

What does that mean for us ordinary mortals? First, this person wasn't special in any obvious way. That means any one of us could also learn to memorize 79 digits if we were willing to put in the time and effort. Second, what he learned seemed to apply only to digits. In other words, the participant wasn't able to transfer his skill to memorizing state capitals or other forms of information (e.g., letters). Finally, it means that, although we have severe limits on our cognitive capacities, they can be overcome by cognitive strategies and/or good, old-fashioned hard work (i.e., "deliberate practice").


A STEM Example

I'll be honest. When I took Physics in college, it was brutally difficult. Not because of the math (it was a non-calc version); it was hard because it seemed like each new concept arrived out of the blue. Rotational kinematics seemed to have nothing to do with linear kinematics. Sure, the forms of the equations seemed to have something in common, but they were largely taught as disconnected facts.

Fast forward several years to my post-doc. I was blessed to work with a real physicist who pointed out to me that Physics is easy because you only need to know a few "first principles." From there, you can derive many other facts. That hit me like a bolt of lightning. Once someone took the time to sit down with me and demonstrate the interconnections, Physics didn't seem so hard. I don't want to trivialize education, especially for difficult topics, but the whole process can be made simpler (and perhaps more fun?) if the material is presented as a sequence of ever-expanding chunks of information.

Let's take velocity as an example. To build up to this advanced topic, it helps to start with our intuitive understanding of speed. Most of us have ridden in cars and talked about the measurement of speed in terms of "miles per hour." Once that gets translated into a symbolic representation (s = d/t), you can then expand it to include the concept of change (i.e., delta). Now the equation becomes s = Δd/Δt. Not a lot has changed, and that's a good thing because the student needs to see the equation not as something new, but as slightly expanded. Then you can expand the notion of the delta: Δd = d_final - d_initial. Plug this back into the equation, and you get a slightly more detailed expression. Again, each step is small and needs to be seen as a single chunk of information.
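The same build-up can be written as a tiny program, one small chunk per step. This is just an illustrative sketch of the equations in the paragraph above (the function name and the road-trip numbers are my own).

```python
# Building up the velocity formula step by step, mirroring the text:
# speed s = d / t, then s = Δd / Δt, with Δd = d_final - d_initial.

def average_velocity(d_initial, d_final, t_initial, t_final):
    """Average velocity: change in position over change in time."""
    delta_d = d_final - d_initial    # Δd = d_final - d_initial
    delta_t = t_final - t_initial    # Δt = t_final - t_initial
    return delta_d / delta_t         # s = Δd / Δt

# A car moves from mile marker 10 to 130 between hour 1 and hour 3:
print(average_velocity(10, 130, 1, 3))   # 60.0 miles per hour
```

Each line corresponds to one of the small expansions in the paragraph, so a student can map every symbol in the final expression back to a chunk they already own.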

Share and Enjoy! 

Dr. Bob


For More Information

[1] The chess study was conducted by a pair of researchers at Carnegie Mellon University (CMU) in the early 70s. The first author, Bill Chase, was my graduate-student advisor's late husband. I never had a chance to meet him, but he is a legend in the field of cognitive psychology. On the other hand, I did have the good fortune to take a course from the second author, Herb Simon. It was a fascinating course, and he gave probably the hardest final exam I have ever taken in my life. It had a single question: "Describe a computationally plausible model of cognition." We then had about three hours to provide an answer. 

Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55–81.

[2] Training someone to expand his working-memory capacity took 230 hours of practice! His training was conducted by K. Anders Ericsson, who we will hear more about in subsequent posts. The original article can be found here.

Ericsson, K. A., Chase, W. G., & Faloon, S. (1980). Acquisition of a memory skill. Science, 208(4448), 1181–1182.