ON THE SYMBIOSIS OF REMEMBERING,
FORGETTING, AND LEARNING
Robert A. Bjork
It is natural for people to think that learning is a matter of building up skills or knowledge in one's memory, and that forgetting is a matter of losing some of what was built up. From that perspective, learning is a good thing and forgetting is a bad thing. The relationship between learning and forgetting is not, however, so simple, and in certain important respects is quite the opposite: Conditions that produce forgetting often enable additional learning, for example, and learning or recalling some things contributes to the forgetting of other things.
My goal in this chapter is to characterize the interdependencies of remembering, forgetting, and learning: interdependencies that essentially define the unique functional architecture of how humans learn and remember, or fail to learn and remember. After some comments on the importance of forgetting, I discuss how learning and remembering contribute to forgetting; why forgetting enables, rather than undoes, learning; and why the interplay of forgetting, remembering, and learning is adaptive and yet poorly understood by the user.
WHY FORGETTING IS IMPORTANT
One of the "important peculiarities of human memory" that motivated Elizabeth Bjork and me to propose our new theory of disuse framework (R. A. Bjork & Bjork, 1992), to which I refer intermittently in this chapter, is that human memory is characterized by a storage capacity that is essentially unlimited, coupled with a retrieval capacity that is severely limited. At any one point in time, most of the vast amount of information that exists in our memories (names, facts, procedures, numbers, events, and so forth), all of which was recallable at earlier points in time, is not recallable. Even the most overlearned information, such as a combination lock number, or phone number, or street address, which may have been instantly and automatically recallable after a long period of use, becomes nonrecallable after a long enough period of disuse, but remains in memory.
We labeled our framework a "new theory of disuse" to distinguish it from Thorndike's (1914) original "law of disuse," which asserted that memories, without continued use or access, decay from memory. Instead, we argue, as many others have, that although memories become inaccessible without continued use and access, they remain in memory. The theory distinguishes between the retrieval strength of a memory representation (that is, how activated or accessible it is at a given point in time, which is influenced by local conditions, such as recency and current cues) and the storage strength of that representation, which is an index of how entrenched or interassociated that representation is with related representations in memory. Recall is assumed to be entirely determined by current retrieval strength, whereas storage strength is a latent variable that acts to retard the loss (forgetting) and enhance the gain of retrieval strength. Other assumptions of the theory are mentioned below where they are relevant to particular interactions of remembering, forgetting, and learning.
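The storage-strength/retrieval-strength distinction can be illustrated with a toy simulation. The update rules and constants below are hypothetical assumptions of my own, chosen only to mirror the qualitative claims above (recall depends on retrieval strength alone; storage strength retards the loss, and enhances the gain, of retrieval strength); they are not the theory's actual equations.

```python
# Illustrative sketch of the new theory of disuse (R. A. Bjork & Bjork, 1992).
# All update rules and constants are hypothetical, chosen to mirror the
# qualitative assumptions in the text, not the theory's actual equations.

def forget(retrieval, storage, decay=0.5):
    """With disuse, retrieval strength declines; storage strength retards the loss."""
    return retrieval * (1 - decay / (1 + storage))

def study(retrieval, storage, gain=1.0):
    """A study or retrieval event raises both strengths; storage strength enhances
    the gain in retrieval strength, and gains are largest when retrieval strength
    is currently low (the condition for efficient relearning)."""
    new_retrieval = min(retrieval + gain * (1 - retrieval) * (1 + storage) / 2, 1.0)
    new_storage = storage + 0.2 * (1 - retrieval)  # harder retrievals add more storage
    return new_retrieval, new_storage

# Two items, equally recallable now, differing in how entrenched they are.
overlearned = (0.9, 3.0)  # (retrieval strength, storage strength)
new_item = (0.9, 0.3)

for _ in range(10):  # a long period of disuse
    overlearned = (forget(*overlearned), overlearned[1])
    new_item = (forget(*new_item), new_item[1])

# The well-stored item loses access far more slowly, and a single relearning
# trial restores it to a much higher level than the weakly stored item.
relearned_old = study(*overlearned)[0]
relearned_new = study(*new_item)[0]
```

Run under these assumptions, the overlearned item retains far more retrieval strength after disuse and relearns at an accelerated rate, which is the pattern the theory is meant to capture.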
The failure to recall information we know exists in our memories is a major frustration, but were everything in our memories to be recallable, we would suffer greater frustrations. Even recalling one's current phone number, for example, would become a slow and error-prone process if every number one has had across one's lifetime were to come to mind, requiring some kind of decision process to select the current number. As William James (1890) was one of the first to emphasize, "If we remembered everything, we should on most occasions be as ill off as if we remembered nothing" (p. 680).
In short, because we remember so much, we do not want everything in our memories to be accessible. We have a constant burden, for example, to keep our memories current. We need to remember our current phone number, not our prior phone number; we need to remember where we parked the car today, not yesterday or a week ago; we need to remember how some current software or hardware works, not how the prior versions work; and on and on. Such updating, as I and my collaborators have argued in multiple papers over the years (e.g., R. A. Bjork, 1970, 1972, 1978, 1989; E. L. Bjork, Bjork, & Anderson, 1998), requires some mechanism to set aside, inhibit, or erase information that is now out of date and, hence, a source of errors and confusion. Without some such mechanism, I have argued, we would "degenerate to a proactive-interference-induced state of total confusion" (R. A. Bjork, 1972, p. 218).
The mechanism in the case of human memory, in my view, is retrieval inhibition, which I have argued is a broadly adaptive mechanism in human memory (R. A. Bjork, 1989). Without continuing access and use, previously learned information and procedures are not lost from memory, but become inaccessible, except, possibly, when highly distinctive situational, interpersonal, or body state cues associated with a given memory are reinstated. That is, retrieval of the information or procedures in question becomes inhibited, and, as I sketch in the next section, learning and remembering other information and procedures contribute to such inhibition.
HOW LEARNING AND REMEMBERING CONTRIBUTE TO FORGETTING
Why do we forget information that was once recallable? The principal answer to that question, alluded to above, is not that the information, like footprints in the sand, fades away or decays in our memories over time, as was thought to be the case by researchers during the early decades of controlled research on memory (e.g., Thorndike, 1914). The decay idea, which remains compelling to most people based on their introspections, was instantiated in Thorndike's law of disuse, as mentioned above. Thorndike's law, though, came to be thoroughly discredited, starting with a devastating critique by McGeoch (1932). Instead, McGeoch argued, information that has been stored in our long-term memories tends to remain there, but it can become inaccessible (forgotten) owing to one or both of two factors: "reproductive inhibition," which refers to losing access to information in memory by virtue of interference from competing information in memory, and "altered stimulating conditions," which refers to the changing of the retrieval cues that are available to us as we move on with our lives. (For a brief history of research on interference and forgetting, see R. A. Bjork, 2003.)
Learning, therefore, contributes to forgetting. As we learn new information, procedures, and skills, we create the potential for competition with related information, skills, and procedures that already exist in memory. Access to that earlier learning can then be inhibited or blocked by related aspects of the newer, and perhaps more accessible, learning. (Whether the primary mechanism is inhibition or blocking remains a matter of current dispute; see the chapters by Anderson, Bjork, Buzsaki, Hasher, and MacLeod in Roediger, Dudai, & Fitzpatrick, 2007). Such competition, however, goes both ways: Earlier learned information can also block or inhibit access to more recently learned information. That is, to use the jargon of research on interference and forgetting, we are subject to both retroactive interference and proactive interference.
Retrieval as a Memory Modifier
The results of more recent research add to the picture sketched above. The act of retrieving information from our memories does much more than simply reveal that the information in question exists in our memories. In fact, retrieving information modifies our memories: The retrieved information becomes more recallable than it would have been otherwise, and other information in competition with the retrieved information (that is, information associated to the same retrieval cue or set of cues) becomes less accessible. Using our memories, in effect, alters our memories; that is, retrieval is a "memory modifier" (R. A. Bjork, 1975). In our new theory of disuse, Elizabeth Bjork and I concur with Thorndike's assertion that disuse is a key factor in forgetting, not because unused memories decay, but rather because access to those memories becomes inhibited, owing primarily to retrieval of competing memories (R. A. Bjork & Bjork, 1992).
Demonstrations of what might be considered the positive effects of retrieval as a memory modifier (that retrieving information from memory is a powerful learning event) trace back across almost 100 years of the research literature (e.g., R. A. Bjork, 1975, 1988; Carrier & Pashler, 1992; Gates, 1917; Glover, 1989; Hogan & Kintsch, 1971; Izawa, 1970; Landauer & Bjork, 1978; Landauer & Eldridge, 1967; McDaniel & Masson, 1985; Spitzer, 1939; Tulving, 1967; Whitten & Bjork, 1977), and there has recently been renewed interest in such effects, given their pedagogical implications (e.g., Karpicke & Roediger, 2008; Morris & Fritz, 2000; Pashler, Zarow, & Triplett, 2003; Roediger & Karpicke, 2006a, 2006b; Storm, Bjork, & Storm, in press). For present purposes, however, it is the negative effects of retrieval as a memory modifier, termed retrieval-induced forgetting by Anderson, Bjork, and Bjork (1994), that are of interest: that is, the loss of access to information that is in competition with the retrieved information.
It is only from one perspective, however, that retrieval-induced forgetting is a negative effect. From another perspective, retrieval-induced forgetting modifies the accessibility of information in memory in adaptive ways. As we use our memories, we make more accessible the information, procedures, and skills we are using, and we make less accessible competing information, procedures, or skills. The interpretation my collaborators and I have advocated (e.g., Anderson, 2003; Anderson & Bjork, 1994; Anderson et al., 1994; Anderson, Bjork, & Bjork, 2000; Anderson & Spellman, 1995; E. L. Bjork, Bjork, & Anderson, 1998; Storm, Bjork, Bjork, & Nestojko, 2006) is that the act of recalling information from memory requires not only that the information be selected and produced, but also that other information associated to the same cues be selected against and not produced. The information selected against is inhibited, rendering it less accessible should it be the target of recall in the future (for arguments against inhibitory accounts, see MacLeod, Dodd, Sheard, Wilson, & Bibi, 2003; Perfect et al., 2004). How long such inhibitory effects might last is a matter of current research (e.g., MacLeod & Macrae, 2001) and debate, but Storm, Bjork, and Bjork (2008) have obtained evidence that the recall of items repeatedly selected against goes to essentially zero within a single experimental session. Whether repeatedly making items the target of retrieval-induced forgetting across experimental sessions would make those items permanently inaccessible (at least in the absence of very specialized and discriminating retrieval cues) remains to be seen.
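The select-against logic of retrieval practice can be sketched as a toy simulation. The cue-item pairs, strength values, and parameters below are illustrative assumptions of my own, not data or model parameters from the studies cited; the sketch encodes only the qualitative claim that retrieving a target strengthens it while inhibiting unretrieved items that share its retrieval cue.

```python
# Toy sketch of the retrieval-practice paradigm used to study retrieval-induced
# forgetting (Anderson, Bjork, & Bjork, 1994). All items and numbers here are
# illustrative assumptions, not values from the paper.

strengths = {
    ("FRUIT", "orange"): 0.5,  # Rp+: will be practiced
    ("FRUIT", "banana"): 0.5,  # Rp-: unpracticed competitor sharing the cue
    ("DRINK", "vodka"): 0.5,   # Nrp: baseline items from an unpracticed category
    ("DRINK", "scotch"): 0.5,
}

def retrieval_practice(strengths, cue, target, boost=0.3, inhibition=0.2):
    """Retrieving `target` strengthens it and inhibits same-cue competitors."""
    updated = dict(strengths)
    for (c, item), s in strengths.items():
        if c != cue:
            continue  # items under other cues are untouched (Nrp baseline)
        if item == target:
            updated[(c, item)] = min(s + boost, 1.0)       # practiced item gains
        else:
            updated[(c, item)] = max(s - inhibition, 0.0)  # selected against: inhibited
    return updated

strengths = retrieval_practice(strengths, "FRUIT", "orange")
# Qualitative ordering after practice: Rp+ > Nrp > Rp-
```

The point of the sketch is the resulting ordering: the practiced item ends up above baseline, the baseline items are unchanged, and the unpracticed competitor of the practiced item ends up below baseline, which is the signature retrieval-induced forgetting pattern.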
Adaptive Aspects of the Interplay of Forgetting and Remembering
Beyond the general consideration that forgetting is important, given the storage and retrieval characteristics of human memory and the ongoing need we have to keep our memories current, there are other, more specific reasons why the interplay of forgetting and remembering is adaptive. Compared to some kind of system in which out-of-date memories were to be overwritten or erased, for example, having such memories become inaccessible, but remain in storage, has important advantages. Because those memories are inaccessible, they do not interfere with the retrieval of current information and procedures, but because they remain in memory they can, at least under some circumstances, be recognized when presented and, more importantly, be relearned at an accelerated rate, should that be desirable. In fact, some of the findings discussed in the next section suggest that such inhibited memories are uniquely relearnable, especially if they were strongly encoded in memory at some earlier point. Phrased in terms of the assumptions of the new theory of disuse, the largest increments in both storage and retrieval strength occur when the to-be-learned (or relearned) information has low retrieval strength and high storage strength. Thus, some name or number or procedure from one's past, even one's distant past, can be relearned with great efficiency, should it become relevant again.
Another consideration has to do with the statistics of use. Information and procedures we will need in the near future tend to be from the recent past, which is one reason that computer programs and electronic gadgets, such as cell phones, make recently accessed documents, addresses, and numbers more readily accessible than other documents, addresses, and numbers. In the case of human memory, remembering information makes that information more accessible in the near future and any competing information and procedures less accessible.
In that context, another "important peculiarity of human memory" (R. A. Bjork & Bjork, 1992) is relevant: namely, that with disuse of two competing memory representations, access shifts toward the earlier learned representation with time. Such a shift from recency to primacy across a period of disuse is a very general effect, one that occurs on multiple timescales and for many different types of memories. I have speculated elsewhere (R. A. Bjork, 2001) as to the mechanisms that might be responsible for such regression effects, but the important point for present purposes is that such regression effects, from the standpoint of the statistics of use, may also be adaptive. The reason has to do with why, in real-world contexts, people might stop using the most recent of competing representations. Often, those reasons will be accompanied by the earlier learned representation again becoming needed. One of many possible examples might be returning to the United States after a prolonged stay in Great Britain, during which driving a car and staying alive required acquiring a set of perceptual and procedural routines that differ, markedly, from the corresponding routines in the United States. Disuse of the newer, Great Britain-appropriate routines could mean ...