Educational Technology & Society 2(4) 1999
ISSN 1436-4522

Technology and the Biological Basis of Learning

Moderator: Robert N. Leamnson
Professor of Biology, UMass Dartmouth, Massachusetts, USA.

Muhammad K. Betz

Professor, Southeastern Oklahoma State University, USA.

Discussion Schedule
Discussion: 9 - 18 August 99
Summing up: 19 - 20 August 99

Pre-discussion paper

The intent of this discussion is to examine any and all technologies, and all aspects of instructional design, as a function of their effect on the learner considered as a biological entity.

Biologists have a somewhat down-to-earth understanding of the learning process and speak of it in terms of biological processes. As Henry Plotkin said, "When we come to know something, we have performed an act that is as biological as when we digest something." I have prepared, then, a somewhat long background paper on what might be called the biological basis of learning. In it I propose several questions that might be considered, but as always, you are limited only by your imagination.

"Instructional Technologies and the Biological Basis of Learning"

In these opening remarks I will take it as given that all instruction, whether by lecture, book writing, or other technology, has as its purpose that someone learns something. And as we were made aware in May, how much learning goes on is, in part at least, a function of the design of that instruction. The intent of the present discussion is to focus attention on the learning end of the process of education. The questions posed are:

  1. Can we agree that, no matter the format of the instruction, or the nature of the help provided by others, learning is essentially a private event achieved by the individual learner?
  2. Can we agree that learning, difficult to understand as it might be, is not totally mysterious, but must involve the brain and is therefore a biological process?
  3. Can a better understanding of brain function and the biological nature of learning help in the design of instruction, no matter the technology?

As background for the discussion, here is some contemporary biology--some fact, some theory--that will serve to support the idea of considering learning to be a biological process.

The adult human brain is several times the mass it was at birth. Even so, the number of neurons, the cells that pass signals, is virtually the same in the adult as in the newborn. A great deal of the increase in mass is due to the growth of the individual neurons in the form of long projections called axons, and the protective cells that surround them. The development of the mental and motor faculties we think of as human is a function of the connections between neurons, and not so much of the absolute number of neurons. And it is the axons that make the connections (synaptic junctions). Because a neuron can have a large number of axonal branches (signal senders) and dendrites (signal receivers), it has been estimated that the average neuron has some 1,000 connections to other neurons, with some having perhaps 10,000.

With all that in mind one might be tempted to believe that maximizing the number of connections would result in maximizing human faculties. But the fact is, small children whose ability to reason, abstract, and articulate is well below that of an adult, do in fact have many more synaptic junctions than the adult. The explanation for the apparent anomaly comes from several discoveries that are comparatively recent.

  1. The budding and subsequent growth of new axons (potential connections) is exuberant (Jean-Pierre Changeux's term) in the young child, but in fact probably never really stops until death. The significance of that fact is that even the adult brain is an active organ, constantly growing neural projections and making new connections. It's not a "hard-wired" structure, like a computer, that must be used "as is."
  2. Axonal budding and growth is not particularly well directed. The axon is not "trying" to get anywhere in particular. While certain chemical gradients prevent it from going willy-nilly where it may, it does not have a specific target at which it is "aimed." Consequently, budding axons make many useless and unproductive connections to other neurons.
  3. A connection (synaptic junction) is not, however, necessarily permanent. One of the more significant discoveries of neurophysiologists is that not all synaptic junctions are equal. Some are physically larger and tighter and are said to be "stable," meaning that one is likely to have them for life. Some pathways, then, do become hard-wired. Other junctions between neurons are temporary. Many axons regress or degenerate after making a temporary connection. Such junctions are said to be labile.
  4. The factor that determines whether a connection will degenerate or become hard-wired is the frequency with which it is used. Signals passed over a particular junction--even a labile one--have the effect of enlarging and strengthening the connection. Used frequently enough, a labile synaptic junction becomes stable. It's probable that most labile connections do not produce anything potentially useful, or else they do not get used, and so degenerate (use it or lose it).

"A Theory of Learning"

Explaining learning in terms of stabilizing useful circuits through repeated use might indeed be a "just so" story. But there is nothing about this theory that leads to any untenable conclusions. It has considerable explanatory power, and the testable predictions it makes have been verified at the cellular level experimentally. It is entirely compatible with practical experience that shows that we remember things more readily and accurately the more times they are experienced (although the relationship is probably not linear). Musicians, for example, are well aware of the efficacy of practice. And because the patterns formed through multiple connections are unimaginably complex, some pathways and sets of pathways are used not just for a single process (like remembering your phone number) but get used in a variety of situations in varying combinations with other pathways. That interesting phenomenon might explain our wonderful ability to abstract. If the same set of pathways gets used when encountering dozens or hundreds of disparate events, it might be that there is, in fact, something common to all those events. When we induce a "general rule" from specific events, it could be that we are using the neural paths that are common to all these events, but not those paths that would identify any single episode as such.

The Modular Brain

Most of us have had the embarrassing experience of getting to the bottom of a page only to discover that while our eyes were reading, our consciousness was elsewhere. We can recall what we were thinking about, but nothing of what we were reading. Laboratory research again provides an explanation. The whole brain is not involved equally in each and every mental task. The brain is a modular device, with clusters of cells dedicated to certain types of activity. The modules that enable vision, and even the recognition of words, do not necessarily or in all cases communicate with the thinking modules. It takes some effort to get different modules to start comparing notes. Sometimes someone can hear accurately every word spoken to them, but not understand the cognitive content, because they were not "thinking" about what was said. (It has even been suggested that some people's speech suffers from a similar problem. We've all heard the old chestnut about ideas going from the teacher's notes into the students' notes without passing through the head of either. Well, something had to pass through some part of the brain; it just wasn't the thinking module.)

There is even a biological explanation for "paying attention" to whatever is stimulating the brain. When interest, need, or curiosity prompts us to pay attention as we observe or listen, the frontal parts of our brain become active. There are (at least) two notable effects. Axons from the frontal neurons have found their way to other parts of the brain. These axons sometimes attach to other axons (instead of dendrites--the more typical situation). These axon-to-axon connections act as switches that either attenuate or enhance the probability that a signal will get past the next junction (these are called "gating signals"). When we are "concentrating," the frontal neurons attenuate signals from sources of distraction, such as extraneous noise or things moving in the periphery of our vision. But they also promote signal passage in the thinking modules, the parts at work when we are trying to make sense of something. So our grandmothers were on to something when they told us to concentrate, and to repeat whatever it was we wanted to learn.
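The gating idea above has a simple computational analogue: a multiplicative gain applied to a signal before it crosses the next junction. The sketch below is purely illustrative; the function name and the gain values are invented, not taken from the neurophysiology literature.

```python
# Toy "gating signal" model: when we are attending, frontal modulation
# multiplicatively enhances task-relevant signals and attenuates
# distractors before they reach the next junction. Gains are invented.

def gated(signal, attending, relevant):
    """Return the effective strength of a signal after attentional gating."""
    if not attending:
        return signal                 # no frontal modulation at all
    gain = 1.5 if relevant else 0.25  # enhance focus, suppress noise
    return signal * gain

lecture, hallway_noise = 1.0, 1.0
print(gated(lecture, True, True))        # 1.5  (enhanced)
print(gated(hallway_noise, True, False)) # 0.25 (attenuated)
```

The design point is that the gate does not create or destroy signals; it only biases which ones are likely to get through, which matches the "attenuate or enhance the probability" wording above.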

If this model has merit (and it seems to be the only non-mysterious one around), it should prove beneficial to consider it when designing instruction. We would not see student brains as computers, ready to process over hard-wired circuitry any program we feed them. When learning, the brain is not so much being used as it is being changed. Learning stabilizes circuitry that would otherwise degenerate and be lost forever. What kinds of instruction might force the use of labile junctions, repeatedly and with concentration? What kinds of designs might be counterproductive, distracting the mind instead of helping it to concentrate? What kinds of instruction might inspire a student to learn something she has no natural interest in?

And so......

"The time has come, the Walrus said,
to talk of many things.
Of books and sites and the Internet,
of what the synapse brings.
Wherefore digital instead of ink,
and whether the neuron sings."

(with appropriate apologies)

Post-discussion summary

Discussion threads

Types of biological learning

The first contribution was from William R. Terrell, who commented on his experience training helicopter pilot students. His task was to reduce the time required to complete a 200-item start-up checklist. William noted that pilots could not memorize the entire checklist but could memorize up to seven items at a time. The checklist was reduced to 35 items, which were chunked into segments of seven independent actions, thereby reducing the time to execute the checklist by 70%. His view of biological learning relates to the brain's limited number of memory slots, and he urges the compilation of a library of heuristics regarding such biological/brain factors.
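The chunking heuristic William describes is easy to express in code. The sketch below is a hypothetical illustration (the function name and the placeholder checklist items are invented), showing how a 35-item list breaks into five memorizable segments of seven.

```python
def chunk(items, size=7):
    """Split a long list into chunks of at most `size` items, following
    the seven-item working-memory heuristic described above."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# A hypothetical 35-item start-up checklist becomes 5 chunks of 7.
checklist = [f"step {n}" for n in range(1, 36)]
segments = chunk(checklist)
print(len(segments), [len(s) for s in segments])  # 5 [7, 7, 7, 7, 7]
```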

Muhammad Betz comments that biological learning should not be limited to brain learning and refers to Lorenz who studied learning from the point of view of Evolution. Lower organisms learn without a brain and while higher organisms learn with a brain, they do not dispense with the learning mechanisms of lower organisms. He adds that the evolutionary view of learning is an important backdrop to brain learning.

Johanna Dold dismisses the relevance of the evolutionary view of learning as falling outside the context of today's human society and its demands, while Dennis Nelson equates biological learning with physical or chemical learning and admits spiritual and mental components to learning as well.

Paul Pavlik adds to the discussion based on his insights derived from training animals, in yet another perspective on biological learning. He asks how animals learn so quickly and answers that they pool their intelligence, work together and share knowledge through many generations. He claims that dogs anticipate and communicate with body language to learn (and teach). Paul asserts that we are fighting natural, biological learning systems and asks, "How many other species are we trying to learn from?"

Martin Owen points readers to a valuable source on biological learning, "The Tree of Knowledge" by Humberto Maturana and Francisco Varela. Martin implies that brain learning/considerations do not exist outside a social context.

Brain learning proper

Crispin Weston commented on several quotes from Bob's paper.

Bob: "Explaining learning in terms of stabilizing useful circuits through repeated use..." Crispin supports the notion that repetition stabilizes brain circuitry, but adds that there should be a constantly changing context, or elaboration, in order to attain mastery of learning.

Bob: "Sometimes someone can hear accurately every word spoken to them, but not understand the cognitive content because they were not 'thinking' about what was said." Crispin hypothesizes that this problem relates to the new information not being linked to the center of consciousness and, not being retrievable, the new synaptic junctions wither. He goes on to suggest that we differentiate parts of the brain used for passive versus active purposes. The key here, as he sees it, is the necessity of generating student activity to overcome loss of learning.

William (Bill) Klemm supported Bob's premise that the adult brain is constantly growing projections and connections, and asserted that older adults ward off mental decline through continued brain activity.

Manny Halpern offers two questions to the forum: (1) Do you believe that brain learning applies to motor learning (excluding speech)? And (2) What are the structural and functional relationships between memory and learning? She posits that short-term memory can be hypothesized as electrical energy and long-term memory as chemical energy.

William Klemm comments on the scenario of a student listening and taking notes. Bill asserts that the student remembers little because the brain is distracted by the mechanics of note taking and that so-called unconscious learning during the event is minimal. This unconscious knowledge, or procedural learning, requires even more extensive rehearsal for retention than conscious knowledge.

Crispin Weston responds to a point made by the discussion moderator in reference to Crispin's earlier comment that listening is a passive activity and that attentive listening is an active process. Crispin notes that for a skilled listener or reader, the act of listening or reading is active, but not so for the unskilled. Crispin's point is that knowledge = encoding + retrieval, and that retrieval depends upon the organization of neural pathways. In order to ensure the capacity for retrieval of information, ways of encoding must be considered. He concludes that the chances for retrieval are improved if retrieval is practiced at the initial episode of encoding.

Brain learning versus other learning

Young S. Kim observed that postings on this discussion equated 'learning' to information retention, and commented that emphasis should be shifted from retaining information to learning the skills to acquire information as needed.

Bernard Colo comments on statements by Bob and Crispin, and asserts that teaching should incorporate three levels of learning: cognitive, affective, and psychomotor. Ensuring learning at these three levels imprints information into long-term storage, and if that information was imprinted in a manner related to the learner's experience, it can be more easily retrieved.

Ania Lian comments on Bob's assumption that, for the sake of study/discussion, we can separate learning from the environment in which it takes place. Ania states that our goal is not to produce changes in the learner's brain, but to affect the interaction of the learner and the learning environment. She implies that analysis, and the conclusions drawn from it, without a critical examination of the environmental conditions is a mistake.

Moderator contributions

Professor Robert (Bob) Leamnson moderates and comments on the above postings. First he mentions the 'evolutionary thread' introduced by Muhammad and grants the credibility of that perspective. He refers to Muhammad's assertion that brain learning is more relevant to abnormal circumstances and mentions, with a reference to Alan Cromer, that the classroom is an unnatural setting (i.e., the right forum for brain learning considerations). Bob then comments on Johanna's contribution by adding a distinction between biological evolution and sociological evolution: social evolution is rapid change, biological evolution is slow change, and he confirms that, biologically, humans have had the same brain for tens of thousands of years. Bob then addresses Crispin Weston's inference that brain learning by circuit repetition implies rote memorization, but allows for more complex brain learning which uses older connections to capture new information before new circuitry is constructed. He then asserts that information and knowledge are handled by different parts of the brain, in reference to what information is passively acknowledged yet lost by learners and what knowledge is actually retained. He closes by saying, "I am absolutely convinced that 'physical' activity does not guarantee learning. ... Learning is what takes place in the head."

With respect to Crispin's assertion that capacity to retrieve knowledge is based on more than receiving information, Bob refers to an assertion in his original paper that initial learning creates labile brain synapses that become 'hard-wired' with use. Bob distinguishes between 'hearing' and 'listening' and asserts that 'listening' is always active. Bob considers 'retrieval' to be another attempt at 'hard-wiring', but goes on to mention the unsolved problem of 'forgetting.' Bob closes by referring to the limbic system, the brain component associated with emotions. He states that the limbic system is linked to other brain modules and accounts for the relationship between learning and emotions.

With respect to a posting by Bernard Colo related to three levels of learning (cognitive, affective, and psychomotor), Bob implies that these three types of learning evidence a similar type of brain activity in different parts of the brain. He adds that brain learning theory seeks to identify the brain functions common to different types of learning. Bob disputes Bernard's claim that knowledge is learned actively or passively, and notes that learning is signified by brain change. Purely passive learning only uses existing hard-wired networking, so nothing new is retained. Lastly, Bob equates Bernard's use of the word 'imprinting' with the stabilizing of labile synapses through repeated use.

In his last posting in this discussion, Bob suggests readings on the Brain Change model: "Neuronal Man" by Jean-Pierre Changeux, and "Neurobiology of Learning and Memory" by J. Martinez and R. Kesner. He also suggests that William James came very close to the concept of Brain Change learning a century ago, and that contemporary neurobiology serves to give a more complete model of learning.