Educational Technology & Society 4(1) 2001
ISSN 1436-4522

Does Technology Present a New Way of Learning?

Robert N. Leamnson
Professor of Biology
UMass Dartmouth, 285 Old Westport Rd.
Dartmouth, MA 02747-2300 USA



This reflective essay suggests certain precepts that should be effective in the design of instruction and its evaluation, particularly if advanced or experimental technology is involved.  The arguments are based on the assumption that human learning is a biological process that, when reduced to its essentials, is the product of evolution and does not change even as study habits do.  The emphasis here is on the aspects of learning that are common to all humans and less on personal preferences for kinds of content or study methods.

A brief review of the biological basis of learning is followed by some speculative suggestions on the use of technology and the outcomes of any pedagogy that are of timeless importance.

Keywords: Brain, Learning Styles, Teaching, Technology, Emotions, Assessment

The Biology of Learning

We hear so frequently that "today's students" are different from their predecessors, learn in new ways, and so need to be taught differently, that these assertions seldom get the scrutiny they deserve.  In these few pages I hope to look at the learner primarily as a biological entity, and learning as a biological process.  I hope to use human biology as a first principle from which to deduce certain arguments regarding teaching and technology.

In one sense the claim that "we all learn differently" is a hypothesis that is too true to be good - by which I mean it lacks any potential for informing pedagogy.  As Steven Stahl (1999) found from an extensive review of the literature and from personal interviews, attempts to create a teaching style to match learning styles produce no detectable improvement.  The long-standing push to emphasize the differences among learners has not led to any improvement in education and has not produced any pedagogical methods that would lead to improvement.

The argument to be presented here is that we should be looking at the commonalities among learners rather than the differences.  Should it be true that, at some level, all learners are doing the same thing, that fact would make instructional design a realistic goal.


No Learning Without a Brain

It must be clear from the outset that I will be speaking here of "human learning" only.  The word "learning" becomes useless when it is expanded to include such far-flung phenomena as phototaxis in protozoa.  The relevant question here is what is happening when a human learns something.  We can continue to learn after severe trauma to any of our organs except the brain.  By the same token, the smallest lesion to that organ can annihilate memory, language, or any other faculty required for learning.  A good starting point, therefore, would be some level of understanding of what the brain is doing when it learns.

Considering the curiosity that the brain has inspired in scientists for a very long time, it is perhaps surprising that a model of learning based on neural function has taken so long to influence pedagogy.  Recent reviews by Albright et al. (2000) and Squire and Kandel (1999) show the directions of contemporary thinking at this turn of the century.


Genetics and Epigenesis

So far as their gross anatomy is concerned, normal human brains are remarkably similar.  When the brain is developing, a bewildering array of diffusible chemicals and cell-surface proteins causes developing neurons to grow projections (axons) toward some target cells while resolutely avoiding others.  Such growth is under regulated genetic control and takes place during embryonic development.  These infant brains, which look so much alike, also have their various components and modules appropriately connected to one another.  "Connected" here means that a neuron's growing axon must be able to signal its target cell when it comes in contact with it.  The two cells do not fuse, but elaborate a patch-like connection called a synapse.  The way synapses operate is beyond the scope of this essay - suffice it to say that the synapse allows a signal from one neuron to be relayed (under appropriate conditions) to the other neurons it is connected to.  (Signals always move away from the cell body by way of axons, and are received and moved toward the cell body by usually shorter projections called dendrites.  Because the axon is branched at its distal end, a neuron can make connections with the dendrites of other neurons at a thousand or more sites.)

There have been, over the years, a number of puzzling observations regarding brain development.  1) Not all the neurons produced by cell division survive - many degenerate spontaneously.  2) Because many neurons die, the adult brain has fewer of them than it had at birth, yet it is several times larger at maturity.  3) The total number of connections (synapses) in the child's brain is much larger than in the adult's.  The two phenomena of interest here are the increase in brain size without a net increase in the number of neurons, and the dramatic reduction in the number of synapses as the brain matures.  Both of these puzzling observations have come to play a part in current thinking about what the brain is doing when it learns.

The increase in brain size is due in part to an increase in the number and size of glial cells.  These cells do not transmit signals but serve insulating and protective functions.  The net number of neurons stays nearly constant, but they enlarge with age and continue to grow axons that make connections with other neurons.  The discovery that neurons continue to send out axons throughout life has contributed greatly to current theories about learning.

The continuous growth of axons with brain development seems at first incompatible with the concurrent reduction in the number of connections or synapses.  The explanation came with the realization that most connections in the developing brain are not permanent - they are said to be labile, or easily broken.  Only relatively few of them become tight and permanent - these are said to be stable.  A useful theory of learning has developed around a singular observation - labile synapses become stable as a result of frequent use.  "Use" means here that they are actually conducting signals from one neuron to the next.  Logic would suggest that a connection that is used frequently is one that has contributed to a beneficial pathway.  Connections that provide no useful path fall apart and the axon regresses or degenerates.  All of which implies a considerable amount of randomness in the growth of axons.  While most budding axons have as targets only the dendrites of other neurons of a particular type, the actual cells they make contact with are primarily a matter of chance.

Before birth it is genetics that determines in large part the overall structure of the brain and its essential interconnections, such as those for vision, hearing and so on.  During a young child's growing years, however, the growth of axons has a large "epigenetic" element, meaning that genetics determines only the type of cell that is the target, and perhaps its general location.  The actual cell to which it connects is a matter of chance.  So it is that most cell-to-cell connections produce no useful pathway and degenerate.  If the new pathway perchance helps a child make some sense of the world it will get used repeatedly and become stable.  So the profusion of synapses in the child's brain does not indicate knowledge, only a vast potential for learning.  Gopnik et al. (1999) suggest that the child in the crib is operating much like a scientist in that it is trying out (from a superfluity) various scenarios - complex neural webs - until a set "makes sense."  That particular web, and its multitude of connections, will get used repeatedly and so become a network of stable connections. 


A Theory of Learning

From these observations Changeux (1985), Edelman (1989), Squire and Kandel (1999) and others have proposed similar theories of how we learn and (in some cases) certain pedagogies that capitalize on these theories.  Basic to the common theory are two salient facts: repeated use stabilizes pathways or webs of interconnected neurons; unused pathways, no matter how potentially useful, will disintegrate.
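The two salient facts can be caricatured in a toy simulation.  This sketch is not from the literature cited here; every quantity in it (starting strength, strengthening and decay rates, pruning threshold) is an illustrative assumption, chosen only to show the qualitative dynamic: synapses that conduct signals strengthen toward stability, while unused ones decay and are pruned.

```python
import random

def simulate_pruning(n_synapses=1000, used_fraction=0.1, steps=50,
                     strengthen=0.2, decay=0.1, prune_below=0.05,
                     stable_at=1.0, seed=0):
    """Toy model of use-dependent stabilization (all numbers are
    illustrative assumptions, not measured values).

    All synapses start labile at strength 0.5.  A randomly chosen
    fraction is 'used' (conducts signals) on every step and
    strengthens toward the stable ceiling; the rest decay toward
    zero and fall below the pruning threshold.
    """
    rng = random.Random(seed)
    strengths = [0.5] * n_synapses
    used = set(rng.sample(range(n_synapses),
                          int(n_synapses * used_fraction)))
    for _ in range(steps):
        for i in range(n_synapses):
            if i in used:
                # frequent use: strengthen, capped at the stable ceiling
                strengths[i] = min(stable_at, strengths[i] + strengthen)
            else:
                # disuse: decay toward zero
                strengths[i] = max(0.0, strengths[i] - decay)
    surviving = sum(1 for s in strengths if s > prune_below)
    stable = sum(1 for s in strengths if s >= stable_at)
    return surviving, stable

surviving, stable = simulate_pruning()
print(surviving, stable)  # only the 'used' synapses survive, all stable
```

With these (assumed) parameters, exactly the 10% of synapses that were exercised survive and reach full stability, while the other 90% decay and are pruned - a cartoon of the profusion-then-pruning pattern described above.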

If we accept for the moment that something is learned when it is both understood and remembered, the above observations on brain biology have implications for how students study and how teachers teach.  It is possible to remember words without an understanding of the concepts they imply, or to understand a concept when first encountering it but not be able later to remember it in detail.  Functional magnetic resonance imaging (fMRI) has demonstrated that the regions or modules of the brain that are active when a subject is struggling with a problem or abstract concept are clearly displaced from regions active when memorizing strings of words.  It is thus possible to have memory without understanding.  On the other hand, the student who is concentrating can find a web of labile connections that enables understanding, but without repeated use, the connections that enabled that understanding may well degenerate, and with them any hope of long-term memory of the novel concept.


Implications for Learning

Assuming all of the above, there are two elements required to learn something that is both new and challenging.  The first is focused attention.  What in common parlance we call "concentrating" is, at the biological level, the trying of multiple labile circuits in search of one that "makes sense" of the new idea.  Once such a web is found and the learner is said to "get it," repeated reconstruction of the concept is required for that particular web of circuits to become stable and capable of being reactivated, or recalled.  Current understanding of what's going on in the brain can explain, then, the wisdom of some very old advice on how to learn: concentrate and practice.


The Limbic System

Most people can discipline themselves to study, the usual form of practice, but the other element of learning, concentration, can be elusive.  Brain biology can, again, provide some explanation for why concentrating or focusing attention is sometimes almost automatic and at other times difficult if not impossible.

During embryonic development genetic instructions result in axonal linkages between structurally and functionally different regions of the brain.   In particular, the forward-most parts of the brain are well-connected to what is called the limbic system - a collection of components deep in the center of the brain. The limbic system determines our emotional states, but it is always in communication with the forebrain. This frontal region has an organizing function; it weighs its various inputs and chooses options.  It also has the remarkable ability to reduce activity in certain regions and enhance activity in others.  In other words, it focuses attention where it is most needed.  Because the forebrain is firmly connected to the limbic system, our emotional involvement with any person or thing can strongly influence where our attention is focused.


Implications for Technology

The oldest technologies were all at one time new.  The ability to extract metal from rock, for example, was not just a new technology at one time, but one that had profound implications for the course of societies.  Literacy - the ability to send and receive information by way of written symbols - was also new at one time and as the work of Parry, Luria, and others show (Ong, 1982), had an unanticipated effect on the way people thought and spoke.  Societies were forever changed by literacy.  As Neil Postman has said (1992), "[t]echnological change is not additive; it is ecological.  A new technology does not merely add something; it changes everything."

Emerging technologies need to be considered, then, in light of their potential for taking advantage of the biological nature of learning.  They need to be monitored as well, however, for possible misuse.


Possible Misuse

Many young people seem to have a natural propensity for video technology, particularly if it entertains or is interactive.  Television and game consoles are part of their growing up.  TV (with two-way sound) was tried in classrooms in the 1960s, based possibly on the assumption that young people liked to watch TV and would learn what they heard and saw.  That hope did not materialize.  Teachers on TV were even less interesting than teachers in person.  If there is any precaution to be noted in the use of any technology to educate, it would be that the technology in and of itself is not the answer to the problem.  Clearly, the intent of teachers using any technology is not that students simply become proficient with, or enamored of, the technology itself.  Understanding and remembering course content remains the real goal.

I would suggest here two conditions in particular where computer technology would enhance learning. 

  1. Something (or somebody) has stirred up an interest in the student and the technology is available to satisfy and exploit that interest.  The interest intended here is not in the technology itself, but in some content, or problem, or body of information that is made available by the technology.  It is this aspect of technology use that might explain the considerable success of some distance learning endeavors.  These have been most successful when the students (often in their mid-thirties and employed full time) have a recognized and real need of learning and have no other access that is interactive.  In such cases computer technology works wonderfully well, not necessarily because it's the best vehicle, but because it's the only one available, and the users are highly motivated.  Younger undergrads can also profit if some teacher or book or other source has fired their curiosity and the technology becomes their gateway to satisfying a pre-existing need to know.
  2. A second condition that would seem to offer promise capitalizes on students' natural interest in the technology itself.  Here great care in design is essential.  For some young people technology is intrinsically fun and they will use it for that reason.  The trick, if you will, is to gradually transfer students' engagement from the technology to the content.

A particularly good example of the second condition appears in these pages (Roth, p xx).  Students in that case had an interactive program that looked a bit like a game.  An object moved in sometimes surprising ways when just two variables were changed in varying combinations.  Students didn't "know" at first that they were dealing with force and velocity, and that both of these have scalar and vector properties.  Two elements contributed to the success of this method.  First, the program was a bit like a puzzle that needs solving, and so students took to it readily.  Equally important, they worked at a terminal in groups and quite naturally made their opinions known both by gesturing and by speaking.  While it took some time to do so, students refined their language until words like "force," "velocity," "larger," and "direction" were being used precisely and meaningfully.

The question of whether high technology is good for education or not is as meaningless as whether textbooks are good or not.  There cannot be a yes/no answer.  As reports of technology use accumulate it is becoming obvious that "appropriateness" is the operative word.  The assumption, for example, that all young people like gadgetry and computers in particular is clearly wrong.  The idea that all subjects can be taught equally well with one technology is quite probably wrong.  That learning can never be improved by technology is certainly and demonstrably wrong.

Probably the best advice for the use of technology in teaching is that given by Neil Postman (1999) when contemplating any change:  1) "What problem are you trying to solve?" and 2) "Whose problem is it?"  That technology can solve "the problem" of education or that it has nothing to offer are equally preposterous positions.

It is hoped that a better understanding of learning as a biological process, a matter of brain change rather than simply brain use, will make contributions to both the design and use of technology - at any level of sophistication - in teaching.  What should be becoming clear is that "hands-on" activity, whether in laboratories or with computers, is neither a sufficient nor a necessary component in learning (Leamnson, 2000).  Only when students are motivated to think about the concepts in question (and verbalize them one way or another) does activity of any kind enhance learning.

The answer to the question "Does technology present a new way of learning?" must be that it does not.  All learning is biological brain change and all teaching is an attempt to encourage and stimulate students to do what it takes to make those changes, that is, to focus their attention and to practice.  A more useful statement might be that computers and associated technology, and the access they afford, constitute a new way of studying.  But as noted earlier, technology doesn't just add something, it changes things. The pressing question is whether new ways of study will result in learning as understood in these pages.


Outcomes to be Assessed

I suggest that it would be a serious mistake to look for, or design, new learning outcomes simply because the pursuit of learning is utilizing new technology.  What humans think about changes almost daily, but the way we think has not changed in many thousands of years.  What the brain does to retain information is part of our biological endowment and that does not change because our environment changes.

We might well consider which talents of our ancestors were of greatest benefit when they were developing metallurgy, chemistry, geometry, the calculus, and great works of literature.  Are these talents outdated?  The ability to concentrate, to evaluate, to examine alternatives, to solve problems, and to verbalize our thoughts are not useless talents, no matter the level of technology in a society.  I would propose, then, a kind of test for the efficacy of a technology, old or new.   Does it improve the users' problem-solving abilities?  Does it encourage concentration?  Does it enhance engagement with content?  Does it build facility with language?

In short, the point of any technology cannot be mere facility with the technology.  The point of technology, old or new, is to increase or enhance human learning.  A good appreciation of the biological basis of learning can only contribute positively to the design of instruction, including the use of technology.



References

  • Albright, T. D., Jessell, T. M., Kandel, E. R. & Posner, M. I. (2000). Neural Science: A Century of Progress and the Mysteries that Remain. Review Supplement, Cell, Vol. 100 / Neuron, Vol. 25, Boston: Cell Press.
  • Changeux, J.-P. (1985). Neuronal Man: The Biology of Mind, Translated by Laurence Garey, New York: Oxford University Press.
  • Edelman, G. M. (1989). The Remembered Present: A Biological Theory of Consciousness, New York: Basic Book.
  • Gopnik, A., Meltzoff, A. N. & Kuhl, P. K. (1999). The Scientist in the Crib: Minds, Brains, and How Children Learn, New York: Wm. Morrow and Co.
  • Leamnson, R. (2000). Learning as Biological Brain Change. Change, 32 (6), 34-40.
  • Ong, W. J. (1982). Orality and Literacy: The Technologizing of the Word, New York: Methuen.
  • Postman, N. (1992). Technopoly:  The Surrender of Culture to Technology, New York: Knopf.
  • Postman, N. (1999). Building a Bridge to the Eighteenth Century: How the Past Can Improve Our Future, New York: Knopf.
  • Squire, L. R. & Kandel, E. (1999). Memory: From Mind to Molecules, Scientific American Library, New York: Freeman and Co.
  • Stahl, S. (1999). Different Strokes for Different Folks? A Critique of Learning Styles. American Educator, Fall, 27-31.