Educational Technology & Society 2(4) 1999
ISSN 1436-4522

Embedding Ubiquitous Use of Educational Technology: is it possible, do we want it and, if so, how do we achieve it?

Chris O’Hagan
Dean of Learning Development
University of Derby, Derby DE22 1GB
United Kingdom


Attempts to use communications and information technology (CIT) in education are now ubiquitous. There is hardly an institution without its pockets of expertise, and hardly a national government that has not made pronouncements about the need for schools/colleges/universities to enter the 'information age'. The investment in particular institutions is sometimes substantial, but even so, this rarely amounts to more than 1 or 2% of total expenditure, and is usually much less. The problem is that cash for development usually has to be found in addition to normal operating costs which are already on the margins of adequacy, while any benefits in terms of efficiency take years to work their way through. Unlike private industry, the public education sector cannot borrow or float a share 'rights issue' to raise the required investment. However, money is not the only problem, and possibly not even the central problem in embedding the use of technology - that is, in moving from pockets of expertise plus optimistic exhortations to regular, everyday use across the curriculum.

Technology Works

The history of educational technologies is littered with pockets of expertise, government pronouncements, and investment in equipment that has quickly ended up gathering dust in forgotten corners. Is this because they have generally failed to deliver? It depends how you measure failure. An extensive literature has grown up around attempts to demonstrate the effectiveness of technology, and though particular studies have reached different conclusions, overall the evidence suggests that the use of technologies is not inferior to more conventional delivery. There is 'no significant difference' (see Russell, 1999). Conflicting outcomes are probably best explained by local rather than intrinsic factors, and at the very least technology can provide useful flexibility and variety in the delivery of the curriculum. Ken Spencer's paper provides a short overview of some of the research carried out across this century, and he concludes (perhaps somewhat ahistorically, because ubiquity has not always followed despite the positive research results) that it is an unstoppable force. In support of such a bullish assertion, Spencer points to a technology so deeply embedded "we hardly acknowledge its presence" - the written word.

To be sure, the post-war period has seen dramatic developments in the use of the written word by teachers. The photocopier has enabled them to quickly produce copies of learning guidance and learning resources for each individual student, and the word processor has enhanced this facility by enabling the production of high quality master copies, customised by the teacher. This revolution in typesetting and printing which began with the typewriter and ink duplicator has driven the development of resource-based learning methods, and has supported the emergence of student-centred models of learning. No greater howl of distress can be heard than when the photocopier breaks down - though the grumbling when the network crashes is growing louder by the minute. Fortunately there is usually more than one photocopier close at hand, and maintenance contracts usually demand a repair within four working hours – and now we are seeing a parallel duplication of servers and the rise of tough service level agreements in central IT service departments as well.

Another example of a successfully embedded technology is the overhead projector, which has displaced the blackboard/whiteboard to a large degree in higher education, and continues to make inroads in other sectors. Clearly, technology can work to enhance both efficiency and effectiveness in some cases, particularly where it is relatively reliable, ubiquitously available, and empowers the teacher to customise his or her teaching and learning resources (which includes the ability to customise and develop links between bought-in resources). This can be summarised in the axiom 'teacher access with autonomy leads to ubiquitous use'. Although this may be a necessary condition, it is probably not sufficient.

However, the examples of the photocopier, typewriter/word processor and overhead projector suggest that embedding ubiquitous use of technologies in education is both possible and desirable, and can happen over a relatively short time scale, even if not as short as some of the hardware suppliers, politicians, optimists and nerds would like.

Making It Work Pedagogically

Although few would deny that technology can add to the variety and flexibility of learning opportunities, there is some controversy as to whether it can actually enhance learning of itself. Andrew Agostino's paper draws on a challenge, posed by R. E. Clark nearly 20 years ago, to demonstrate that the medium rather than the instructional design can influence learning. Clark argues that the media are "mere vehicles that deliver instruction but do not influence student achievement any more than the truck that delivers our groceries causes changes in our nutrition." Thus one could ask: what is the difference between a multiple-choice test delivered on paper, with the teacher giving oral feedback to students, and one delivered by computer with feedback loops? A poorly devised test, without properly integrated formative or summative intention, is a poor test however it is delivered. As Harry Braverman (1974) argued, technology works by embedding the skills and routines of the artisan into a machine. A numerically controlled loom still replicates the techniques of the handloom weaver to produce cloth. It may enable more variety and production efficiency, but it is hard to argue that the cloth is intrinsically 'better'. Thus those who demand a 'new pedagogy' for the new technology are misguided. It is simply a question of embedding old methods in the new machines. Isn't this what happened in the case of photocopiers, word processors and overhead projectors? They simply broadened the use of familiar methods. There is an attraction to this view: it ought to be easier to enable staff to adapt familiar methods to the technology than to help them to learn supposedly brand new instructional methods ab initio.

Agostino argues that to challenge Clark we need a radical re-appraisal of the way research is conducted. Research based in situated cognition theory, where the media are viewed as artefacts embedded in the context (and not standing somehow above it), would reveal that media do alter the nature of learning, do subtly change the instructional design. Martin Owen would probably concur, taking the argument a step further - the failure of the new media to influence learning positively is real, and not just an illusion stemming from poor research design. It is partly the result of poor software design. Instead of adapting packages originally produced for non-educational purposes, we need to focus on the development of an integrated range of software dedicated to pedagogical purposes and based on internationally common and agreed standards.

Technophobia and accusations of technocracy have obscured the need for education to engage with the technology more holistically, and Owen argues that the analogy of the CIT user to a car driver who need know nothing about how the car works is not helpful. Interestingly, he draws his insights into technology from seminal 'novels' of the 1970s, by Pirsig and Marquez.

The Position of the Teacher

The challenge to more conventional analyses continues in Sara Dexter's paper, which wittily compares the introduction of computers into schools by education chiefs to the naïve expectations of 'cargo cults' which believe they can gain miraculous 'goods' if only the tribe follows the correct ritual behaviour. She draws on the work of Durkheim and Goffman to argue that a school is a society with a culture of 'collective representations'. These are more than just habits or rituals but shared perspectives on the purposes of education, the role of the school, the role of the teacher etc. Behaviour does not simply obey the collective representations; it manifests and reinforces them. For technology to be accepted and adopted the whole web of representations may need a complete re-ordering, with reframed myths, rituals and norms. It will require the consent and participation of the entire school culture.

Kim Dooley has also developed a holistic diffusion model which helps to analyse the contextual influences on teachers and their differences in attitudes to technology. His results suggest that 'collective representations' might not be quite as universally shared in the school community as Dexter proposes. Nevertheless, Dooley shares her view that we need a holistic model for diffusion which takes the whole school and its staff into account - in this case a 'concerns-based' model. Another paper in this special edition (Dooley, Metcalf and Martinez) describes an application of the Dooley model in a school.

Levers of Change

In the last few years a new vocabulary has emerged in education, some of it migrating from industry and commerce. Thus we now talk about top-down and bottom-up processes, where once we simply spoke of ‘ownership’. It is now realised that empowering teachers is not sufficient – there is a ‘glass ceiling’ between them and the strategists at the top, composed of a mixture of middle and senior managers who have more ‘urgent’ priorities. Thus top-down direction from the strategists has become necessary to change the order of priorities. Of course, this dirigiste element won’t work unless the teachers have acquired the skills ‘bottom-up’ to be ‘pulled through’ into the implementation of teaching and learning strategies. Carmel McNaught et al describe a well-planned top-down/bottom-up approach at RMIT, Melbourne, Australia, with a funding level most of us would envy, but which indicates the seriousness with which some institutions regard the need to embed ubiquitous use of technologies to achieve missions which have been reformulated to meet the anticipated challenges in the coming millennium.

But how do we achieve bottom-up empowerment? One way is to buy out staff time to engage with innovation, another to develop mentors to bring on board the less enthusiastic, another to fund key projects. These are all included in the McNaught study and also very much reflect our experience at Derby (O'Hagan & Fry, 1999).

Darien Rossiter, from QUT, Brisbane, Australia, describes a further essential component – the development of technological literacy in staff and students, across an entire institution. Thomas Spotts points to another important lever – the technology must help the faculty member do a better job. Of course, one might add as a counterbalance: do teachers know what a 'better job' would be if they are not trained as teachers? This is a problem in higher education in most countries. However, the debate about 'technology' does seem to have restored, to some extent, the lost dialogue about 'pedagogy', which had become submerged under more instrumental concerns about budgets, government edicts on curriculum, evaluation, accountability and so on.

Jane Barnard of the Open University also emphasizes the need for software to relate to teachers' current non-technological practices. Using a framework developed by Braun and McIntyre from examining how teachers organize their thoughts about their usual classroom practice (in a non-technological context), she analyses how a group of biology teachers were able to adapt their practice to accommodate a computer package. She finds that teachers will sometimes accept complex computer activity which does not sit easily with their normal expectations of the way students will learn an element in the syllabus, if it is "inextricably tied to a useful Progress goal". On the other hand, she argues, too much of this kind of use could be counterproductive, shifting the teacher's focus "away from concentration on their students as individuals".

Larry Dooley, Teri Metcalf and Ann Martinez complete the fascinating insights provided by these Case Studies, with a study of diffusion of CIT within a single school, and offer some recommendations, such as the formation of interdisciplinary teams with at least one more knowledgeable teacher, the use of informal training, and a collegial mentor program for new teachers or ‘low users’, where the mentor is a ‘medium’ user and not a "technology trainer or high user, because the knowledge/skill gap is too great".

Snapshots of Current Practice

I offered contributors an opportunity to submit short 'snapshots', perhaps unsupported by the usual scholarly framework, because it is not only the well-supported study that can offer insights into the practical problems facing those of us struggling, right now, with processes already in motion. Sometimes we can find illumination in ongoing reports of others' 'praxis'.

Daniel Lim, and Cresswell and Syson, offer descriptions of two very different approaches to embedding ubiquity in a university – the one almost revolutionary, the other more evolutionary, yet both with considerable success in bringing staff and students on board. Clearly, the University of Minnesota model has financial implications that students at Coventry University would find hard to meet, particularly given the demographic intake of the latter. But continuing falls in the price of high-powered laptops could bring the Minnesota experience within the reach of universities like Coventry before long.

I am pleased that this special edition is able to include a paper on bringing the potential offered by CIT to learning-disabled students. John Wilding discusses some of the problems and a particular research program. There can be little doubt that insights from such analysis of the difficulties, and attempts at solution, will have spin-off into the more general development of CIT for education, as well as benefiting the disabled.

Kupritz and McDaniel recount their experience of a crash course for teachers in instructional technology – a mixed, but illuminating experience, which caused them to reflect that current technology is "better geared toward objective knowledge" but "falls short as an affective learning tool for subjective knowledge, where validity is not absolute and is value-laden".

Karen Norum was even more troubled by an encounter with technology when she was obliged to deliver a class by videoconference to an unseen audience. "The monthly distant monologues reminded me that learning is a social process…. While it is possible to form virtual communities, these communities are not the same thing." Although the set-up she was asked to operate within might have been better conceived to allow the teacher to feel closer to the students, her story reminds us that technology will require teachers to develop new skills, and not all will feel at home with any and every aspect of the new media, even after appropriate training.

Norum's experience suggests that the question teachers will need to ask is "How can technology help me to play to my strengths as a teacher?" The answer for some may be that asynchronous computer conferencing extends their skills as a tutor, as a 'listener', but for others who are creative in designing group project work, or problem-based learning, the World Wide Web or multimedia may suggest new possibilities. It might be that weaknesses can be overcome: the boring lecturer who finds it hard to project, or to speak in a lively way, could find that they can create an exciting and engaging computer presentation covering the same material.


Despite a few cautions (such as those given by Norum, and by Kupritz and McDaniel) I believe that the overall impression supplied by the papers in this special edition is that, yes, it is possible to embed ubiquitous use of educational technology and, yes, we do want it, at the very least because it offers variety and flexibility, and at best because it can actually enhance student learning.

How do we achieve it? There is no one way, and although we may be able to define some necessary conditions, no one can say that they know absolutely what is sufficient. But anyone reading and reflecting on the various discussions included here will be much better armed to analyse their local change environment – school, college or university – and to help lead the diffusion of technology to the benefit of present and future students.


  • Braverman, H. (1974). Labor and Monopoly Capital: the degradation of work in the twentieth century, New York: Monthly Review Press.
  • O’Hagan, C. M. & Fry, J. (1999). The academic development fund at the University of Derby 1994-98: origins, implementation and lessons. In K. Sarlin (Ed.) EUNIS99: Information Technology Shaping European Universities, Helsinki: European University Information Systems Organisation, 49-53. ISBN 951-22-4542-6.
  • Russell, T. L. (1999). The No Significant Difference Phenomenon, North Carolina: NC State University. ISBN 0-9668936-0-3.