Educational Technology & Society 5 (1) 2002
ISSN 1436-4522

The Experience of Practitioners with Technology-Enhanced Teaching and Learning

Som Naidu and David Cunnington
The University of Melbourne, Department of Teaching, Learning and Research Support
Information Division, Victoria, Australia 3010

Carol Jasen
The University of Melbourne, Department of Learning and Educational Development
Faculty of Education, Victoria, Australia 3010



This paper describes a research project that seeks to explore the experience of educators with technology-enhanced teaching and learning. A particular focus of this investigation is on how the use of information and communications technology is influencing teaching practices and students’ approaches to learning at the University of Melbourne. This is a naturalistic inquiry into the experience base of practitioners who have been engaged in technology-enhanced teaching and learning. Our goal is to look beyond superficial data and examine closely how information and communications technology is fundamentally influencing the nature of teaching and learning transactions. We are interested in the untold stories of practitioners and participants in this work. The data collected are archived on a website and used in a variety of ways for faculty development.

Keywords: Action research, Naturalistic inquiry, Practitioner experience, Technology enhanced teaching and learning


Educational institutions all around the world are beginning to pay greater attention to improving their teaching and learning practices through the innovative use of information and communications technologies (ICT). Although a great deal of work has gone into investigating the effects of computer-based learning, there is a lack of reliable knowledge about what works, why, and in what ways. This paper describes a research project that is seeking answers to these questions from the perspective of practitioner experiences. The goal of this investigation is to look beyond survey data derived from questionnaires into the experiences of practitioners in order to ascertain how ICT is fundamentally influencing the nature of teaching and learning processes in various subject matter domains. The aim is to tell the untold stories of practitioners and participants. The stories we are collecting and the profiles of practice that we are developing will comprise the data for the development of conceptual models of best practice. These models may then form the subject of empirical study in the future.

The outcomes of this research are expected to provide a deeper level understanding of how the use of ICT is influencing teaching and learning in fundamental ways. In that regard, this is exploratory research as it seeks to compile stories of the experiences of teachers and students with technology-enhanced teaching and learning. These stories or vignettes of practitioners will be used to build a “gallery of stories” on technology-enhanced teaching and learning available on a website for the benefit of all, especially novices. Models of behavior and practice derived from this research will provide the context for future empirical studies, such as studies of correlations between innovative teaching and learning designs and specific learning outcomes, and/or approaches to learning and teaching.


Context and Scope

The application of ICT in teaching and learning has the potential to change educational practices in significant ways (Ben-Jacob, Levin & Ben-Jacob, 2000; Rogers, 2000). For example, the application of e-mail and computer conferencing, in conjunction with multimedia, databases and electronic libraries, has enabled the emergence of a whole new kind of educational activity called ‘eLearning’. Information and communications technologies are also enabling established campus-based providers to rethink and re-engineer the nature of their teaching and learning practices. The University of Melbourne, like many other educational institutions, is currently involved in just such a process as part of a strategy to position the University as a global player in higher education. As a direct result of this and along with the adoption of ICT, innovative approaches to teaching and learning such as problem-based learning and collaborative learning are being encouraged. These initiatives have led to the rise of new roles for teachers such as “facilitators of learning” as opposed to “deliverers of content” (de Verneil & Berge, 2000; Evensen & Hmelo, 2000; Salmon, 2000). They have also exposed students to new models and approaches to learning such as “computer supported collaborative learning” (Koschmann, 1996), and “computer supported problem based learning” (Bernard, Rojo de Rubalcava, & St-Pierre, 2000; Crook, 1994; Dillenbourg, 1999; Koschmann et al., 1996; O’Malley, 1995).

While interest is growing in the integration of technology in learning and teaching, there is still very little known about how the use of ICT is changing teachers’ approaches to teaching and students’ approaches to learning (Rumble, 2000). The need to investigate what is happening with technology-enhanced teaching and learning is now imperative. This includes, among other things, understanding how approaches to teaching are being impacted, how teachers’ thinking about teaching and learning is being modified, how students’ approaches to learning are changing, and how student support is changing with the use of ICT.



The research described in this paper comprises a naturalistic inquiry into the modus operandi of educators (Lincoln & Guba, 1985). Naturalistic inquiry is particularly suited to settings such as this, where the context is dependent on individual interpretations and perceptions. Such a contextual inquiry demands the active use of the inquirer’s tacit knowledge combined with qualitative data gathering tools such as interviews, direct observations, self-reporting and think-aloud, and document analysis. The inquiry takes the form of successive iterations of these elements: purposive sampling, inductive analysis of the data, development of grounded theory based on the inductive analysis, and projection of next steps in a constantly emergent design (Lincoln & Guba, 1985). Throughout the inquiry, and especially at the end, the data and interpretations are continuously checked with respondents, and differences of opinion are negotiated until the outcomes are agreed upon or differences of opinion are understood and reflected as such. This information is then used to develop a case report or profile, which is tested for “credibility” and “confirmability” (Lincoln & Guba, 1985).


Data Gathering

Interviews are being used as the principal instrument for data gathering initially. The initial sample comprises practitioners who are known for spearheading the use of ICT in teaching and learning at The University of Melbourne. This sample will grow to include other practitioners at the University of Melbourne and possibly practitioners from other organizations, including tertiary educational institutions and commercial enterprises.

In the spirit of naturalistic inquiry, interviews are conducted on location. An interview protocol has been developed which sets out the goals of the interview and questions to guide the interview (see Table 1).

Interviews routinely begin with a discussion of this interview protocol. This is to ensure that interviewees understand the questions and are comfortable with the motives behind them. With the permission of the interviewees, all interviews are audiotaped and subsequently transcribed. These transcripts comprise the raw data for the development of profiles of practice. The interview protocol follows the action research methodology, which comprises planning, doing/taking action, observing and reflecting.


Teaching and Learning Experiences with Educational Technology

What is our goal and focus?

We are interested in your story and your experience.

These experiences will be presented in a database with a focus on the outcomes and impacts of whatever you and your students have done. It seeks to be reflective and conversational, and will be available to all University of Melbourne staff.

So we would like you to reflect on your experience in terms of the following:


  • Briefly describe the project.
  • What were your goals and motivations?
  • Why were they important, to whom and to what?
  • What aspects of your teaching and learning were you trying to influence (e.g., innovative approaches to content presentation, activation of learning, assessment, socialization, or provision of feedback)?
  • Describe your approach to learning and teaching in relation to this project.
  • What was unique or innovative about this approach?
  • What limitations of theoretical perspective did you encounter?
  • What unique challenges did you face in planning your approach?
  • How did you know if you were on the right track?

What you did

  • How did you go about choosing the tools and technologies?
  • What influenced your choice of these tools and technologies?
  • What challenges did you face in selecting these tools and technology?
  • What limitations did you experience (financial, technical or organizational)?

What happened?

  • Please describe the implementation.
  • Any problems? What worked and didn’t work?
  • What monitoring processes did you put in place?
  • How did you make use of the data that was gathered?

What you learned

  • How did this innovation influence your view of teaching and learning?
  • In what ways have you changed in the way you think about your teaching?
  • How did it influence your students’ approaches to studying and learning?
  • Did it impact your understanding of your students’ studying and learning?
  • What are your successes, failures, serendipitous findings, lessons learned?
  • What would you do differently next time?


Table 1. Initial Interview Protocol


Action Research

Action research is concerned with social practice and aimed at improvement. It is a reflective and cyclical process that is systematically pursued. Furthermore, it is participatory as well as individualistic in nature. Several conditions are individually and jointly necessary for action research (Carr & Kemmis, 1986). Firstly, a project takes the form of social practice. Secondly, the project proceeds through a spiral of cycles of planning, acting, observing and reflecting, with each of these activities interrelated and systematically implemented. Thirdly, the project involves those responsible for the practice in the activity, as well as those affected by it.

Education is a form of social practice, which in the majority of cases involves interaction between teachers and students, as well as among students. Teaching and learning issues, which are at the heart of educational practice, are usually ill-defined and complex phenomena. Understanding them requires understanding a whole range of factors such as the history, attitudes, motives and biases of faculty and students alike. Positivist approaches to research are unsuited to this area of practice, as they tend to focus too narrowly on specific variables while holding others constant. Action research is better suited to this situation because it is better able to capture the richness and complexity of the social practice.

Action research is fundamentally about improving practice. It is also an iterative process that involves planning, acting, observing and reflecting. Improvement is based on lessons learned from previous iterations. Sometimes action research may seem somewhat chaotic because of the great deal of time spent at the beginning on identifying the problem or seeing a way through it, largely because of the complexity of the project and/or its ill-defined nature (Cook, 1998). Action research need not be all that imprecise, however. The action research cycle incorporates systematic observation and evaluation, and both generators and consumers of data can scrutinize processes and outcomes. Finally, action research is both a participatory and an individual activity. Group-based action research has the advantage of benefiting from group discourse, while individual problem-solving activity is based on the centrality of the reflective process (Schön, 1983).


Development of Profiles of Practice

The interview transcripts comprise the raw data. Researchers examine these transcripts to develop individual profiles of practice along the lines of the interview protocol. These are then presented to each interviewee to allow the filling of gaps in the profiles, verification of existing materials and addition of any other thoughts on the matters raised during the interview. This in itself is an iterative process and might involve further interview and consultations with interviewees. The profiles are entered onto the database only when complete agreement has been reached between the interviewee and the researchers on the content of the profiles.


Development of the Database

The database is used to generate profiles of practice for a website that is available to all University of Melbourne academics. Data is entered using a simple web-based form to populate the fields (see Table 2).










Planning                    | Doing                         | Observing                           | Reflecting
Goals and motivations       | Choice of tools               | What worked                         | Impacts on your view of teaching and learning
Approach to learning        | Influences on choice of tools | What did not work                   | Impacts on your students’ approach to studying
Limitations of the approach | Challenges in selecting tools | What criteria for success were used | What did you learn
                            | Limitations experienced       | How those criteria were measured    | What you would do differently next time


Table 2. Profiles of practice database fields


The database also records standard project details such as the names and affiliations of faculty and project leaders, the date of implementation, and the project type. Each profile also contains a brief summary of the project and current issues. There is no requirement that all fields contain data. Where appropriate, data can be provided in formats other than text, such as images, audio files or links to websites.
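The paper does not specify how the database is implemented; as a purely illustrative sketch, a single profile-of-practice record could be modelled as follows, with field names drawn from Table 2 and the project details described above, and with every field optional to reflect the rule that not all fields must contain data. The class and attribute names here are hypothetical, not taken from the actual system.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of one profile-of-practice record. Field names follow
# Table 2 and the standard project details described in the text; the actual
# schema used at the University of Melbourne is not documented in the paper.
@dataclass
class ProfileOfPractice:
    # Standard project details
    faculty: str
    department: str
    project_leaders: list[str] = field(default_factory=list)
    implemented: Optional[str] = None       # date of implementation
    project_type: Optional[str] = None
    summary: Optional[str] = None           # brief summary of the project
    # Table 2 fields, grouped by action research stage
    planning: dict[str, str] = field(default_factory=dict)    # goals, approach, limitations
    doing: dict[str, str] = field(default_factory=dict)       # tools, influences, challenges
    observing: dict[str, str] = field(default_factory=dict)   # what worked, criteria, measures
    reflecting: dict[str, str] = field(default_factory=dict)  # impacts, lessons, next time
    media_links: list[str] = field(default_factory=list)      # images, audio, web links

# Example record, loosely based on Sample Profile #1
profile = ProfileOfPractice(
    faculty="Engineering and Computer Science",
    department="Geomatics",
    summary="Multimedia-based simulations of authentic environments for survey engineering.",
)
profile.planning["Goals and motivations"] = "Design skills and seeing the bigger picture."
```

Grouping the Table 2 fields as free-form dictionaries keyed by question mirrors the paper's point that fields may be left empty and that story segments map onto the four action research stages.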

The database is used to generate a showcase of academic practice. Users of the site can display full details of individual projects (see Tables 3 – 6) or customize the display of stories by focussing on a particular action research process, group of questions, or faculty. Display options include a choice of predefined categories, as well as browseable lists and individually constructed searches.

Pre-defined lists enable the display of projects by faculty and department, as well as by each of the action research processes. Search functions provide customised views of the data that enable investigation based on specific interests. Keyword searching is also available across all fields of the database.
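The mechanics of the keyword search are not described in the paper; a minimal sketch of the behaviour described above, assuming profiles are held as simple dictionaries, might look like this (all names here are illustrative):

```python
# Hypothetical illustration of case-insensitive keyword search across all
# text fields of a profile record, including the per-stage story segments.
def keyword_search(profiles, keyword):
    """Return the profiles whose text fields contain the keyword."""
    kw = keyword.lower()
    hits = []
    for p in profiles:
        # Gather top-level text fields plus every action-research-stage answer
        text_parts = [p.get("summary", ""), p.get("faculty", "")]
        for stage in ("planning", "doing", "observing", "reflecting"):
            text_parts.extend(p.get(stage, {}).values())
        if any(kw in part.lower() for part in text_parts):
            hits.append(p)
    return hits

profiles = [
    {"faculty": "Arts", "summary": "Reason!Able critical thinking software",
     "observing": {"What worked": "visual elements"}},
    {"faculty": "Agriculture", "summary": "Generic workplace skills",
     "doing": {"Choice of tools": "GIS and expert systems"}},
]
print([p["faculty"] for p in keyword_search(profiles, "visual")])  # ['Arts']
```

A production system would normally delegate this to the database's own full-text indexing rather than scanning records in application code; the sketch only shows the search semantics the paper describes.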

Each story segment is displayed with appropriate metadata describing the corresponding project. The project metadata provides links to additional departmental databases containing technical and project information, and to details of academics involved in the project. Links are provided to live web content where this is appropriate.


A Selection of Four Typical Profiles


Faculty of Engineering and Computer Science, Department of Geomatics


Summary of Project

This comprised a number of projects concerned with the introduction and use of multimedia-based simulations of authentic environments (such as the zoo) as part of the learning environments for students of survey engineering.


Planning – Summary comment

Presenting material to students in a way that would enhance their understanding of concepts and increase their active participation in the teaching and learning context in lectures or tutorials. I talked with colleagues who had been doing a lot of work with multimedia and presentations in 3D visualization. Most of the components that we had put together over the years had to be swept aside and we had to start from scratch. We had a fairly clear plan about problem-based scenarios, based on visual spatial problems that we could put to good effect to cover an entire course.


Planning – Goals and motivations

The primary issues are design skills and seeing the bigger picture. In many cases we were teaching fairly low-level concepts in terms of chalk and talk. It is much more effective when you can present it in a very visual way, where the students can get their hands on the surveyor simulator. We were trying to put learning concepts in a problem-based, real-world environment.


Planning – Approach to teaching and learning

We saw the writing on the wall – that instead of talking about things and using static diagrams and drawings, we could put up some interactive animations, which would present these types of survey techniques in the context of a real problem. The only way to do it was by simulation.


Planning – Challenges

We had never really considered it until the teaching and learning project grants came along, because we knew the amount of support that we would need to develop this stuff was unavailable.


Doing – Summary comment

The development process, with one exception, wasn’t that difficult. My solution to this is that you monitor what people are doing reasonably closely and talk to them on a regular basis.


Doing – Choice of tools

It was partially driven by available staff. We had some very clear plans for the initial project. It would be broad-based and primarily with Java sitting behind the simulation. There were some other alternatives, but we felt that the best option was to look around at what other people had done and how they had done it.


Doing: Challenges in selecting tools

Finding people was by far the biggest problem. It took us quite some time to find people with the right skills who were willing to work here for the sort of salaries we could pay. That was one of the biggest challenges.


Doing – Limitations experienced

I’ve always tried to avoid the situation where you are developing a product in one particular calendar year and then you are implementing it in the next because I honestly think that doesn’t work.


Observing - Summary comment

I always tend to use some sort of student focus where there’s an observation process. I think that would be common amongst all the projects.


Observing - What did and did not work

I think we ended up providing a rich resource of information that was not always used effectively. We underestimated the amount of effort required to get programs up and working, to make them effective, to make them realistic, to provide the academic content with the background of all the other material on the Website.


Observing - What criteria for success were used?

I don’t think you can generalize too much. It’s a bit like – was this subject well taught? It’s a very blunt instrument that doesn’t give you any details. You can ask students standard questions about effectiveness, but they are blunt instruments. We were interested in how the simulation was impacting how teachers taught, and how students approached their study.


Observing - How those criteria were measured

I think a lot of it was anecdotal - responses from students in terms of questions and discussion, whether or not they used part of the simulation or the animations. I think that’s the sort of feedback that we’ve tended to use, rather than any formal type of evaluation.


Reflecting - Summary comment

I’ve had good experiences with groups of students sitting around the computer watching a simulation, discussing what they’re doing in a collaborative learning situation. There has been a higher level of engagement and interaction, both between the students and with the material, than we could ever have obtained using other techniques.


Reflecting - Impact on your view of teaching and learning

I think it has taught me to appreciate that students approach their learning differently. Earlier in my teaching career, I really had no concept that some students could not learn from lectures and tutorials. Certainly, more recently, having all those tools at my beck and call has really taught me that it’s a fundamental truth that students approach their learning differently.


Reflecting - Impact on your students’ studying and learning

Presentation is still very valuable, because a lot of students get very good value out of good lectures and demonstrations. You have to provide variety and alternatives for the students who don’t learn effectively in that environment. There was always the fallback position, that if any of the teaching devices didn’t work, students came and talked to me – more recently using email.


Reflecting – What you learnt

In the end, it comes down to resources, money and priorities.


Reflecting - What you would do differently next time

I think better planning of the project is the answer. Having a much clearer idea of students’ needs and student responses, because that perhaps is one of the most unpredictable sides.


Table 3. Sample Profile #1


Faculty of Agriculture and Horticulture, Institute of Land and Food Resources


Summary of Project

This project sought to introduce undergraduate students of agriculture and horticulture to issues and themes related to communication (including computer mediated communication) and other generic skills necessary for operating in the workplace.


Planning - Summary comment

Critical issues in planning for effective teaching are identifying what prospective employers in industry want and how they see our students as new employees. What are the qualities that employers are after and what do they want taken care of in the process of internal training? Also, what are the needs from the student’s perspective?


Planning - Goals and motivations

To better help the employers and the students. We take a diverse range of students, to get them as close as we can to meeting employers’ needs - the ability to work in a team, to communicate well, high IT skills are a pretty universal set of extra things that we equip students with.


Planning - Approach to teaching and learning

I have taken a systems view of it. What is the context that surrounds both the students and the employers?

I talk to employers and obtain their perspective. I see students’ performances at various stages of their careers and identify strengths and weaknesses that may be relevant in the design of the learning activities.


Planning - Challenges

The challenge has been to bring the communications and IT skills of students up to a level that is required by employers in industry. To take a very broad spectrum of students, ranging from those who are very competent through to those who have very poor communication skills or almost no knowledge or familiarity with computers, through to the level that employers are expecting.


Doing - Summary comment

We didn’t have all of the expertise to deal with the various issues, such as how do you get students to work in teams, plan out a project and organize a conference. Being subject-orientated was important, articulating some goals and getting the students excited about being part of an adventure. It was painting an exciting vision.


Doing - Choice of tools

We used various specialized packages like simulation modeling systems, GIS or expert systems and published the results on the Web. Students were using a combination of being in a real environment and a virtual environment, which is what they are going to be confronted with when they start working.


Doing - Challenges in selecting tools

Back in the early days, all we had were terminals and a central computer. We had a mixture of the virtual and the real. Later, when the Web/Internet started working better, we started using that.


Doing - Limitations experienced

We evolved from the Global Learning Environment to what is now WebRAFT. It still doesn’t do everything that we want it to do, and there is no one system, Blackboard or WebCT, that does everything we want to do.


Observing - Summary comment

We didn’t have the luxury of developing the prototype and testing it and then finally implementing it. We just had to go with a live operational subject and modify it as we went.


Observing - What did and did not work

One of our blunders was thinking that we could implement a large amount of feedback on writing, to get the writing to improve. That just generated too much of a workload for the students and the staff. Another problem was that the project was carrying a lot of content that was required in the old curriculum and it took two or three attempts before we could get an optimal balance between old and new material.


Observing - What criteria for success were used?

We looked very closely at the feedback we got from our multiple evaluations. We had a whole array on our wish list from the students’ feedback and from our own impressions of assessment of the exam performance.


Observing - How those criteria were measured

We cross-checked to see whether it was consistent or not and sometimes we got stronger signals from some channels, than from other channels.


Reflecting - Summary comment

I think the lesson for other people trying to accomplish what we did was that having good relations with people all over the university is very important.


Reflecting - Impact on your view of teaching and learning

What I got excited about was subject-oriented learning. It is not the student and it’s not the teacher, it is really the topic and unless the teacher is excited about it, the students will either get switched on or switched off. I was fascinated with the technology and multimedia - trying them out.


Reflecting - Impact on your students’ studying and learning

I think the students appreciate having the top person in the university talking to them about their particular topic and they were excited about bringing in specialist lecturers.


Reflecting - What you learnt

There is an incredible array of material out there about making learning experiences better for students. We need to be sharing it more, to enable staff to think about it. The technology will just come along, once you have got the pedagogy sorted out.


Reflecting - What you would do differently next time

If as teachers, we weren’t being measured with a ‘quality of teaching survey’ that is so immediate and doesn’t really relate to the final learning outcomes as reflected in two to three years out of the course, let alone out of the subject, it would help us focus better on the assessment and on designing the learning. I would consider changing that.


Table 4. Sample Profile #2



Faculty of Arts, Department of Philosophy


Summary of Project

Reason!Able is a stand-alone PC package designed to assist students at all levels, including those with no explicit training in logic or argument, to acquire general informal reasoning skills. We looked at research about critical thinking courses and it showed that they just weren’t having the effect that they were claimed to have, and were perhaps actually hurting students’ critical thinking.


Planning: Summary comment

We wanted to help students learn to acquire general and fundamental skills of reasoning and argument. We teach general skills that can be applied in any domain whatsoever e.g. in students’ other academic subjects and in their chosen profession.


Planning – Goals and motivations

I wanted the students to go through a certain fairly standard, straightforward routine and I needed a dynamic form. I’d initially used a HyperCard stack, but wanted to come up with a dynamic type of software tool, that would simultaneously teach the students all the concepts and procedures that they needed to know.


Planning – Approach to learning

I was fundamentally concerned with the problem that the students did not learn. They were trying to learn, not succeeding, and not reflecting on the fact that they weren’t succeeding. If you want to acquire skills, you’ve got to practice. It’s true for cognitive skills just as much as practical ones. But it’s not just any old practice; it’s got to be the right kind of appropriately guided, scaffolded, and motivating practice.


Planning – Challenges

The idea that quality practice will lead to an improvement in skills is the one thing that has been absolutely constant. What we were trying to create with the software was what I later came to call an ‘environment tool’ for quality practice in reasoning.


Doing – Summary comment

Teaching is still a cottage industry where a whole lot of people are just assumed to be able to do it and they go by unquestioned. There has not yet been the pressure to force the change, and that will happen with globalization, and corporatisation of education.


Doing – Choice of tools

We thought that maybe there were more ways of representing complex structures of reasoning, which take advantage of representational resources, which for practical reasons couldn’t be used very effectively. In Reason!Able you’ll see an almost complete switch from a Hypercard approach to an all-in-one workspace approach. All the information is at all times available on the screen.


Doing: Challenges in selecting tools

Despite difficulties, it was the first time ever that somebody had built a way of handling argumentation that made it visual, manipulable, and graphical. It’s a real turning point.


Doing - Limitations experienced

In a certain sense, we failed the grants people in our first grant, because we found out that the task was a lot more difficult and challenging. From the point of view of the project, what we said we would do at the beginning and what we delivered at the end, I don’t think it was a success. I think that we’ve succeeded at the end of 3 projects.


Observing - Summary comment

We are dealing with a huge spectrum of intuitiveness and familiarity. It is very interesting, that many university students are having difficulty, yet elementary school students are coping with it.


Observing - What did and did not work

There’s a concept of developing something that’s actually much more visual. Students have been more successful in learning how to think critically because of these visual elements. It is the visual element that is doing a lot of the work.


Observing - What criteria for success were used?

So far, our students gain three to four times as much as any other students in the world, in terms of critical thinking skills. The studies showed that Reason!Able helps people learn critical thinking.


Observing - How those criteria were measured

Last semester, we did a number of pre- and post-test studies, comparing users at the University of Melbourne with users at Monash University in Australia and McMaster University in Canada. Their results were only half as good as ours. There was something about our approach that was working much better.


Reflecting - Summary comment

It has changed me a lot, there’s no question about that. We don’t think deeply and hard enough about effectiveness and quality. We inherit a framework of practices and assumptions and we work within that so that what everybody else accepts around me is fair/reasonable practice, and that is what obviously I would accept as reasonable practice.


Reflecting - Impact on your view of teaching and learning

The software is tapping into a much larger set of brain resources than the typical ways of presenting arguments. We are getting students to use more of their brain. The software is partly visual, partly manipulable and that helps them and it makes life easier for them. It reduces the cognitive burden.


Reflecting - Impact on your students’ studying and learning

The students are being affected far more than they realize. At a deeper level, they are being exposed for the first time to what it is to be critical and what it is to have a rational opinion. They’re getting an understanding of what it takes, how complex the world is, what the issues are and how much work is involved in actually thinking through an issue.


Reflecting - What you learnt

I’m a much better teacher. If the primary role of this course is to improve critical thinking skills, then yes I think I’m a better teacher. I’m using better methods, better tools, and getting better results. If you’re in a deep and challenging project, then you’ve really got to expect that it may not pan out the way you think it will. You’ve just got to be prepared to change your direction, re-conceive the project, and ask for more money.


Table 5. Sample Profile #3



Faculty of Arts, School of Social Work


Summary of Project

LaSWOP (Law and Social Work Practice) focuses on the direct interpersonal aspects and legal dimensions of social work practice. It is a web-based interactive virtual experience that uses three case-based scenarios representing typical situations in the fields of child protection, juvenile justice and mental health. The School of Social Work has a professional practice framework of teaching. We maintain an awareness of what is happening in the field, the sorts of skills, knowledge and practice areas that are in contention. Our content is driven by what the University, the profession and the community expects of our undergraduate and postgraduate courses.


Planning – Summary comment

We had strong encouragement from the faculty and the University to incorporate multimedia into our teaching, assessment and learning. We were keen to see if, as teachers, there were different, interesting, and challenging ways to engage with students.


Planning – Goals and motivations

We wanted to allow students to engage with one set of materials but for the purposes of two subjects. We are trying to integrate theory and practice, to mirror the realities of practice, and to build a bridge between theoretical constructs and frameworks on the one hand and practical skills on the other.


Planning – Approach to learning

The project uses some different approaches to teaching, learning and assessment and that partly relates to the issue of engaging the students in the interaction.


Planning – Challenges

Trying to engage students is a big challenge. Students come with a range of academic backgrounds and a range of volunteer and paid employment in the human services area. A lot of our students are past their first qualification and several years into practical work. Quite a number are in their 30s, 40s or older, which makes them a terrific cohort for teaching purposes. It is a very rich teaching environment.


Doing – Summary comment

The driving issue was bringing together theory and practice in a challenging and interesting way that mirrored practice yet remained accessible to students.


Doing – Choice of tools

The web and its multimedia capabilities gave us a medium to do that.


Doing – Challenges in selecting tools

Finding tools that would enable us to engage with students and present a more interesting mirror of practice, a lot of which is now done by computer technology anyway.


Doing – Limitations experienced

The biggest stumbling block was time - for thinking about what we wanted to do and to negotiate with skilled people like in the Multimedia Education Unit about what they might be able to do.


Observing – Summary comment

All this effort would have had greater impact if we were able to use some spin-off from this project as part of other practice-based courses. Furthermore, we may not need state-of-the-art technology, we don't necessarily have to spend this amount of money, and we don't have to do what some ‘experts’ in Educational Technology might suggest.


Observing - What did and did not work

The students and our peer reviewers found the developed environment very useful. Clearly there were many things that were working there. For us as the developers, however, it was clear that we could not possibly sustain this level of effort and commitment to multimedia development in teaching.


Observing - What criteria for success were used?

We engaged in some post-LaSWOP discussions among ourselves and, with some research assistance, used online discussions, including an email-based feedback mechanism, to collect data.


Observing - How those criteria were measured

Students who opted to do the LaSWOP assignment or the LaSWOP process received a report outcome, which was worth 50% of the marks in my subject. Students who chose not to do it completed an alternative assignment, which was worth 40%.


Reflecting – Summary comment

We are going to have to revisit the practice scenarios from time to time, at least annually, to ensure that the practice content is correct and current. There is continuous monitoring, and that takes an enormous amount of time.


Reflecting - Impact on your view of teaching and learning

What we are trying to do is mirror practice, and practice does change. I am acutely aware of the need for my students to engage with the material. Whether it is multimedia or otherwise, it has to be current, appealing, engaging and reflective of practice. I think that has translated, in an indirect sense, into some of my other teaching in lectures and seminars.


Reflecting - Impact on your students’ studying and learning

I want to bring out the best that I can in the students and I’m aware that there will be some for whom different media, and different approaches to teaching will work better. Having used LaSWOP, students feel that they know more about the sorts of skills or the knowledge they are expected to have.


Reflecting – What you learnt

My regret is that it has taken as long as it has to get this far.


Reflecting - What you would do differently next time

In terms of resources and difficulty, it would have been much easier if we had had someone with IT knowledge and computer skills within the department. I think that if we got the grant in the same context now, we would probably get better value for money.


Table 6. Sample Profile #4


Concluding Remarks

The work described and discussed in this paper grew out of a growing call for evidence of the impacts of ICT on tertiary teaching and learning. While this question has been asked many times, answers to it have not been conclusive. There is a good deal of evidence to suggest that the use of ICT in tertiary teaching and learning has many advantages. There are also suggestions that these benefits do not justify the cost, time and effort that this kind of work entails. Many of these findings are problematic, however, because they are not based on reliable and valid research techniques. The work reported in this paper incorporates investigation techniques that depart from the commonly used approach of quantifying user perceptions with questionnaires and surveys.

Our goal in this work is to capture the experience base of practitioners using a range of data-gathering techniques grounded in the principles of naturalistic inquiry. We realize that data derived from these kinds of approaches are not easily generalizable to other contexts. Among other things, generalizability is a function of sampling, and we expect that over time this gallery will contain the amount of information and data necessary to make meaningful generalizations to similar situations and contexts. We anticipate that the gallery will grow into an extremely rich record of the experience base of not only many of our own pioneering efforts but also some of the most innovative work being undertaken in this regard at the University of Melbourne and elsewhere. A larger collection of profiles in the database would enable the examination of patterns and models of behavior among practitioners that could become the subject of further study. Relevant questions include: (a) What are the reasons for using particular approaches to teaching? (b) What are the prominent approaches to effective student study behavior? We hope that further exploration of these questions will help explain how the use of technology is affecting teaching, learning and work.


