Educational Technology & Society 5 (3) 2002
ISSN 1436-4522

Contrasts in learning: a collaborative evaluation by practitioners and students

Liz Creese
School of Management
RMIT University, GPO Box 2476V
Melbourne 3001, Victoria, Australia
 Tel: +61 3 9925 5958
Fax: +61 3 9925 5580
elizabeth.creese@rmit.edu.au

Jane Kemelfield
Learning Technology Services
GPO Box 2476V
Melbourne 3001, Victoria, Australia
Tel. +61 3 9925 9550
Fax. +61 3 9925 9625
jane.kemelfield@rmit.edu.au

 

ABSTRACT

This paper documents some of the learning emanating from a learner-centred evaluation of a change from face-to-face to online lectures in an Organisational Behaviour course at RMIT University. An academic and learning technology mentor conducted the evaluation, working as co-researchers with ‘Managing Change’ students. Primarily, the evaluation aimed to illuminate the experience of using the online lectures, as perceived by Engineering students. As preliminary research to a wider study, the evaluation was also formative. The findings suggest that the Engineering students appreciated the flexibility of online lectures. However, they devalued the online experience because it lacked the interaction of face-to-face lectures and consequently adopted a surface approach to learning. The strong message emerging from the student evaluation data was the importance of social interaction for understanding organisational behaviour concepts. The practitioners could not initially ‘see’ this message, such were the demands of managing the logistics of the change to online delivery and the lack of adequate organisational support. The surface level of learning for the Engineering students is starkly contrasted with the deep level of learning for the researchers, leading to transformation for the Managing Change students and eventually the practitioners.

Keywords: Participant action research, Online lectures, Organisational behaviour, Communities of learners, Approaches to learning


Context

This evaluation is focussed around an introductory course in organisational behaviour. The course is a core first-year unit for several undergraduate Business programs, as well as being offered to Engineering and Science undergraduates as part of their double-degree program. The initial emphasis of the study—to trial the delivery of lectures online—sits within a broader strategic framework to ‘renew’ (RMIT, 2000) large courses (often involving in excess of six hundred students) for flexible delivery. Other demands upon the lecturers who deliver the course include catering for diverse student demographics and, increasingly, the requirement to deliver the course in a variety of remote and off-shore (in particular Asian) locations: a fully online offering of the Organisational Behaviour course is scheduled for delivery in Vietnam in 2002 as part of a flagship Bachelor of Commerce program.

Introduction to Management, a compulsory course for Engineering students, is a lecture-only offering of the Organisational Behaviour course (in contrast, Business students also attend tutorials). It was decided to conduct the trial of online lectures with this group of eighty Engineering students, rather than the full group of Business students, as the smaller student numbers would be more manageable. As well as accommodating the imperative from policymakers of the University—to ensure that the Bachelor of Commerce was successful commercially—the researchers’ primary aim was to focus on improvement of student learning.

Although the Engineering students were the critical reference group (Wadsworth, 2000) at whom the research was ultimately aimed, the evaluation also involved another group of students: twenty-five third-year students studying a Managing Change course. These Managing Change students worked with the practitioners to gather the evaluation data about the student experience of learning online. In doing so, they were able to undertake experiential learning about change management.

 

Structure of courses and how the evaluation was facilitated

The collection of evaluation data was linked to assessment. Table 1 shows how the two courses were structured, how students were assessed and how the evaluation was integrated into the students’ assessment.

 

The Managing Change course

Learning Resources

  • An introduction to evaluation concepts and approaches from the Evaluation Coordinator of RMIT’s Learning Technology Services.
  • Text books and articles.
  • Discussion and feedback on evaluation proposals and reports.

Assessment

  • The evaluation of the online lectures for Introduction to Management was integrated into the Managing Change course as the major assessment, a group project. The students formed syndicates and each syndicate was allocated a group of Engineering students to evaluate. (60%)
  • Reflective Journal (40%)

The Introduction to Management course

Learning Resources

  • Online lecture content provided by Pearson Education.
  • Prescribed text book.

Assessment

Formative assessment, which provided evaluative data for the Managing Change students:

  • Quizzes (also provided by Pearson). Marks were allocated for attempting a quiz, rather than for obtaining correct answers. (22%)
  • Discussion Board. Used to reflect upon the experience of learning online, with Engineering students required to post concerns and suggestions for improvements. (8%)

Summative assessment, which was not available as evaluative data to the Managing Change students for their final project report:

  • Final examination (70%). Required students to demonstrate knowledge of theory (multiple choice questions) and ability to apply theory (short essay questions).

Table 1: Structure of Courses and Generation of the Evaluation Data

 

An effort was made to ensure that potential barriers which could disrupt the trial were minimised:

  • A weekly computer laboratory session was provided for the Engineering students to ensure that all students had access to the online lectures.
  • To ensure interaction between the Managing Change and Engineering students (essential for the Managing Change students to collect evaluation data from the Engineering students), classes/laboratory sessions for both groups were held at the same time and in nearby rooms.
  • At the start of semester, Engineering students were required to attend an induction session. This ensured that students were able to login and that they knew how to navigate through and use the online learning environment.
  • To address the predicted anxiety involved in the change to online lectures, Engineering students were given printed instructions about the online environment, including information on how to contact the mentor and the ‘help desk’ for assistance. In addition, the online discussion board provided a means for students to convey problems, allowing the mentor or academic an opportunity to address some issues immediately.

 

Why were online lectures used?

The educational limitations of lectures, whatever the means of delivery, are well-documented. For Ramsden (1992), lectures tend towards the transmission or telling of facts. Online lectures would fit Laurillard’s (1994) description of “text on-screen … and more unpleasant to read” (p.20) than text in a book (Laurillard, 1993). However, the decision to trial online lectures in place of the face-to-face lectures was based upon several reasons.

First, the evaluation of online lectures was the preliminary study to the redesign of the whole Organisational Behaviour course. The renewed course would be based on a rationale of learning activities to interact with the online lectures. This initial trial – of online lectures only – allowed one component of the planned fully online course to be trialled separately. This allowed the lecturer to progressively undertake the professional development required to understand what was involved in co-ordinating the design and delivery of a fully online course, including effective use of online communication to support student learning.

Secondly, the use of commercial resources dictated a ‘lecture and topic’ format, and the decision to use Pearson’s online content was pragmatic. Inglis et al. (1999) recommend customisation of commercial online resources, rather than expending vast resources in creating one’s own. This was certainly applicable to the Organisational Behaviour trial; there was not the time or financial resources to develop new online content (although a significant degree of customisation was required to contextualise the American materials for an Australian setting).

Finally, it was thought that using the lecture format would assist students and other stakeholders to make the transition (Bridges, 1966) to ‘online’. Continuing the familiar educational approach of communication of theory (lectures) followed by formative self-assessment (online quizzes) provided a linkage between the known lecture-tutorial setting and the unknown environment of online learning.

 

Methodology

Aim of the Evaluation

The aim of the evaluation was to establish whether lectures which used commercial online materials could adequately meet students’ learning and other needs. The evaluation of Pearson Education’s Organisational Behaviour course was formative: it provided an opportunity to pilot some components of the future fully online course and inform its design. However, it was also to “address and illuminate the complex array of questions” (Patton, 1990, p. 119) involved in a change from face-to-face to online delivery.

 

Evaluation approach

The evaluation approach sits on the extreme participatory end of the evaluation of educational technology continuum (Oliver, 2000). The use of a practical form of action research (Kemmis, 2000) enables improved student learning through reflection on practice (Schon, 1987). Given the power relationship between the student and the academic, a key challenge to participatory action research remains: how is it possible to achieve genuine participation with students when they are the subject of the research? The decision for the practitioners to collaborate on the evaluation, with the Managing Change students “as part of a negotiated assessment task” (Phillips, Tripp, Rice, Bain and McNaught, 2000), aimed to make the evaluation truly learner-centred. The Managing Change evaluators, being students, were considered to be free of any assumptions or bias that the practitioners might bring to the research. Any emerging theory “derived from the data” (Glaser and Strauss, 1967; Turner, 1983) was felt to be grounded in the true experience of the learners, the students studying the online lectures.

 

Data Collection and Stages of Analysis

Figure 1 shows how data was collected from the Engineering students and the stages of analysing this data. Managing Change students collected evaluative data directly from the Engineering students, using observation, interviews, questionnaires, focus groups and the study of online discussion logs. The data was then analysed according to the research aims established during each syndicate’s project planning. Later, a former Managing Change student used the evaluation reports to summarise the data into themes. The academic then conducted subsequent analysis, informed by a wider study and research in phenomenographic literature.


Figure 1: Data Collection and Stages of Analysis

 

The evaluation co-ordinator of the university’s Learning Technology Services was invited to give a workshop to the eight Managing Change syndicates. This was based around a planning sheet for the students to design the ‘what’ and ‘how’ of their evaluations. The Managing Change curriculum emphasised evaluation as part of the change process. A collaborative action inquiry approach was encouraged by booking the computer laboratory dedicated to Engineering student use opposite the Managing Change classroom. Timetables were also co-ordinated so that each syndicate could meet its sample of about ten Engineering students over lunch.

However, the Managing Change students, who had all previously studied Organisation Research Methods, were free to choose their own research approaches. Their aims covered technological, social, educational and organisational issues within the overall aims of the practitioner study. A range of research approaches was adopted, with all but one syndicate combining quantitative and qualitative methods. One syndicate role-played its findings effectively, using direct quotations in the presentation of its report, while other syndicates made recommendations for improvement based on the issues they had highlighted. The most comprehensive evaluation, and the most revealing of the underlying reasons for the non-acceptance of flexible learning by overseas students in particular, was the four cycles of action research carried out by a syndicate made up of overseas students. This syndicate also won the Pearson Education Award, negotiated by the practitioners as an incentive for student involvement in the research.

 

Evaluation of the Findings by Managing Change Students

The evaluation reports by Managing Change students highlighted two main issues: firstly, the technological and logistical issues involved in a change of lecture delivery format, and secondly, the desire of students to learn through social interaction, especially organisational behaviour concepts which are social science based.

The student evaluation data was broadly categorised into themes relating to the technology and themes about the quality of teaching and learning. The technological themes included:

  • usability issues with the Web interface (in many cases this could be attributed to poor information technology literacy);
  • issues with access, in particular the authentication process; and
  • an appreciation of the flexibility afforded by the online environment.

Themes around the teaching and learning quality were couched in different ways in the evaluation reports of the Managing Change students. These issues were categorised as follows:

  • lack of understanding of theory and the inability to apply theory;
  • lack of social interaction between teacher and student;
  • lack of social interaction between peers;
  • failure to use discussion boards to explore the lecture content; and
  • inadequacy of quizzes for developing an understanding of organisational behaviour.

As reported by the Managing Change students, the Engineering students’ lack of theoretical understanding of organisational behaviour was seen, especially by international students, to be the result of a lack of interaction with a lecturer in a face-to-face setting. Related to this was the use of the discussion board. It was used primarily for posting comments about the experience of online learning, rather than for facilitating a discussion about organisational behaviour. In addition, the quizzes only encouraged students to skim-read the lectures in order to pick out the correct answers.

The significance of these findings, in terms of the emerging theories discussed below, can be seen in the degree and depth of learning for the various parties collaborating in this evaluation.

 

Contrasts in Learning

Learning by Engineering Students

As suggested by the evaluation data, the Engineering students did not gain a sufficient understanding of the organisational behaviour concepts, as is required to be able to generally apply this theory to practice. The quizzes, intended for formative self-assessment, did not test the development of such understanding. Although the discussion board also contained questions intended to prompt students to apply theory to practice, contributing to these discussions was optional. As the co-ordinator of the course, with no allocated ‘teaching’ time, the academic barely had time to respond to and act upon the comments about the experience of learning online, let alone the time to encourage and motivate students to actively participate in the discussion of lecture content. Ironically, the mentor did have more time to engage in the online discussions and respond to the concerns of the Engineering students; however, the mentor was not a ‘tutor’ with expertise in organisational behaviour and did not feel it was appropriate to respond to questions about the content of the online lectures.

The findings suggest that the Engineering students were reacting to the change from face-to-face to online delivery of lectures. From their perspective, the change removed the value of face-to-face lectures (the opportunity to interact, at least minimally, with a teacher). Further, this opportunity for interaction was not satisfactorily replaced with appropriate, online-assisted communication. In addition, many had to contend with technical problems and the additional costs of printing (which, for the international students, were on top of tuition fees). Although the ability to be a self-directed learner is essential for a course delivered online, the Engineering students were not given assistance in developing these skills; and it was acknowledged only in passing that self-directed learning was even required.

Despite the general acknowledgement by students of the flexibility afforded by the medium, these other factors appear to have led to a devaluing of the online experience by the Engineering students, who not surprisingly adopted a surface approach (Biggs, 1999) to learning. In the final examination, the students could reproduce answers to the multiple-choice questions; however, in the short essay questions, they generally had difficulty applying organisational behaviour concepts to practice, which is the key learning objective of the course.

 

Learning by Managing Change Students

The Managing Change students were able to learn deeply (Biggs, 1999) and actively (Revans, 1982; Pedler, 1997) about managing change through the experience (Kolb, 1984) of evaluating the change. The students’ journals, reflecting on the experience of the evaluation, also documented the transformative learning (Mezirow, 1991; Cranton, 1994) that resulted from the experience. Managing Change students were able to shift from being teacher-centred to student-centred, as the course comprised working collaboratively in self-directed teams, with very close facilitation by the academic and extremely detailed and regular peer and academic feedback. Several of the syndicates said that it was the most stimulating and intellectually challenging course of their university careers. The quality of the evaluation reports testifies to this.

 

Learning by the Practitioners

For the practitioners, learning led to improved practice and was multi-levelled. Initially, learning for the practitioners was specific to managing the technological and administrative logistics of an online course. Student learning was espoused (Argyris, 1985) as the main objective of the evaluation. However, available time and energy were largely directed at managing the logistics of planning and delivering the first ‘lecture only’ stage of the course redesign (Creese and Kemelfield, 2000). This learning, as has been seen through this evaluation, was important because students’ perception of such contextual issues does affect their approach to and outcomes of learning (Prosser and Trigwell, 1999). Once familiar with the logistical requirements of online delivery, the practitioners were then able to direct their attention to the more important issue of pedagogy.

The key transformational learning was the shift in the practitioners’ understanding, whereby they could fully comprehend the meaning emerging from the student evaluation data: namely, the students’ need to understand concepts through social interaction and the value of appropriate, interactive use of the computer (Laurillard, 1993, 1994). The lesson here is that online content cannot be developed in isolation, without considering the process of interaction that leads to learning. The decision to conduct the full course redesign in stages, with the first stage focussing on lectures only, meant that during the evaluation the practitioners’ focus was more on logistics than on issues of student learning. Interactive activities, designed to engage students in learning, were planned for the second stage of the full course redesign, but clearly should have been present even at this early stage, especially given the educational limitations of the lecture format.

Lastly, the practitioners’ experiences are representative of the organisation’s need to learn (Senge,1991) and better manage demands on staff. For an academic to concentrate on issues of teaching and learning, provision of sufficient resources is essential. In addition, there is a need to recognise that repurposing online content from a commercial provider does not necessarily provide time savings. The convenience of having Pearson’s content in an electronic, ‘ready-to-use’ format was significantly outweighed by the problems of implementing this content into a non-compatible learning management system and the need to customise for the local context.

 

Conclusion

The type of transformative learning, experienced through the partnership between practitioners and the Managing Change students in this evaluation, is the ultimate aim for the redesign of the Organisational Behaviour course. Learning by the practitioners, as a result of the evaluation, has prompted cyclical improvements to the whole course redesign which have enabled this type of learning to occur through the building of collaborative virtual learning communities (Palloff and Pratt, 1999). The main learning activity has been the establishment of virtual peer learning teams as part of the ‘classroom as organisation’ model (Tyson, 1996). The case demonstrates the validity of using students as co-researchers to achieve a truly learner-centred evaluation: without the Managing Change students, much of the practitioner learning would not have been possible.

The pedagogical message to emerge from the evaluation data was that, technological issues aside, students want to understand concepts fully and need to do this through a process of social interaction. However, the practitioners were initially too overwhelmed by logistics to be able to ‘see’ this. Once they were able to comprehend the implication of the student evaluation data, the practitioners could act. The organisational lesson of the evaluation is the need for sufficient support, in terms of resources, so that practitioners can concentrate on their teaching and educational vision during both the design and delivery of an online course.

 

References

  • Argyris, C. (1985). Strategy, Change and Defensive Routines, Boston: Pitman.
  • Biggs, J. (1999). Teaching for Quality Learning at University, Buckingham: Open University Press.
  • Bridges, W. (1966). Transitions: making sense of life’s changes, London: Nicholas Brealey.
  • Cranton, P. (1994). Understanding and Promoting Transformative Learning, A Guide for Adult Learners, San Francisco: Jossey-Bass.
  • Creese, E., & Kemelfield, J. (2000). Evaluating Online Lectures for First Year Students, in conjunction with Experiential Learning for Third Year Students: An Exploratory Case Study. Proceedings of the Seventeenth Annual Conference of the Australian Society for Computers in Learning and Tertiary Education, University of Southern Queensland.
  • Glaser, B. G., & Strauss, A. L. (1967). The Discovery of Grounded Theory, Chicago: Aldine Publishing Company.
  • Inglis, A., Ling, P., & Joosten, V. (1999). Delivering Digitally – Managing the Transition to the Knowledge Media, London: Kogan Page Ltd.
  • Kemmis, S. (2000). Exploring the relevance of critical theory for action research. In Reason, P. & Bradbury, H. (Eds.) International Handbook of Action Research, London: Sage.
  • Kolb, D. A. (1984). Experiential Learning, New Jersey: Prentice-Hall.
  • Laurillard, D. (1994). Multimedia and the changing Experience of the Learner. Paper presented at the Asia Pacific Information Technology in Training and Education Conference and Exhibition, June 28-July 2, Brisbane, Australia.
  • Laurillard, D. (1993). Rethinking University Teaching. A framework for the effective use of educational technology, London: Routledge.
  • Mezirow, J. (1991). Transformative dimensions of adult learning, California: Jossey-Bass.
  • Oliver, M. (2000). An Introduction to the Evaluation of Learning Technology. Educational Technology and Society, 3 (4), http://ifets.ieee.org/periodical/_vol4_2000/intro.html.
  • Palloff, R. M., & Pratt, K. (1999). Building Learning Communities in Cyberspace: effective strategies for the online classroom, San Francisco: Jossey-Bass.
  • Patton, M. Q. (1990). Qualitative Evaluation and Research Methods, California: Sage.
  • Pedler, M. (1997). Action Learning in Practice, 2nd edition, Brookfield, VT: Gower.
  • Phillips, R., Bain, J., McNaught, C., Rice, M., & Tripp, D. (2000). Handbook for Learning-centred Evaluation of Computer-facilitated Learning Projects in Higher Education, http://cleo.murdoch.edu.au/projects/cutsd99/handbook/handbook.htm.
  • Prosser, M., & Trigwell, K. (1999). Understanding Teaching and Learning, Buckingham: Open University Press.
  • Ramsden, P. (1992). Learning to teach in Higher Education, London: Routledge.
  • Revans, R. W. (1982). The Origin and Growth of Action Learning, London: Chartwell Bratt.
  • RMIT University (2000). RMIT University – Strategic Plan 1998-2002, Melbourne.
  • Schon, D. A. (1987). Educating the reflective practitioner, San Francisco: Jossey-Bass.
  • Senge, P. (1991). The Fifth Discipline: the Art and Practice of the Learning Organisation, London: Century Business.
  • Turner, B. A. (1983). The use of grounded theory for the qualitative analysis of organization behaviour. Journal of Management Studies, 20 (3), 333-348.
  • Tyson, T. (1996). Active Learning in a Management School – A Radical Approach. Unpublished paper presented at the 2nd Pacific Rim Conference on ‘First Year in Higher Education – Transition to Active Learning’, 3-5 July 1996, Centre for the Study of Higher Education, The University of Melbourne, Australia.
  • Wadsworth, Y. (1991). Everyday evaluation on the run, Melbourne: Action Research Issues Association.



Copyright message

Copyright by the International Forum of Educational Technology & Society (IFETS). The authors and the forum jointly retain the copyright of the articles. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear the full citation on the first page. Copyrights for components of this work owned by others than IFETS must be honoured. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from the authors of the articles you wish to copy or kinshuk@massey.ac.nz.