Contrasts in learning: a collaborative evaluation by practitioners and students
This evaluation focuses on an introductory course in organisational behaviour. The course is a core first-year unit for several undergraduate Business programs and is also offered to Engineering and Science undergraduates as part of their double-degree programs. The initial emphasis of the study, a trial of online delivery of lectures, sits within a broader strategic framework to ‘renew’ (RMIT, 2000) large courses (often involving in excess of six hundred students) for flexible delivery. Other demands upon the lecturers who deliver the course include catering for diverse student demographics and, increasingly, the requirement to deliver the course in a variety of remote and off-shore (in particular Asian) locations: a fully online offering of the Organisational Behaviour course is scheduled for delivery in Vietnam in 2002 as part of a flagship Bachelor of Commerce program.
Introduction to Management, a compulsory course for Engineering students, is a lecture-only offering of the Organisational Behaviour course (in contrast, Business students also attend tutorials). It was decided to conduct the trial of online lectures with this group of eighty Engineering students, rather than the full group of Business students, as the smaller student numbers would be more manageable. As well as accommodating the imperative from policymakers of the University—to ensure that the Bachelor of Commerce was successful commercially—the researchers’ primary aim was to focus on improvement of student learning.
Although the Engineering students were the critical reference group (Wadsworth, 2000) at whom the research was ultimately aimed, the evaluation also involved another group of students: twenty-five third-year students studying a Managing Change course. These Managing Change students worked with the practitioners to gather the evaluation data about the student experience of learning online. In doing so, they were able to undertake experiential learning about change management.
Structure of courses and how the evaluation was facilitated
The collection of evaluation data was linked to assessment. Table 1 shows how the two courses were structured, how students were assessed and how the evaluation was integrated into the students’ assessment.
Table 1: Structure of Courses and Generation of the Evaluation Data
An effort was made to minimise potential barriers that could disrupt the trial:
Why were online lectures used?
The educational limitations of lectures, whatever the means of delivery, are well documented. For Ramsden (1992), lectures tend towards the transmission, or telling, of facts. Online lectures would fit Laurillard’s (1994) description of “text on-screen … and more unpleasant to read” (p. 20) than text in a book (Laurillard, 1993). However, the decision to trial online lectures in place of face-to-face lectures was based on several considerations.
First, the evaluation of online lectures was a preliminary study for the redesign of the whole Organisational Behaviour course. The renewed course would be designed around learning activities that interact with the online lectures. Trialling online lectures on their own allowed one component of the planned fully online course to be tested separately. It also allowed the lecturer to progressively undertake the professional development required to understand what is involved in co-ordinating the design and delivery of a fully online course, including effective use of online communication to support student learning.
Second, the use of commercial resources dictated a ‘lecture and topic’ format, and the decision to use Pearson’s online content was pragmatic. Inglis et al. (1999) recommend customising commercial online resources rather than expending vast resources on creating one’s own. This was certainly applicable to the Organisational Behaviour trial: there was neither the time nor the financial resources to develop new online content (although a significant degree of customisation was required to contextualise the American materials for an Australian setting).
Finally, it was thought that retaining the lecture format would assist students and other stakeholders to make the transition (Bridges, 1966) to ‘online’. Continuing the familiar educational approach of communicating theory (lectures) followed by formative self-assessment (online quizzes) provided a link between the known lecture-tutorial setting and the unknown environment of online learning.
Aim of the Evaluation
The aim of the evaluation was to establish whether lectures based on commercial online materials could adequately meet students’ learning and other needs. The evaluation of Pearson Education’s Organisational Behaviour course was formative: it provided an opportunity to pilot some components of the future fully online course and inform its design. However, it was also intended to “address and illuminate the complex array of questions” (Patton, 1990, p. 119) involved in a change from face-to-face to online delivery.
The evaluation approach sits at the participatory extreme of the continuum for evaluating educational technology (Oliver, 2000). The use of a practical form of action research (Kemmis, 2000) enables improved student learning through reflection on practice (Schön, 1987). Given the power relationship between student and academic, a key challenge to participatory action research remains: how is genuine participation possible when students are themselves the subject of the research? The practitioners’ decision to collaborate on the evaluation with the Managing Change students “as part of a negotiated assessment task” (Phillips, Tripp, Rice, Bain and McNaught, 2000) aimed to make the evaluation truly learner-centred. The Managing Change evaluators, being students, were considered to be free of any assumptions or bias that the practitioners might bring to the research. Any emerging theory “derived from the data” (Glaser and Strauss, 1967; Turner, 1983) was felt to be grounded in the true experience of the learners: the students studying the online lectures.
Data Collection and Stages of Analysis
Figure 1 shows how data was collected from the Engineering students and the stages of analysing this data. Managing Change students collected evaluative data directly from the Engineering students, using observation, interviews, questionnaires, focus groups and the study of online discussion logs. The data was then analysed according to the research aims established during each syndicate’s project planning. Later, a former Managing Change student used the evaluation reports to summarise the data into themes. The academic then conducted subsequent analysis, informed by a wider study and by the phenomenographic research literature.
The evaluation co-ordinator of the university’s Learning Technology Services was invited to give a workshop with the eight Managing Change syndicates. This was based around a planning sheet for the students to design the what and how of their evaluations. The Managing Change curriculum emphasised evaluation as part of the change process. A collaborative action inquiry approach was encouraged by booking the computer lab, dedicated to engineering student use, opposite the Managing Change classroom. Timings were also co-ordinated so that each syndicate could meet their sample of about ten engineering students over lunch.
However, the Managing Change students, who had all previously studied Organisation Research Methods, were free to choose their research approach. Their aims covered technological, social, educational and organisational issues within the overall aims of the practitioner study. A range of research approaches was adopted, with all but one syndicate combining quantitative and qualitative methods. Findings were also role-played effectively, using direct quotations, in the presentation of one report. Other syndicates were able to make recommendations for improvement based on the issues they highlighted. The most comprehensive evaluation, and the one most revealing of the underlying reasons for the non-acceptance of flexible learning by overseas students in particular, was the four cycles of action research carried out by one syndicate. This syndicate, itself made up of overseas students, also won the Pearson Education Award, which the practitioners had negotiated as an incentive for student involvement in the research.
Evaluation of the Findings by Managing Change Students
The evaluation reports by Managing Change students highlighted two main issues: first, the technological and logistical issues involved in a change of lecture delivery format; and second, the desire of students to learn through social interaction, especially for organisational behaviour concepts, which are grounded in social science.
The student evaluation data was broadly categorised into themes relating to the technology and themes about the quality of teaching and learning. The technological themes included:
Themes around the teaching and learning quality were couched in different ways in the evaluation reports of the Managing Change students. These issues were categorised as follows:
As reported by the Managing Change students, the Engineering students’ lack of theoretical understanding of organisational behaviour was seen, especially by international students, to result from a lack of interaction with a lecturer in a face-to-face setting. Related to this was the use of the discussion board: it was used primarily for posting comments about the experience of online learning, rather than for facilitating discussion about organisational behaviour. In addition, the quizzes only encouraged students to skim-read the lectures in order to pick out the correct answers.
The significance of these findings, in terms of the emerging theories discussed below, can be seen in the degree and depth of learning for the various parties collaborating in this evaluation.
Contrasts in Learning
Learning by Engineering Students
As suggested by the evaluation data, the Engineering students did not gain the understanding of organisational behaviour concepts required to apply the theory to practice. The quizzes, intended for formative self-assessment, did not test the development of such understanding. Although the discussion board also contained questions intended to prompt students to apply theory to practice, contributing to these discussions was optional. As the co-ordinator of the course, with no allocated ‘teaching’ time, the academic barely had time to respond to and act upon comments about the experience of learning online, let alone time to encourage and motivate students to participate actively in discussion of the lecture content. Ironically, the mentor did have more time to engage in the online discussions and respond to the Engineering students’ concerns; however, the mentor was not a ‘tutor’ with expertise in organisational behaviour and did not feel it appropriate to respond to questions about the online lectures.
The findings suggest that the Engineering students were reacting to the change from face-to-face to online delivery of lectures. From their perspective, the change removed the value of face-to-face lectures (the opportunity to interact, at least minimally, with a teacher). Further, this opportunity for interaction was not satisfactorily replaced with appropriate, online-assisted communication. In addition, many had to contend with technical problems and the additional costs of printing (which, for the international students, were on top of tuition fees). Although the ability to be a self-directed learner is essential for a course delivered online, the Engineering students were not given assistance in developing these skills, and it was acknowledged only in passing that self-directed learning was even required.
Despite the students’ general acknowledgement of the flexibility afforded by the medium, these other factors appear to have led to a devaluing of the online experience by the Engineering students, who not surprisingly adopted a surface approach (Biggs, 1999) to learning. In the final examination, the students could reproduce answers to the multiple-choice questions; however, in the short-essay questions they generally had difficulty applying organisational behaviour concepts to practice, which is the key learning objective of the course.
Learning by Managing Change Students
The Managing Change students were able to learn deeply (Biggs, 1999) and actively (Revans, 1982; Pedler, 1997) about managing change through the experience (Kolb, 1984) of evaluating the change. The students’ journals, reflecting on the experience of the evaluation, also documented the transformative learning (Mezirow, 1991; Cranton, 1999) that resulted from the experience. The Managing Change students were able to shift from being teacher-centred to student-centred, as the course involved working collaboratively in self-directed teams with close facilitation by the academic and detailed, regular peer and academic feedback. Several of the syndicates said that it was the most stimulating and intellectually challenging course of their university careers; the quality of the evaluation reports testifies to this.
Learning by the Practitioners
For the practitioners, learning led to improved practice and was multi-levelled. Initially, learning for the practitioners was specific to managing the technological and administrative logistics of an online course. Student learning was espoused (Argyris, 1985) as the main objective of the evaluation. However, available time and energy were largely directed at managing the logistics of planning and delivering the first ‘lecture only’ stage of the course redesign (Creese and Kemelfield, 2000). This learning, as has been seen through this evaluation, was important because students’ perception of such contextual issues does affect their approach to, and outcomes of, learning (Prosser and Trigwell, 1999). Once familiar with the logistical requirements of online delivery, the practitioners were then able to direct their attention to the more important issue of pedagogy.
The key transformational learning was the shift in the practitioners’ understanding whereby they could fully comprehend the meaning emerging from the student evaluation data: namely, the students’ need to understand concepts through social interaction, and the value of appropriate, interactive use of the computer (Laurillard, 1993, 1994). The lesson here is that online content cannot be developed in isolation, without considering the process of interaction that leads to learning. The decision to conduct the full course redesign in stages, with the first stage focusing on lectures only, meant that during the evaluation the practitioners’ focus was more on logistics than on issues of student learning. Interactive activities, designed to engage students in learning, were planned for the second stage of the full course redesign, but clearly should have been present even at this early stage, especially given the educational limitations of the lecture format.
Lastly, the practitioners’ experiences are representative of the organisation’s need to learn (Senge, 1991) and to better manage demands on staff. For an academic to concentrate on issues of teaching and learning, provision of sufficient resources is essential. In addition, there is a need to recognise that repurposing online content from a commercial provider does not necessarily save time. The convenience of having Pearson’s content in an electronic, ‘ready-to-use’ format was significantly outweighed by the problems of importing this content into an incompatible learning management system and the need to customise it for the local context.
The type of transformative learning experienced through the partnership between the practitioners and the Managing Change students in this evaluation is the ultimate aim of the redesign of the Organisational Behaviour course. Learning by the practitioners, as a result of the evaluation, has prompted cyclical improvements to the whole course redesign, enabling this type of learning to occur through the building of collaborative virtual learning communities (Palloff and Pratt, 1999). The main learning activity has been the establishment of virtual peer learning teams as part of the ‘classroom as organisation’ model (Tyson, 1999).

The case demonstrates the validity of using students as co-researchers to achieve a truly learner-centred evaluation: without the Managing Change students, much of the practitioner learning would not have been possible. The pedagogical message to emerge from the evaluation data was that, technological issues aside, students want to understand concepts fully and need to do so through a process of social interaction. However, the practitioners were initially too overwhelmed by logistics to be able to ‘see’ this. Once they comprehended the implications of the student evaluation data, the practitioners could act. The organisational lesson of the evaluation is the need for sufficient support, in terms of resources, so that practitioners can concentrate on their teaching and educational vision during both the design and delivery of an online course.
Copyright by the International Forum of Educational Technology & Society (IFETS). The authors and the forum jointly retain the copyright of the articles. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear the full citation on the first page. Copyrights for components of this work owned by others than IFETS must be honoured. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from the authors of the articles you wish to copy or firstname.lastname@example.org.