A Large-scale ‘local’ evaluation of students’ learning experiences using virtual learning environments
Julie Ann Richardson
In 1997-8 Staffordshire University introduced two Virtual Learning Environments (VLEs), Lotus Learning Space and COSE (Creation of Study Environments), as part of its commitment to distributed learning. An ambitious and wide-reaching evaluation model has now been designed to appraise the quality of students’ learning experiences using these VLEs. The intention has not been to compare the environments, but to focus upon their common elements. The aims of this project are as follows:
Using the vocabulary presented by Oliver (1998), the evaluation can be considered to be a hybrid system, with formative, summative and illuminative elements. The backbone of the model is a number of measuring instruments that were fitted into and around the educational process during one semester, beginning in January 1999.
This paper provides an overview of the process of designing and implementing the model. First, the model and the various instruments used to evaluate the different elements are described. Second, the evaluation method and key findings are discussed. Finally the model is evaluated in light of the processes and findings from the study.
Evaluating learning with VLEs: designing a model
The model is based on the premise that learning should be student-centred, and thus any evaluation should also be student-centred. This requires a model that works outwards towards the factors and influences which contribute to the learning experience. The quality of any learning experience is dependent upon an intricate interaction between the experiences, characteristics and attitudes that a student brings with them, and the attributes of the ‘task environment’ (Pask, 1976). Thus any evaluation must be capable of identifying individual differences that may play an important role as well as those attributes of the task setting (in this case, within Virtual Learning Environments) that may interact with them.
Figure 1 represents the interaction between the student and the task environment. The centre of the model represents the student. At the core are the characteristics and factors that are most likely to be stable across different learning experiences. Moving outwards are the less fixed strategies, skills, knowledge, etc. that continually develop and change. Boxes 1 and 2 represent the learning and reading processes that naturally interact and contribute to the experience. The remainder of the model highlights the elements ‘external’ to the student which are likely to influence the quality of their experience: the task attributes; subject and university culture; and tutor experience/subject expertise. The arrows between the different elements suggest possible interactions. For instance, the types of embedded support devices (ESDs) provided within the VLE may be more or less successful depending on a learner’s cognitive style or motivational orientation.
Figure 1. The interaction between the student and task environment
Because of limitations of space it is not possible to give a full rationale for each of the elements of the model; for a fuller discussion, the reader is referred to Richardson (1999). The emphasis in this paper is on an overview of the model and the key findings from the evaluation. Tables 1 and 2 provide a summary of the student and task environment, the key questions/aims they focus upon, and the corresponding evaluation instruments.
Table 1. Summary of the ‘student’ model and evaluation instruments
The Task Environment
The task environment can be described as five integrated parts, shown in Figure 1 as the content model; the support model; the teaching and learning path; opportunities for discourse and communication; and opportunities for higher-order thinking, all of which contribute to the quality of the learning experience. Table 2 summarises the aims and measures associated with each part.
Table 2. Summary of Task Environment and Evaluation Instruments
Research Overview and Method
As suggested in the introduction, the approach taken to the evaluation can be described as a hybrid system with formative, summative, and illuminative elements; consequently, it draws on several methodologies. It is formative because its aim was not to provide a pre- to post-intervention comparison, but to tease out information about the features of students’ learning experiences as they progressed, in order to provide a basis for future, more effective use of VLEs as a resource for learning and teaching. This was achieved by placing phenomenographic (Fransella, 1981) interviews of students and tutors at the centre of the data-collection process, and using additional instruments to support the phenomenographic data and enable the move from description to explanation. In this respect, the evaluation also has summative elements, because it includes judgements based on measures of students’ characteristics such as cognitive style, motivational orientation, and time-management effectiveness.
In order to gain as comprehensive a picture of VLE usage as possible, nine modules being delivered via Lotus Learning Space or COSE were included. All students on these modules were given the opportunity to opt out of the study; 15 did so, leaving a total of 292 participants. Students were told about the evaluation in advance, during introductory lectures, and were then given the questionnaire, which was returned with a response rate of 88%. All tutors involved in developing and running modules on the VLEs were invited to take part (n = 29); 5 chose to opt out. For reasons of confidentiality, comparisons between subjects, or between COSE and Lotus Learning Space, are not discussed here, but the authors intend to take this question further in future papers.
The participants were involved in different ways. Some of the instruments were designed for all students to use, e.g. the 5-page questionnaire, which included questions on educational background, I.T. competency, time-management practices, motivational orientation, perception of transferable skills, and attitudes toward using VLEs. This was administered approximately halfway through the semester. In addition, all students were encouraged to assess themselves using the Cognitive Styles Analysis (Riding, 1991), which was made available to students throughout the semester. Other measures, such as the in-depth interviews and observations, were restricted to around three students per module (a total of 29 volunteer student interviewees, interviewed at the start and end of the module). In addition, all tutors who agreed to take part were interviewed at the start and end of the module.
As the evaluation generated a substantial amount of data from a range of sources, this paper uses a series of key questions to structure the results. This section will consider whether there was evidence that:
Can students’ learning experiences be described as positive?
This central question focuses upon the students’ perceptions of their learning experiences. It is discussed using two sources of data: phenomenographic student interviews and questionnaire responses where students were asked their opinions of learning through VLEs.
As emphasised throughout the paper, the student is at the heart of the evaluation model. To understand their experiences of working with VLEs as fully as possible, a minimum of three students per module were interviewed in depth (interviews lasted 70 minutes on average) following a phenomenographic method (see, e.g., Marton, 1981). The approach to the analysis was adapted from the techniques typical of phenomenographic inquiry, particularly those of Marton (1981), combined with some of those used in grounded theory, to produce a systematic template and model for guiding future interview interpretations.
By the end of the interview analysis there were 11 major themes (judged as most frequent and strongest) and 6 minor themes (occurring in no less than half of the interviews) describing students’ experiences. (For more details of this method, see Richardson, 1999.) For the question of whether students had a positive learning experience, the results are drawn from the major themes where students expressed their general experiences. The minor themes are picked up later. Each of the supporting quotations is given a module reference number.
Overall perception of VLEs
The overall perceptions of using virtual learning environments as a means of studying were diverse. For example, some students felt,
(Module #3 student)
However, some of the overall impressions were less positive.
(Module #5 student)
This theme was one of the most diverse.
VLEs as supporting materials rather than as replacement
This view appeared throughout the interviews. Most students had a positive perception of the materials being provided through VLEs, however, they clearly expressed a preference for them being ‘supporting’ materials rather than being a ‘replacement’. More specifically, they found the materials useful for directing their self-study, for distributing resources and as a way of enabling them to return to materials (such as lecture notes) as often as they liked, particularly around exam times.
(Module #3 student)
The value of the printed page
The last two comments raise an important point which emerged early in the evaluation. Students prefer to have hard copies (i.e. non-electronic) of the materials presented on their VLEs, rather than to sit and ‘interact’ with them. These students’ views are typical:
(Module #2 student)
Clearly, this issue has important implications for using VLEs as a teaching medium, and for students’ perceptions of their learning. Part of the rationale for adopting distributed learning through a technological environment, as opposed to paper-based methods, is that such environments provide wider interactive opportunities and thus make learning more active. As part of the development process, tutors were encouraged to design materials that prompt students to ‘interact’ with them. If students are choosing to print out materials, this is less likely to happen. Some of the students elaborated on their reasons for printing materials out, referring to VLEs as feeling “unnatural”.
(Module #6 student)
Being part of a learning community
Students were very keen to talk about themselves as part of a learning group/community. This was one of the most negative perceptions from working with VLEs.
(Module #6 student)
Clearly, the idea of community is one that is an important part of university life to students, and appears in various forms throughout the results.
Question #5 asks whether successful electronic communities have evolved. The final major theme is closely related.
Working with other students
All VLE modules have asynchronous courserooms, where students and tutors can discuss topics. These can work either as a replacement for classroom-based discussion or as a supplement to it. Students’ experiences of working together in these ways are reflected in the following comments:
(Module #5 student)
(Module #6 student)
The interviews revealed some important issues. Almost all students felt very positively about using virtual learning environments, provided the qualities of the traditional environment could be retained. For example, they still wanted face-to-face seminars and lectures, but also to have supporting materials on their VLE. They also wanted the opportunity to feel part of a physical group, and favoured face-to-face discussion over asynchronous electronic methods.
Prior to the start of the evaluation, pilot interviews were carried out and transcribed. Quotations were selected from these pilot interviews to form the final section of the student questionnaire. Questionnaires were administered approximately halfway through the semester. Respondents were asked to rate whether they strongly agreed, agreed, were neutral, disagreed, or strongly disagreed with each statement. Table 3 provides a summary of the results.
Table 3. Student perceptions of working with VLEs
These results highlight a number of issues comparable to those discovered in the interviews. Firstly, the results support the interview data on whether students find this way of learning flexible (#1 and #6). The fourth statement was also well supported. The number of students who disagreed with this statement was slightly higher than for the first; the reason may be found in answers to other statements, such as those where students clearly felt that flexibility was restricted by physical resources.
Another issue is students’ responses to whether or not they feel part of a learning community (statement #3): 31.6% of students strongly disagreed, compared with a total of 21.1% who agreed. A third issue is how useful students feel that learning in this way will be for them later in their chosen careers. Statement #14 was clearly supported.
Finally, with respect to students’ knowledge about their own learning, many of the statements seem to reflect students’ understanding about how ‘they learn and work best’, such as statement #4, “I can make more effective use of my time when I’m learning using VLEs”, and #5, “I like the way the responsibility for my own learning is on me”. Both offer support, albeit partial, for the views expressed in the interviews. Reasons for the differences between the two sources of data will be explored in more detail later, although they may reflect individual differences in terms of learning style, cognitive style, motivational orientation, ability, etc.
To summarise, it would seem from the interview and questionnaire data that students’ perceptions of working with VLEs depend on several factors, including:
Is there evidence of active learning through VLEs?
David Hunt (1987) describes passive learning as students learning from the outside in rather than the inside out. Whilst students may be absorbing material, they are not necessarily taking part in active thinking. They may not be making judgements about the relevance or value of material, and they may not be deciding what learning material is important based on their own experiences. In other words, they are playing someone else’s game rather than constructing their own learning.
This question explores whether or not the materials that have been developed for the two VLEs have promoted active learning. The data for this question are derived from the task environment within the evaluation model, and thus a key source is the materials themselves. However, as the evaluation emphasises student perceptions as its focus, it is important to draw on the interview data as well. The students often described the types of materials they were presented with (see minor theme #1). Earlier, the issue was raised that students frequently printed off the learning materials rather than work on screen. Such comments imply that the materials provided passive support, easily transformed into the printed pages that students need, want or are used to. Many of the students referred to the materials as being passive in some way.
(Module #4 student)
(Module #1 student)
Thus the students seem to perceive their materials in the same way as traditional ‘passive’ materials, both in the sense that they do not interact with them on-line, and that the form of learning is passive. The latter point is explored below by analysing the module materials.
Analysing the materials: passive vs. active?
The task environment within the model shown earlier is broken down into five parts, all of which may help to explain students’ perceptions. Selections of materials from each of the nine modules were collected and analysed. (For a more detailed explanation of the method see Richardson, 1999.) Briefly, the method of analysis consisted of:
The final results are shown in Table 4. These figures represent occurrence of categories. For instance, a page of textual information on a particular topic would be recorded as ‘1’ in the ‘subject knowledge’ category. In addition, three other pieces of information were recorded: whether there was evidence of active encouragement of the courseroom, the contact hours for the 2-week period, and the style of writing.
Table 4. Content Analysis for VLE Materials over 2-week period
For reasons of confidentiality the titles and subjects of modules have been omitted. The results invite several comments. Firstly, there has been a diversity of approach towards designing modules. It would seem that all nine modules have taken different approaches to using VLEs and made different attempts to encourage active learning. Five modules encouraged the use of the courseroom for discussion and collaboration. This does not appear to be related to the number of contact hours that students still have. Secondly, the selection of materials chosen for analysis contained limited evidence of encouraging higher-order thinking. Five modules contained materials that may be considered to stimulate students’ thinking beyond the immediate text. Thirdly, the modules fell into three categories of content structure. Only one chose to adopt a resource-based learning approach; most reflected an attempt to mirror the courses as they were presented previously. Fourthly, a range of embedded support devices was found across the modules. However, their distribution was uneven and appears to influence students’ perceptions. For example, module #1 has been developed primarily as a support mechanism for students that runs in parallel to face-to-face lectures and seminars. Very little course content has been placed on the VLE; the majority of the materials are ‘administrative’ and help the students manage their own learning and the practical projects which form part of their assessment. A student from this module commented that “...the materials really help to tie the whole module together...the guidance notes and summaries really help, and encourages us to maximise the use of resources on our own”. By contrast, module #5 is more focussed on subject knowledge and is more text-based.
Reasons for these differing approaches to the module structures include the demands of particular subjects, tutor expertise, and the resources and support made available to the tutors. The tutor interviews elaborate these points further (see Richardson & Turner, 2000b). The data thus far suggests that students:
Is there evidence of individual differences in the way students approach and work with VLEs?
As suggested earlier, the findings from the student interviews and questionnaires support the idea that the ‘quality’ or ‘elements’ of the task environment are not the only contributory factor to positive learning experiences. There is a strong sense that students’ individual differences play a role. This section provides a series of analyses that explore the potential interactions between students’ perceptions of working with VLEs and the qualities they bring with them to a learning context.
Gender and perception of VLEs
Using the responses to the questionnaire statements on perception of VLEs (shown in Table 3), an overall mean was calculated for each student. A higher mean represents a more positive perception. A t-test was performed to determine whether male and female students felt differently toward VLEs. The results (see Table 5) show that female students responded significantly more negatively (p<0.05) toward VLEs than male students. This outcome may have several explanations (see, e.g., Valcke et al., 1993). Female students may not be as computer literate as male students and therefore less confident, or some elements of working in this environment may not be compatible with their needs. The evaluation has shown that, generally, students are not working collaboratively and have reduced face-to-face contact; female students may prefer more interactive methods of learning. These questions are addressed below.
Table 5. Gender and Perception of VLEs
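The gender comparison above can be sketched mechanically. The following is a minimal illustration using Welch’s t statistic on hypothetical mean-perception scores; the numbers and group sizes are invented for the example and are not data from the evaluation.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / math.sqrt(variance(a) / na + variance(b) / nb)

# Hypothetical mean-perception scores (1 = very negative ... 5 = very positive);
# illustrative values only, not the study's figures.
male_scores = [3.8, 3.5, 4.0, 3.2, 3.9, 3.6]
female_scores = [3.1, 2.8, 3.4, 2.9, 3.3, 3.0]

t = welch_t(male_scores, female_scores)  # positive t: males rated VLEs more positively
```

The t statistic would then be referred to the t distribution (with Welch-Satterthwaite degrees of freedom) to obtain a p-value such as the p<0.05 reported above.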
I.T. proficiency and perception of VLEs
A mean score for I.T. proficiency was calculated from the responses to the I.T. statements in the questionnaire, where students rated themselves on a 3-point Likert scale. A higher mean represents a lower self-perceived I.T. proficiency. For a general look at a possible interaction between students’ I.T. skill and perception of using VLEs, a Pearson correlation coefficient was obtained. The results (0.177, p=0.892) suggest there is no interaction between the variables. However, this may be because students rated themselves, rather than being measured objectively. Earlier, it was suggested that female students’ more negative attitude toward VLEs may be because they did not have the same level of I.T. proficiency as male students. A t-test was performed to evaluate whether female students felt their I.T. skills were less developed than those of their male counterparts. The results (Table 6) show a significant difference (p<0.001), which may contribute to female students’ more negative attitude toward VLEs.
Table 6. Gender and I.T. Proficiency
Time-management practices and perception of VLEs
This part of the evaluation uses the responses to section 5 of the questionnaire, which explored the time-management practices of students. The first stage of this analysis was to combine all the responses to the time-management questions and calculate a mean response. A higher mean represents a higher degree of time-management practice. The value of the correlation coefficient for mean time-management skills and mean perception (r=0.177, p<0.009) strongly suggests that students with more developed time-management practices have a more positive perception of using VLEs.
Figure 2. Time management practices and perceptions of VLEs
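The correlation reported above can be illustrated with a plain Pearson product-moment calculation. The paired scores below are hypothetical, invented purely to show the computation; they are not the study’s data.

```python
import math
from statistics import mean

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two paired samples."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Hypothetical paired means: time-management score vs. VLE perception score.
time_mgmt = [2.1, 3.4, 2.8, 4.0, 3.1, 3.7]
perception = [2.5, 3.2, 2.9, 3.9, 3.0, 3.6]

r = pearson_r(time_mgmt, perception)  # r > 0: better time management, more positive perception
```

A positive r of the kind reported (with its associated p-value) is what underpins the claim that time-management practice and VLE perception move together.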
Motivational Orientation and perception of VLEs
This analysis uses the results from the Motivational Orientation Scales (MOS) included in the student questionnaire. Students’ mean responses were calculated for the items relevant to each scale. The mean value for each orientation scale was paired with mean perception of VLEs and correlation coefficients obtained (Table 7).
Table 7. Motivational Orientation and Perception of VLEs
Firstly, the correlation coefficient and associated probability for task orientation and perception of VLEs support the suggestion that students who engage in and enjoy independent learning activities which keep them busy have a more positive perception of learning through VLEs. The results also support the interview data, where students who seemed to have a stronger understanding of, and motivation toward, independent learning also expressed more positive perceptions of using VLEs.
Secondly, the results for ego orientation suggest that students who differ on this scale also differ in their perceptions of working with VLEs. Ego orientation is closely related to the concept of extrinsic motivation. Research often suggests that extrinsically motivated individuals prefer to be in the company of others (Dweck & Elliott, 1983), and, as the results later describe, students have been expected to use a virtual learning environment at the expense of spending time with their peers; they have worked primarily alone. The results do not suggest an interaction between perception of VLEs and the avoidance-of-inferiority, easy-superiority, or work-avoidance scales.
Cognitive styles and VLE perception
The sample was grouped on the basis of their ratios on each cognitive style dimension. The divisions were:
One-way ANOVAs were performed between each dimension and mean VLE perception. The results showed an overall significant interaction between wholist-analytic style and VLE perception (F=4.40, p=0.01). A post-hoc Tukey test revealed significant differences (p<0.01) between the perceptions of the wholist and analytic groups, but not between the intermediate group and either the analytic or wholist groups (p>0.05). This implies that the more analytic students were, the more positive their perception of VLEs. With regard to the verbaliser-imager dimension, a one-way ANOVA revealed no significant differences between the mean responses of the three V-I groups (p>0.05).
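The one-way ANOVA above reduces to a ratio of between-group to within-group variance. The sketch below computes that F statistic directly; the three groups of perception scores are hypothetical stand-ins for the wholist, intermediate, and analytic groups, not the study’s data.

```python
from statistics import mean

def one_way_f(groups):
    """F statistic for a one-way ANOVA across k independent groups."""
    values = [v for g in groups for v in g]
    grand = mean(values)
    k, n = len(groups), len(values)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - mean(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical mean-perception scores for the three wholist-analytic groups.
wholist = [2.6, 3.0, 2.8, 3.2]
intermediate = [3.1, 3.4, 3.0, 3.3]
analytic = [3.6, 3.9, 3.5, 3.8]

f = one_way_f([wholist, intermediate, analytic])
```

A large F (referred to the F distribution with k-1 and n-k degrees of freedom) licenses the post-hoc pairwise comparisons, such as the Tukey test reported above.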
An exploration of some individual differences that may influence the quality of the learning experience revealed:
Are students using the on-line courserooms to discuss their learning?
Issues of whether students are learning together, with their tutors, or generally feeling part of a learning community have arisen throughout the results presented thus far. This section focuses upon students’ and tutors’ use of the on-line courseroom discussion rooms as a means of learning collaboratively. A conversation analysis was performed on selections of courseroom input during the semester. The analyses covered an eight-week period. The questions that guided the analyses were:
Overall, 387 contributions were recorded and analysed over the eight-week period; 216 came from students and 171 from tutors.
Table 8. Summary of Contributions to Courseroom Discussions
Table 8 suggests that the majority of contributions came from students (56%). However, further analysis showed that only 1 in 11 students contributed to discussions. Students’ mean number of contributions was 1.71 (SD=16.56), whereas tutors’ mean was 4.83 (SD=23.13). Thus there was a strikingly low level of use by the majority of course members. Even those who did contribute to the courseroom tended to post only one or two messages; even the most active group member contributed only 7 times. These figures seem to uphold views expressed by students that they quickly became disappointed when they made a contribution but received no response. The tutor contributions also varied enormously. As Figure 3 shows, one tutor made 33 contributions and a second contributed 25 times, yet the mean of 4.83 entries indicates that many tutors made little use of the courseroom.
Figure 3. Contributions from tutors
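Counts of the kind reported above can be derived mechanically from a posting log. The sketch below uses a tiny hypothetical log (author IDs prefixed “s” for students and “t” for tutors are an assumption of the example), not the actual courseroom data.

```python
from collections import Counter
from statistics import mean

# Hypothetical courseroom log: one (author_id, role) pair per posted message.
log = [
    ("s01", "student"), ("s01", "student"), ("t01", "tutor"),
    ("s02", "student"), ("t01", "tutor"), ("t02", "tutor"),
    ("s03", "student"), ("s01", "student"),
]

by_role = Counter(role for _, role in log)         # total posts per role
per_author = Counter(author for author, _ in log)  # posts per individual

# Mean posts per contributing student (analogous to the reported student mean).
student_counts = [n for author, n in per_author.items() if author.startswith("s")]
mean_student_posts = mean(student_counts)
```

The same per-author tallies also identify the most active contributors, such as the two high-posting tutors noted above.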
Secondly, tutors’ contributions tended to relate to procedural issues, whereas students’ were predominantly low-order input, where the student expresses an unsupported opinion (self-statement). Tutors and students contributed equally with expository contributions, where reference to other sources substantiates a view. Perhaps the most important finding was the lack of feedback from tutors or students: tutors gave very little positive feedback to students. The overall impression is that courserooms were being used for ‘housekeeping’, setting questions and providing instructions, with little evidence of the actual ‘discussion’ that would be expected in a face-to-face seminar. If tutors are to be seen as role models for leading and facilitating discussions, these figures offer a possible explanation for the lack of student contribution: if tutors tend to use the courserooms for ‘housekeeping’ and course management, then students are likely to follow their example.
Time of day usage
As the student interviews highlighted, an important issue has been the availability of computers and rooms. Figure 4 provides a summary of the times of day when students and tutors contributed to the courserooms. It thus provides information regarding the times of highest demand for resources.
Figure 4. Times of contributions
The majority of contributions were made between 12.00 noon and 2.00pm – presumably during lunch breaks. Although this supports the view that VLEs allow the student (and tutor) to choose when to study, the low level of activity during morning and afternoon ‘working’ time may indicate that students and tutors ‘fill in’ their lunch breaks by entering the courseroom; it may not be regarded as so important an activity that it warrants prime working time. The differing comments students made about the availability of computers could also reflect the time of day when they tried to gain access. Those who chose lunchtime, usually a busy time for open-access facilities, may well have had difficulties, whilst those who did not find access a problem may have been choosing less busy periods of the day. Although the flexibility offered by VLEs featured in responses to the questionnaire, very few contributions were made after 5.00 p.m., calling into question the traditional notion that students will work online during the evening. There was no evidence that advantage was taken of the possibility of accessing the VLE from home.
How have tutors facilitated discussions?
The use of the courseroom varied considerably across modules, and to different effect. Two researchers independently judged the overall approaches of the modules, to enhance reliability. It was agreed that the approaches could be described in three groups. Table 9 provides a simple comparison between them.
Table 9. Comparison of tutor inputs on 3 different modules
Combining face-to-face group discussion with on-line presentation/discussion: Tutors in module #9 adopted this method in particular. Throughout the eight-week period students were placed in discussion teams. Via the courseroom, the tutor gave the teams questions to read and discuss ‘outside’ of the courseroom. As part of their discussion, they were required to produce a summary of their discussion and conclusions and present it to the rest of the module participants via the courseroom. The other students then had the opportunity to ask questions and take the discussions forward. Out of all methods for facilitating discussions, this received the most positive feedback from both tutors and students. For example, one student commented,
The course tutor said:
Table 9 shows how this tutor provided the most constructive feedback on students’ contributions, as well as a significant amount of cognitive support.
Group on-line discussions: This method was very similar to the first, except that the face-to-face part of the discussions was replaced by on-line discussion. Each team of students had their own area of the courseroom in which to discuss a topic. The tutors had access to each area, to monitor discussion and provide help if necessary. Students were then required to come together and discuss their ideas further. The tutor of module #8 adopted this approach and commented that he found it:
A student from the courseroom explained:
The tutor also made an attempt to provide feedback and cognitive support.
On-line individual presentation/discussion: This method was adopted by the majority of modules and involved the tutor asking a question of some kind and students then responding. It appears to have had the greatest variance in effectiveness, depending upon the number and types of tutor inputs and the types of student responses. For example, the tutor of module #4 made only one contribution to the courseroom, at the start of the 8-week period, when students were asked to provide their project proposals for the module. As a result, individual students simply presented a summary of what they intended to do, and neither tutor nor peers offered feedback or discussion. By contrast, in module #3, two tutors posed topics for discussion to which they expected individual students to respond as thoughtfully as they could. As students did so, the tutors provided feedback and encouragement and attempted to take the discussions beyond the superficial by pointing students towards useful texts.
Of the different strategies for on-line discussion outlined above, the first appears to be the most effective. Students felt able to make quality contributions without the frustration of waiting for replies, as they would have done had the discussion been entirely on-line. However, the success of any of the three approaches increases when tutors devote time and energy to stimulating and sustaining discussion.
The evaluation model: moving forward
The model designed for the evaluation has been an effective tool for a formative evaluation. It has provided insights both into the learning processes involved in working through VLEs and into their effective evaluation. A strength of the study has been its ability to support an illuminative and holistic methodology with summative and quantitative data, thus allowing the move from a description of what has happened to an explanation. For example, the relationships and factors influencing students’ experiences that were reported in the interviews were then explored further in terms of their motivation. We are now able to say that a range of individual differences influences students’ experiences and, to a certain extent, these can now play a ‘back-seat’ role in the model. The focus can now be upon developing appropriate strategies to assist tutors in widening the ‘audience’ of their materials, and on offering methods of incorporating the model into the module development and evaluation processes.
After carrying out content analyses on materials and evaluating their effectiveness, it is now possible to add to the model more specific criteria and guidance on the types and range of ESDs, etc., which should be incorporated. Other changes include further exploration of why students preferred to work from ‘hard copies’ of their materials. Different explanations for this emerged during the interviews, e.g. an uncomfortable physical environment, or difficulties in reading from a screen. An investigation into these areas is currently being designed and incorporated into the model.
Evaluating the quality of student experiences has been a central constituent of the process of introducing distributed learning through virtual learning environments at Staffordshire University. This paper has provided an overview of an evaluation of this initiative. As emphasised throughout, the power of the model has been its attempt to place the learner at the centre of the evaluative process. Evaluating the ‘quality’ of learning experiences began from the student and worked outwards. Clearly, the holistic nature of the study has generated a substantial amount of data, which could not all be fully analysed here. However, from the data presented in this paper, several key conclusions can be drawn: