Educational Technology & Society 3(4) 2000
ISSN 1436-4522

A Large-scale ‘local’ evaluation of students’ learning experiences using virtual learning environments

Julie Ann Richardson
3rd Floor, Weston Education Centre
Guys, King’s & St. Thomas’ Hospital
Cutcombe Rd., London, SE5 9RJ
United Kingdom
julie.richardson@kcl.ac.uk
Tel: +44 207 848 5718
Fax: +44 207 848 5686

Anthony Turner
Canterbury Christ Church University College
North Holmes Rd., Canterbury, CT1 1QU
United Kingdom
a.e.turner@cant.ac.uk
Tel: +44 1227 782880

 

ABSTRACT

In 1997-8 Staffordshire University introduced two Virtual Learning Environments (VLEs), Lotus Learning Space and COSE (Creation of Study Environments), as part of its commitment to distributed learning. A wide-reaching evaluation model has been designed, aimed at appraising the quality of students’ learning experiences using these VLEs. The evaluation can be considered a hybrid system with formative, summative and illuminative elements. The backbone of the model is a number of measuring instruments that were fitted around the educational process beginning in January 1999.

This paper provides an overview of the model and its implementation. First, the model and evaluation instruments are described. Second, the method and key findings are discussed. These highlighted that students need to feel more supported in their learning, that they need more cognitive challenges to encourage higher-order thinking and that they prefer to download their materials to hard copy. In addition, tutors need to have a greater awareness of the ways individual differences influence the learning experience and of strategies to facilitate electronic discussions. Generally, there should be a balance between learning on-line and face-to-face learning depending on the experience of tutors, students, and the subject.

Finally the model is evaluated in light of the processes and findings from the study.

Keywords: Virtual learning environments, Individual differences, Quality of learning, Student-centred


Introduction

In 1997-8 Staffordshire University introduced two Virtual Learning Environments (VLEs), Lotus Learning Space and COSE (Creation of Study Environments), as part of its commitment to distributed learning. An ambitious and wide-reaching evaluation model has now been designed to appraise the quality of students’ learning experiences using these VLEs. The intention has not been to compare the environments, but to focus upon their common elements. The aims of this project are as follows:

  • To design a model for evaluating the quality of students’ learning experiences within virtual learning environments, which can be developed and eventually used as part of the module development process and quality assurance.
  • To evaluate the quality of learning experiences for students using VLEs.
  • To explore the cognitive, social, and affective dimensions of students’ experiences using VLEs.
  • To ground and centre the evaluation in phenomenographic methodology with the purpose of discovering the factors and issues that are important to students and tutors.
  • To evaluate the experiences of lecturers in their task of facilitating meaningful learning experiences using VLEs.

Using the vocabulary presented by Oliver (1998), the evaluation can be considered to be a hybrid system, with formative, summative and illuminative elements. The backbone of the model is a number of measuring instruments that were fitted into and around the educational process during one semester, beginning in January 1999.

This paper provides an overview of the process of designing and implementing the model. First, the model and the various instruments used to evaluate the different elements are described. Second, the evaluation method and key findings are discussed. Finally the model is evaluated in light of the processes and findings from the study.

 

Evaluating learning with VLEs: designing a model

The model is based on the premise that learning should be student-centred, and thus any evaluation should also be student-centred. This requires a model that works outwards towards the factors and influences which contribute to the learning experience. The quality of any learning experience is dependent upon an intricate interaction between the experiences, characteristics and attitudes that a student brings with them, and the attributes of the ‘task environment’ (Pask, 1976). Thus any evaluation must be capable of identifying individual differences that may play an important role as well as those attributes of the task setting (in this case, within Virtual Learning Environments) that may interact with them.

Figure 1 represents the interaction between the student and the task environment. The centre of the model represents the student. At the core are characteristics and factors that are most likely to be stable across different learning experiences. Moving out are the less fixed strategies, skills, knowledge etc. that continually develop and change. Boxes 1 and 2 represent the learning and reading processes that naturally interact and contribute to the experience. The remainder of the model highlights the elements ‘external’ to the student which are likely to influence the quality of their experience: the task attributes; subject and university culture; and tutor experience/subject expertise. The arrows between the different elements suggest possible interactions. For instance, the types of embedded support devices (ESDs) provided within the VLE may be more or less successful depending on a learner’s cognitive style or motivational orientation.

 

Figure 1. The interaction between the student and task environment

 

Because of limitations of space it is not possible to give a full rationale for each of the elements of the model; for a fuller discussion, the reader is referred to Richardson (1999). The emphasis in this paper is on an overview of the model and the key findings from the evaluation. Tables 1 and 2 provide a summary of the student and task environment, the key questions/aims they focus upon, and the corresponding evaluation instruments.

 

The student

| Area of model | Aim/Key Questions | Evaluation instrument |
| --- | --- | --- |
| The overall student experience of VLEs | To reveal and record the student experience: attitudes, gains, problems etc. in using VLEs over a semester | In-depth phenomenographic interviews (3 students per module); 15 questions within a single student questionnaire, developed by Richardson & Turner (1999) from in-depth pilot interviews; observations of students working on 2 separate occasions |
| Educational background/gender | How has educational background been catered for, and how has it affected the VLE experience? | Within student questionnaire |
| Motivational orientation/achievement motivation | Are different motivational orientations catered for within VLEs? Do particular students find this method of working more or less motivating? | Motivational Orientation Scales (MOS) (Nicholls, Patashnick & Nolen, 1985); 21 questions within the student questionnaire |
| Time-management skills | Do VLEs and distributed learning generally place a different set of demands on students in terms of their time-management practices? | 13 questions within the student questionnaire, developed and piloted by Richardson & Turner (1999) |
| Transferable skills | How do students perceive their level of skills in a range of areas, e.g. computer literacy, team working? | 10 questions within the student questionnaire |
| Cognitive style | How do students’ cognitive styles interact with their experience of VLEs? | 15-minute computer-presented test (Riding, 1991), administered to approx. 120 students |
| Level of I.T. competence | How does level of I.T. competence influence students’ experience of using VLEs? | 37 questions within the student questionnaire |

Table 1. Summary of the ‘student’ model and evaluation instruments

 

The Task Environment

The task environment can be described as five integrated parts, shown in Figure 1: the content model; the support model; the teaching and learning path; opportunities for discourse and communication; and opportunities for higher-order thinking. All of these contribute to the quality of the learning experience. Table 2 summarises the aims and measures associated with each part.

| Area of model | Aims/Key Questions | Evaluation instrument |
| --- | --- | --- |
| The content model | How has the module content been structured? | Content analysis of module materials during the semester; phenomenographic interviews with tutors and students |
| The support model | What set of embedded support devices (e.g. summaries, questions, tasks, guidance notes etc.) have been incorporated into the VLE? | Content analysis of module materials at the end of the semester; phenomenographic interviews with tutors and students |
| The teaching and learning path | How do students choose to use the materials? | Observation of students using ‘ScreenCam’ |
| Opportunities for discourse and communication | What strategies are in place for students to effectively discuss their learning with peers and tutors? | Conversation analysis of courseroom discussions; phenomenographic interviews with students and tutors |
| Opportunities for higher-order thinking | To what extent have principles of higher-order thinking been incorporated into materials, e.g. cognitive conflict, reflection, bridging, reasoning patterns? | Analysis of VLE materials; phenomenographic interviews with students and tutors |

Table 2. Summary of Task Environment and Evaluation Instruments

 

Research Overview and Method

As suggested in the introduction, the approach taken to the evaluation can be described as a hybrid system with formative, summative, and illuminative elements; consequently, it draws on several methodologies. It is formative in that its aim was not to provide a pre- to post-intervention comparison; rather, the intention was to tease out information regarding the features of students’ learning experiences as they progressed, in order to provide a basis for future, more effective use of VLEs as a resource for learning and teaching. This was achieved by placing phenomenographic (Fransella, 1981) interviews of students and tutors at the centre of the data collection process, and using additional instruments as a means of supporting the phenomenographic data and enabling the move from description to explanation. In this respect, the evaluation also has summative elements because it includes judgements based on measures of students’ characteristics such as cognitive style, motivational orientation, and time-management effectiveness.

 

Method

In order to gain as comprehensive a picture of VLE usage as possible, nine modules which were being presented via Lotus Learning Space or COSE were included. All students on the modules were given the opportunity to opt out of the study; 15 did so, leaving a total of 292 participants. Students were told about the evaluation in advance, during introductory lectures, and were then given the questionnaire, which was returned with a response rate of 88%. All tutors involved in developing and running modules on the VLEs were invited to take part (n=29); 5 chose to opt out. For reasons of confidentiality, comparisons between subjects, or between COSE and Lotus Learning Space, are not discussed here, but the authors intend to take this question further in future papers.

The participants were involved in different ways. Some of the instruments were designed for all students to use, e.g. the 5-page questionnaire that included questions on educational background, I.T. competency, time-management practices, motivational orientation, perception of transferable skills and attitudes toward using VLEs. This was administered approximately halfway through the semester. In addition, all students were encouraged to assess themselves using the Cognitive Styles Analysis (Riding, 1991). This test was made available to students throughout the semester. Other measures, such as the in-depth interviews and observations, were restricted to around three students per module (total volunteer student interviewees n=29, interviewed at the start and end of the module). In addition to this, all tutors who agreed to take part were interviewed at the start and end of the module.

 

Results

As the evaluation generated a substantial amount of data from a range of sources, this paper uses a series of key questions to structure the results. This section will consider whether there was evidence that:

  • Students had a positive learning experience using VLEs
  • Students were encouraged to learn ‘actively’ using VLEs
  • Individual differences influenced the way students approach and work with VLEs
  • Students used the on-line courserooms to discuss their learning

 

Can students’ learning experiences be described as positive?

This central question focuses upon the students’ perceptions of their learning experiences. It is discussed using two sources of data: phenomenographic student interviews and questionnaire responses where students were asked their opinions of learning through VLEs.

As emphasised throughout the paper, the student is at the heart of the evaluation model. To understand their experiences of working with VLEs as fully as possible, a minimum of three students per module were interviewed in depth (interviews lasted 70 minutes on average) following a phenomenographic method (see, e.g., Marton, 1981). The approach to the analysis was adapted from the typical techniques of phenomenographic inquiry, particularly that of Marton (1981). These techniques were combined with some of those used in grounded theory, to produce a systematic template and model for guiding future interview interpretations.

By the end of the interview analysis there were 11 major themes (judged as most frequent and strongest) and 6 minor themes (occurring in no fewer than half of the interviews) describing students’ experiences. (For more details of this method, see Richardson, 1999.) For the question of whether students had a positive learning experience, the results are drawn from the major themes where students expressed their general experiences. The minor themes are picked up later. Each of the supporting quotations is given a module reference number.

Major themes:

  1. Overall perception of VLEs.
  2. VLEs as supporting materials vs. replacement.
  3. The value of the printed page.
  4. Being part of a learning community.
  5. Working with other students.

 

Minor themes:

  1. Active vs. passive learning.
  2. Flexibility of access.
  3. The balance that is right for me.
  4. Physical resources that support/restrict my learning.
  5. I don’t understand why I’m using VLEs.
  6. Preparation and problems in my experience of VLEs.
  7. Knowledge of my own learning needs.

 

Overall perception of VLEs

The overall perceptions of using virtual learning environments as a means of studying were diverse. For example, some students felt,

this is an ideal approach. I like the control it gives me over my learning outcomes... It’s flexible, user friendly, and it gives me the support I need when lecturers are so obviously busy. Don’t get me wrong - I would not advocate learning solely in this way - but running in parallel it works very well.    

(module #3 student)

However, some of the overall impressions were less positive.

I didn’t come to university to sit in front of a computer all day. I came so that I could debate, discuss, have lectures and stuff like that. I know that I won’t get the grades I would have otherwise...

(Module #5 student)

This theme was one of the most diverse.

 

VLEs as supporting materials rather than as replacement

This view appeared throughout the interviews. Most students had a positive perception of the materials being provided through VLEs; however, they clearly expressed a preference for these being ‘supporting’ materials rather than a ‘replacement’. More specifically, they found the materials useful for directing their self-study, for distributing resources and as a way of enabling them to return to materials (such as lecture notes) as often as they liked, particularly around exam times.

The best thing for me is that lotus directs my study to the key areas...know what I mean? So you’re not reading the wrong book or researching the wrong things. I wouldn’t like it if we didn’t have any contact with tutors and other students, but I think of it as a back-up. It saves me lots of time trying to get round the library . . .   

 (Module #3 student)

 

The value of the printed page

The last two comments raise an important point which emerged early in the evaluation. Students prefer to have hard copies (i.e. non-electronic versions) of the materials presented on their VLEs, rather than to sit and ‘interact’ with them. These students’ views are typical:

Once I’ve accessed it, I always just print it all off. It’s far too difficult to read from a screen and apart from anything it’s always so noisy with people around you wanting to get on the computer, so I just print them off and then take them home.

(Module #2 student)

Clearly, this issue has important implications for using VLEs as a teaching medium, and for students’ perceptions of their learning. Part of the rationale for adopting distributed learning through a technological environment, as opposed to paper-based methods, is that it provides wider interactive opportunities and thus makes learning more active. As part of the development process, tutors were encouraged to design their materials so that students would ‘interact’ with them. If students are choosing to print out materials, this is less likely to happen. Some of the students elaborated on their reasons for printing materials out, referring to VLEs as feeling “unnatural”.

Computers give me migraines . . . It’s not just the computers, it’s the room lighting, and you can’t sort of go off and come back because someone else will be in your seat . . It’s a real disadvantage because it costs me so much money to print it off...because [the tutor] puts new stuff on it all the time which means I’ve got more to print off!

(Module #6 student)

 

Being part of a learning community

Students were very keen to talk about themselves as part of a learning group/community. This was one of the areas in which perceptions of working with VLEs were most negative.

I hate the way I have to tap people on the shoulder in the courtyard restaurant and say are you a [...] student? It shouldn’t be like that... I feel so isolated.

(Module #6 student)

Clearly, the idea of community is one that is an important part of university life to students, and appears in various forms throughout the results.

The question of whether successful electronic communities have evolved is taken up later in the results. The final major theme is closely related.

 

Working with other students

All VLE modules have asynchronous courserooms, where students and tutors can discuss topics. These can work either as a replacement for classroom-based discussion or as a support. Students’ experiences of working together in these ways are reflected in the following comments:

Err, the discussion group! I think there was something at the beginning of this year about that. Where people give their views on a topic and air it and other people can view it and reply if they want . . .  But no one ever uses it.

(Module #5 student)

Why can’t we have proper conversations? I don’t like the way we have to wait for people to log on and reply. If you have something important on your mind, while you’re writing an essay, there’s no point in using it because you don’t know when anyone will reply to you.

(Module #6 student)

 

Summary

The interviews revealed some important issues. Almost all students felt very positively about using virtual learning environments, provided that the qualities of the traditional environment were retained. For example, they still wanted face-to-face seminars and lectures, but also wanted supporting materials on their VLE. They also wanted the opportunity to feel part of a physical group, and favoured face-to-face discussion rather than asynchronous electronic methods.

Prior to the start of the evaluation, pilot interviews were carried out and transcribed. Quotations were selected from these pilot interviews to form the final section of the student questionnaire. Questionnaires were administered approximately halfway through the semester.  Respondents were asked to rate whether they strongly agreed, agreed, were neutral, disagreed, or strongly disagreed with each statement. Table 3 provides a summary of the results.
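By way of illustration, the tallying of such Likert ratings into the frequency counts shown in Table 3 below can be sketched in a few lines of Python; the ratings here are invented for the example, and only the standard library is assumed.

```python
from collections import Counter

# Invented ratings for one questionnaire statement, on the five-point
# scale used in the study: SA, A, N, D, SD.
ratings = ["SA", "A", "A", "N", "D", "A", "SD", "N", "A", "D"]

counts = Counter(ratings)
order = ["SA", "A", "N", "D", "SD"]

# One row of a Table 3-style frequency table: counts per category plus total.
row = [counts.get(category, 0) for category in order]
print(row, "total:", sum(row))
```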

| No. | Statement | Strongly agree | Agree | Neutral | Disagree | Strongly disagree | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Working with a virtual learning environment means that I can work when and where I want | 26 | 104 | 44 | 35 | 24 | 233 |
| 2 | I find learning in this way useful because I can go over the material as many times as I want...if I go to a lecture, I either get it or I don’t! | 26 | 119 | 56 | 21 | 7 | 229 |
| 3 | Even though I may not have as much class contact I still feel as though I am part of a learning community | 11 | 39 | 50 | 59 | 75 | 234 |
| 4 | I can make more effective use of my time when I am learning using this kind of computer-aided learning...it’s a bit like a flexi-time system | 15 | 72 | 79 | 50 | 13 | 229 |
| 5 | I like the way the responsibility for my learning is on me...I am more in control. I don’t have to rely on tutors | 37 | 72 | 54 | 60 | 7 | 230 |
| 6 | I enjoy the flexibility of it because it allows me to go at my own pace | 47 | 98 | 51 | 28 | 6 | 230 |
| 7 | I feel isolated when I am using a VLE....I don’t seem to talk to other students much anymore | 42 | 64 | 59 | 53 | 13 | 231 |
| 8 | I am not very confident with my own ideas and am afraid that I am getting lost in my learning | 36 | 53 | 69 | 58 | 14 | 230 |
| 9 | I would prefer to have more contact and guidance from tutors | 70 | 84 | 51 | 20 | 6 | 231 |
| 10 | I am not ‘computer-minded’ and don’t enjoy sitting in front of a screen for long periods of time...it’s very boring to read from a screen | 39 | 51 | 53 | 54 | 31 | 228 |
| 11 | I can’t afford a computer of my own and the amount of time that I have to spend in college in the computer labs has increased - it is time that I can’t afford | 36 | 40 | 66 | 61 | 27 | 230 |
| 12 | I seem to spend lots of time just trying to find a computer that I can use...then feel pressured because others are waiting to use it | 34 | 56 | 64 | 59 | 14 | 227 |
| 13 | I don’t think there are enough computers around the university | 50 | 69 | 60 | 37 | 11 | 227 |
| 14 | I think that employers will like the fact we are learning with VLEs | 55 | 112 | 47 | 12 | 3 | 229 |
| 15 | I think this kind of computer-aided learning will be used more and more in the future | 66 | 96 | 52 | 9 | 4 | 227 |

Table 3. Student perceptions of working with VLEs

 

These results highlight a number of issues comparable to those discovered in the interviews. Firstly, the results support the interview data regarding whether students find this way of learning flexible (#1 and #6). The fourth statement was also well supported. The number of students who disagreed with it was slightly higher than for the first statement; the reason may lie in answers to other statements, where students clearly felt that flexibility was restricted by physical resources.

Another issue is students’ responses to whether or not they feel part of a learning community (statement #3): 31.6% of students strongly disagreed, compared to a total of 21.1% of students who agreed. A third issue is how useful students feel that learning in this way will be for them later on in their chosen careers. Statement #14 was clearly supported.

Finally, with respect to students’ knowledge about their own learning, many of the statements seem to reflect students’ understanding of how ‘they learn and work best’, such as statement #4, “I can make more effective use of my time when I’m learning using VLEs”, and #5, “I like the way the responsibility for my own learning is on me”. Both offer support, albeit partial, for the views expressed in the interviews. Reasons for the differences between the two sources of data will be explored in more detail later, although they may reflect individual differences in terms of learning style, cognitive style, motivational orientation, ability etc.

To summarise, it would seem from the interview and questionnaire data that students’ perceptions of working with VLEs depend on several factors, including:

  • Whether the VLE is used to support rather than replace traditional methods.
  • The number of opportunities available for students to discuss ideas and work collaboratively.
  • Whether the physical environment has evolved with the changing needs of students working with VLEs, e.g. computer access.
  • Whether learning in this way is actively able to cater for individual learning needs.

 

Is there evidence of active learning through VLEs?

David Hunt (1987) describes passive learning as occurring when students are learning from the outside in and not from the inside out. Whilst students may be absorbing material, they may not necessarily be thinking actively. They may not be making judgements about the relevance of material, or making value judgements, and they may not be deciding what learning material is important based on their own experiences. In other words, they are playing someone else’s game and not constructing their own learning.

This question explores whether or not the materials developed for the two VLEs have promoted active learning. The data for this are derived from the task environment within the evaluation model, and thus a key source is the materials themselves. However, as the evaluation emphasises student perceptions, it is important to draw on the interview data as well. The students often described the types of materials they were presented with (see minor theme #1). Earlier, the issue was raised that students frequently printed off the learning materials rather than working on screen. Such comments imply that the materials provided passive support, which is easily transformed into the printed pages students need, want or are used to. Many of the students referred to the materials as being passive in some way.

It’s mainly text really. Which is fine cos that’s what I need. I would like a few more pictures though. I know they can put them on but I think there’s been [technical] problems.

(Module #4 student)

It’s generally the learning outcomes, which is useful. He does try to make us break down what we have to do for the essay in the course room, which can work well.

(Module #1 student)

Thus the students seem to perceive their materials in the same way as traditional ‘passive’ materials, both in the sense that they do not interact with them on-line, and that the form of learning is passive. The latter point is explored below by analysing the module materials.

 

Analysing the materials: passive vs. active?

The task environment within the model shown earlier is broken down into 5 parts, all of which may help to explain students’ perceptions. Selections of materials from each of 9 modules were collected and analysed. (For a more detailed explanation of the method see Richardson, 1999.) Briefly, the method of analysis consisted of:

  1. Materials selected (two weeks’ worth of material per module, from approximately halfway through the semester).
  2. Familiarisation with the material.
  3. Initial material coded on a page-by-page basis to discover and establish categories (2 coders worked independently).
  4. Remaining material coded and frequencies recorded.
  5. Structure of the content plotted and categorised according to a particular model (see Figure 2).

The final results are shown in Table 4. These figures represent occurrences of categories. For instance, a page of textual information on a particular topic would be recorded as ‘1’ in the ‘subject knowledge’ category. In addition, three other pieces of information were recorded: whether there was evidence of active encouragement of the courseroom, the contact hours for the 2-week period, and the style of writing.
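The tallying step can be sketched as follows; this is a minimal Python illustration in which the module numbers and category labels are invented, not the study’s actual coding data.

```python
from collections import defaultdict

# Each tuple records one coded occurrence: (module number, category).
# For example, a page of text on a topic yields "subject knowledge".
coded_pages = [
    (1, "subject knowledge"),
    (1, "procedural information/support"),
    (2, "subject knowledge"),
    (2, "supporting resource"),
]

# freq[category][module] -> occurrence count, as reported in Table 4.
freq = defaultdict(lambda: defaultdict(int))
for module, category in coded_pages:
    freq[category][module] += 1

for category, per_module in freq.items():
    print(category, dict(per_module))
```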

| Frequency over a two-week period (VLE modules) | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Subject knowledge | 5 | 15 | 6 | 19 | 26 | 16 | 3 | 8 | 3 |
| Procedural information/support | 23 | 5 | 5 | 5 | 10 | 7 | 16 | 5 | 19 |
| Example to support understanding/case study | 0 | 0 | 5 | 3 | 0 | 2 | 0 | 4 | 1 |
| Question to test mastery | 1 | 3 | 2 | 0 | 0 | 0 | 2 | 3 | 0 |
| Advance organiser/overview | 3 | 1 | 5 | 0 | 1 | 5 | 8 | 5 | 5 |
| Advice about how to study the material | 2 | 1 | 3 | 0 | 2 | 1 | 2 | 4 | 6 |
| Non-collaborative task to develop understanding/apply knowledge | 0 | 5 | 1 | 1 | 1 | 2 | 0 | 2 | 1 |
| Collaborative task to develop understanding/apply knowledge | 5 | 0 | 3 | 0 | 0 | 1 | 3 | 0 | 3 |
| Question encouraging higher-order thinking | 4 | 0 | 2 | 0 | 0 | 1 | 3 | 0 | 1 |
| Tutor feedback | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 1 |
| Assessment details | 8 | 3 | 1 | 2 | 3 | 4 | 6 | 7 | 6 |
| Supporting resource, e.g. reading lists | 0 | 15 | 2 | 5 | 5 | 6 | 5 | 8 | 8 |
| Summary of ideas | 0 | 0 | 5 | 0 | 1 | 3 | 2 | 3 | 0 |
| Aims/objectives | 4 | 1 | 2 | 3 | 3 | 3 | 3 | 3 | 4 |
| Frequently asked questions | 3 | 0 | 3 | 0 | 1 | 1 | 5 | 0 | 2 |
| Orientation around materials | 0 | 4 | 1 | 1 | 2 | 2 | 0 | 3 | 11 |
| Guidance for self-study | 11 | 4 | 0 | 0 | 0 | 3 | 5 | 4 | 0 |
| External web links | 2 | 9 | 4 | 1 | 2 | 1 | 11 | 1 | 0 |
| Other media links | 0 | 1 | 1 | 0 | 0 | 0 | 3 | 0 | 0 |
| Suggestions for getting help | 1 | 2 | 1 | 3 | 0 | 0 | 2 | 0 | 1 |
| Images/illustrations | 0 | 5 | 0 | 0 | 0 | 0 | 9 | 1 | 2 |
| Active encouragement of courseroom to discuss related topics | Y | Y | Y | N | N | Y | N | Y | N |
| User-friendly, “you and I” style of writing? | Y | Y | Y | Y | N | Y | Y | Y | Y |
| Structure of content (1 = linear; 2 = linear with minimal branches; 3 = resource-based/branching) | 1 | 1 | 2 | 1 | 1 | 2 | 3 | 2 | 2 |
| Remaining contact hours | 4 | ½ | ½ | ½ | 2 | 1 | 4 |  |  |

Table 4. Content Analysis for VLE Materials over 2-week period

 

For reasons of confidentiality the titles and subjects of modules have been omitted. The results invite several comments. Firstly, there has been a diversity of approach towards designing modules. It would seem that all nine modules have taken different approaches to using VLEs and made different attempts to encourage active learning. Five modules encouraged the use of the courseroom for discussion and collaboration. This does not appear to be related to the number of contact hours that students still have. Secondly, the selection of materials chosen for analysis contained limited evidence of encouraging higher-order thinking. Five modules contained materials that may be considered to stimulate students’ thinking beyond the immediate text. Thirdly, the modules fell into 3 categories of content structure. Only one chose to adopt a resource-based learning approach - most reflected an attempt to mirror the courses as they were presented previously. Fourthly, a range of embedded support devices was found across the modules. However, their distribution was uneven and appears to influence students’ perceptions. For example, module #1 has been developed primarily as a support mechanism for students that runs in parallel to face-to-face lectures and seminars. Very little course content has been placed on the VLE; the majority of the materials are ‘administrative’ and help the students manage their own learning and the practical projects which form part of their assessment. A student from this module commented that “...the materials really help to tie the whole module together...the guidance notes and summaries really help, and encourages us to maximise the use of resources on our own”. By contrast, module #5 is more focussed on subject knowledge and is more text-based.

Reasons for these differing approaches to module structure include the demands of particular subjects, tutor expertise, and the resources and support made available to the tutors. The tutor interviews elaborate these points further (see Richardson & Turner, 2000a). The data thus far suggest that students:

  • were more ‘actively’ involved in learning on modules which incorporated a range of ESDs.
  • felt more positively toward their VLEs where there was adequate or extensive support, either in terms of additional contact or in a higher concentration of guidance and support within the learning materials - in other words, a ‘tutor replacement’ rather than a ‘materials replacement’.
  • didn’t feel they had sufficient ‘challenge’ from the materials alone.

 

Is there evidence of individual differences in the way students approach and work with VLEs?

As suggested earlier, the findings from the student interviews and questionnaires support the idea that the ‘quality’ or ‘elements’ of the task environment are not the only contributory factor to positive learning experiences. There is a strong sense that students’ individual differences play a role. This section provides a series of analyses that explore the potential interactions between students’ perceptions of working with VLEs and the qualities they bring with them to a learning context.

 

Gender and perception of VLEs

Using the responses to the questionnaire statements on perception of VLEs (shown in Table 3), an overall mean perception score was calculated for each student; a higher mean represents a more positive perception. A t-test was performed to determine whether male and female students felt differently toward VLEs. The results (see Table 5) show that females responded significantly more negatively (p<0.05) toward VLEs than males. There may be several reasons for this outcome (see, e.g., Valcke et al., 1993). Female students may not be as computer literate as male students and therefore less confident, or some elements of working in this environment may not be compatible with the needs of female students. The evaluation has shown that, generally, students are not working collaboratively and have reduced face-to-face contact. Female students may prefer more interactive methods of learning. These questions are addressed below.

 

|  | Gender | No. | Mean | S.D. | F | Sig. (p<) |
| --- | --- | --- | --- | --- | --- | --- |
| Perception | Female | 110 | 39.049 | 6.26 | 0.94 | 0.49 |
|  | Male | 116 | 40.71 | 5.97 |  |  |

Table 5. Gender and Perception of VLEs
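For readers wishing to reproduce this kind of comparison, the following is a minimal Python sketch assuming NumPy and SciPy are available; the scores are randomly generated stand-ins, not the study’s data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Invented mean-perception scores (higher = more positive), one per student,
# drawn to resemble the group sizes, means and spreads reported in Table 5.
female = rng.normal(39.0, 6.3, 110)
male = rng.normal(40.7, 6.0, 116)

# Independent-samples t-test comparing the two groups.
t, p = stats.ttest_ind(female, male)
print(f"t = {t:.2f}, p = {p:.3f}")
```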

 

I.T. proficiency and perception of VLEs

A mean score was calculated for I.T. awareness from the responses to the I.T. statements in the questionnaire, in which students rated themselves on a 3-point Likert scale against statements of I.T. proficiency. A higher mean represents a perception of lower I.T. proficiency. For a general look at a possible interaction between students’ I.T. skill and perception of using VLEs, a Pearson correlation coefficient was obtained. The results (0.177, p=0.892) suggest there is no interaction between the variables. However, it is possible that this is because students rated themselves, rather than the measure being an objective one. Earlier, it was suggested that female students had a more negative attitude toward VLEs, which may be because female students did not have the same level of I.T. proficiency as male students. A t-test was performed to evaluate whether female students felt their I.T. skills were less developed than those of their male counterparts. The results (Table 6) show a significant difference (p<0.001), which may contribute to female students having a more negative attitude toward VLEs.

 

|  | Gender | No. | Mean | S.D. | F | Sig. (p<) |
| --- | --- | --- | --- | --- | --- | --- |
| I.T. proficiency | Female | 110 | 1.51 | 0.29 | 0.43 | 0.000 |
|  | Male | 116 | 1.34 | 0.26 |  |  |

Table 6. Gender and I.T. Proficiency

 

Time-management practices and perception of VLEs

This part of the evaluation uses the responses to section 5 of the questionnaire, which explored the time-management practices of students. The first stage of this analysis was to combine all the responses to the time-management questions and calculate a mean response; a higher mean represents more developed time-management practices. The correlation coefficient for mean time-management skills and mean perception (r=0.177, p<0.009) strongly suggests that students who use more developed time-management practices have a more positive perception of using VLEs.

Figure 2. Time management practices and perceptions of VLEs
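A minimal sketch of this correlation step, assuming SciPy’s pearsonr; the per-student means below are synthetic stand-ins for the questionnaire data.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

# Synthetic per-student means: time-management score and VLE perception,
# generated here with a mild positive association for illustration.
time_mgmt = rng.normal(3.0, 0.5, 230)
perception = 2.0 * time_mgmt + rng.normal(0.0, 5.0, 230)

r, p = pearsonr(time_mgmt, perception)
print(f"r = {r:.3f}, p = {p:.3f}")
```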

 

Motivational Orientation and perception of VLEs

This analysis uses the results from the Motivational Orientation Scales (MOS) used in the student questionnaire. Students’ mean responses were calculated for the items relevant to each scale. The mean value for each orientation scale was then paired with mean perception of VLEs and correlation coefficients obtained (Table 7).

| Motivational Orientation Scales (MOS) | Task Orientation | Ego Orientation | Avoid Inferiority | Easy Superiority | Work Avoidance |
| --- | --- | --- | --- | --- | --- |
| Aims of scale | To measure an orientation of involving oneself in cognitive activities & keeping busy | To measure an orientation to showing ‘smartness’ | To measure an orientation of sensitivity to others’ negative evaluations of one’s ability | To measure an orientation of doing well without effort | To measure an orientation of avoiding hard work |
| Correlation with perception of VLEs | Pearson 0.20**, p<0.003, N=219 | Pearson 0.23*, p<0.001, N=222 | Pearson 0.05, p<0.45, N=222 | Pearson 0.08, p<0.20, N=221 | Pearson -0.02, p<0.73, N=221 |

Table 7. Motivational Orientation and Perception of VLEs
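The subscale scoring and per-scale correlations can be sketched as below, assuming pandas and SciPy; the item responses and the column-to-scale mapping are invented for illustration.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 220

# Invented item responses; in the study these would be the 21 MOS items.
# The column-to-scale mapping below is illustrative only.
df = pd.DataFrame(rng.integers(1, 6, size=(n, 4)),
                  columns=["q1", "q2", "q3", "q4"]).astype(float)
df["perception"] = rng.normal(40.0, 6.0, n)

scales = {"task orientation": ["q1", "q2"], "ego orientation": ["q3", "q4"]}

for name, items in scales.items():
    score = df[items].mean(axis=1)  # mean response per scale, per student
    r, p = pearsonr(score, df["perception"])
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```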

 

Firstly, the correlation coefficient and associated probability for task orientation and perception of VLEs support the suggestion that students who are inclined to engage in, and enjoy, independent learning activities which keep them busy have a more positive perception of learning through VLEs. The results also support the interview data, where students who seemed to have a stronger understanding of, and motivation toward, independent learning also expressed more positive perceptions of using VLEs.

Secondly, the results for ego orientation suggest that this orientation is also associated with differing perceptions of working with VLEs. Ego orientation is closely related to the concept of extrinsic motivation. Research often suggests that extrinsically motivated individuals prefer to be in the company of others (Dweck & Elliott, 1983), and, as the results later describe, students have been expected to use a virtual learning environment at the expense of spending time with their peers: they have worked primarily alone. The results do not suggest an interaction between perception and the avoid-inferiority, easy-superiority or work-avoidance scales.

 

Cognitive styles and VLE perception

The sample was grouped on the basis of students’ ratios on each cognitive style dimension. The divisions were:

  • Wholist-Analytic dimension: Wholists (0.23 to 0.89; N=7), Intermediates (0.90 to 1.12; N=45), Analytics (1.13 to 3.52; N=20);
  • Verbal-Imagery dimension: Verbalisers (0.48 to 1.02; N=50), Bimodals (1.03 to 1.14; N=59), Imagers (1.15 to 8.21; N=31).

One-way ANOVAs were performed between each dimension and mean VLE perception. The results showed an overall significant interaction between wholist-analytic style and VLE perception (F=4.40, p=0.01). A post-hoc Tukey test revealed significant differences (p<0.01) between the perceptions of the wholist group and the analytic group, but not between the intermediate group and either the analytic or the wholist group (p>0.05). This implies that the more analytic students were, the more positive their perception of VLEs. With regard to the Verbaliser-Imager dimension, a one-way ANOVA revealed no significant differences between the mean responses of the three V-I groups (p>0.05).
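A minimal sketch of this grouping-and-ANOVA step, assuming SciPy (version 1.8 or later for tukey_hsd); the perception scores are synthetic, drawn using the group sizes reported above.

```python
import numpy as np
from scipy.stats import f_oneway, tukey_hsd

rng = np.random.default_rng(3)

# Invented perception scores for the three wholist-analytic groups,
# using the group sizes reported above (7, 45 and 20).
wholist = rng.normal(36.0, 6.0, 7)
intermediate = rng.normal(39.0, 6.0, 45)
analytic = rng.normal(42.0, 6.0, 20)

# One-way ANOVA across the three style groups.
f, p = f_oneway(wholist, intermediate, analytic)
print(f"ANOVA: F = {f:.2f}, p = {p:.3f}")

# Pairwise post-hoc comparisons (tukey_hsd requires SciPy >= 1.8).
print(tukey_hsd(wholist, intermediate, analytic))
```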

 

Summary

An exploration of some individual differences that may influence the quality of the learning experience revealed:

  • Female students have a more negative attitude towards VLEs as they are currently being used.
  • Students with more developed time-management practices have a more positive perception of learning using VLEs.
  • I.T. proficiency as judged by learners themselves does not appear to be related to perception.
  • Female students view themselves as having less well developed I.T. skills than their male peers.
  • Students with a task-orientation have a more positive perception of learning using VLEs.
  • Students with an ego-orientation have a more negative perception of VLEs.
  • Students with a wholist cognitive style have a more negative perception of VLEs than those with an analytic style.

 

Are students using the on-line courserooms to discuss their learning?

Issues of whether students are learning together, with their tutors, or generally feeling part of a learning community have arisen throughout the results presented thus far. This section focuses upon students’ and tutors’ use of the on-line courserooms as a means of learning collaboratively. A conversation analysis was performed on selections of courseroom input during the semester, covering an eight-week period. The questions that guided the analyses were:

  • How many/what types of contributions have been made overall to the on-line discussions?
  • How many of these contributions have come from students, and how many have come from tutors?
  • How many contributions have ‘individual’ students made?
  • How do the modules compare in their use of the courserooms?
  • What times of the day have contributions been made?

Overall, 387 contributions were recorded and analysed over the eight-week period. Of these, 216 came from students and 171 from tutors.
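How such contribution counts can be broken down by source and category is sketched below; the message records are invented for illustration, and only the Python standard library is assumed.

```python
from collections import Counter

# Each record is one courseroom message: (source, category).
messages = [
    ("student", "self-statement"),
    ("tutor", "procedural"),
    ("student", "question"),
    ("tutor", "question"),
    ("student", "expository"),
]

by_source = Counter(source for source, _ in messages)
by_cell = Counter(messages)

# Counts and within-category shares, as reported in Table 8.
for (source, category), n in sorted(by_cell.items()):
    total = by_cell[("student", category)] + by_cell[("tutor", category)]
    print(f"{category:15s} {source:8s} {n:3d} ({n / total:.0%})")
print("totals:", dict(by_source))
```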

 

|  | Students’ contributions | Tutors’ contributions | Total |
| --- | --- | --- | --- |
| Expository | 34 (100%) | 0 (0%) | 34 |
| Self-statement | 40 (75%) | 13 (25%) | 53 |
| Question | 12 (39%) | 19 (61%) | 31 |
| Procedural | 21 (30%) | 50 (70%) | 71 |
| General feedback | 14 (42%) | 19 (58%) | 33 |
| Positive feedback | 14 (61%) | 9 (39%) | 23 |
| Negative feedback | 4 (33%) | 8 (67%) | 12 |
| Assumption | 5 (25%) | 15 (75%) | 20 |
| Cognitive support | 13 (42%) | 18 (58%) | 31 |
| Reflective statement | 41 (100%) | 0 (0%) | 41 |
| Reconstruction | 18 (47%) | 20 (53%) | 38 |

Table 8. Summary of Contributions to Courseroom Discussions

 

Table 8 suggests that the majority of contributions came from students (56%). However, further analysis showed that only 1 in 11 students contributed to discussions. Students’ mean number of contributions was 1.71 (SD=16.56), whereas tutors’ mean number of contributions was 4.83 (SD=23.13). Thus there was a strikingly low level of use by the majority of course members. Even those who did contribute to the courseroom tended to post only one or perhaps two messages; the most active group member contributed only 7 times. These figures seem to uphold views expressed by students that they quickly became disappointed when they made a contribution but did not receive any response. The tutor contributions also varied enormously. As Figure 3 shows, one tutor made 33 contributions and a second tutor contributed 25 times, yet the mean of 4.83 entries indicates that many tutors made little use of the courserooms.

Figure 3. Contributions from tutors

 

Secondly, tutors’ contributions tended to relate to procedural issues, whereas students’ contributions were predominantly low-order input, in which the student expresses an unsupported opinion (self-statement). Tutors and students contributed equally with expository contributions, where reference to other sources substantiates a view. Perhaps the most important finding was the lack of feedback from tutors or students; tutors gave very little positive feedback to students. The overall impression is that the courserooms were being used for ‘housekeeping’, setting questions and providing instructions, with little evidence of actual ‘discussion’ as would be expected in a face-to-face seminar. If tutors are to be seen as role models for leading and facilitating discussions, these figures offer a possible explanation for the lack of contribution: if tutors tend to use the courserooms for ‘housekeeping’ and as a method of course management, then it is likely that students will follow their example.

 

Time of day usage

As the student interviews highlighted, an important issue has been the availability of computers and rooms. Figure 4 provides a summary of the times of day when students and tutors contributed to the courserooms. It thus provides information regarding the times of highest demand for resources.

Figure 4. Times of contributions

 

The majority of contributions were made between 12.00 noon and 2.00 p.m. – presumably during lunch breaks. Although this supports the view that VLEs allow the student (and tutor) to choose when to study, the low level of activity during morning and afternoon ‘working’ time may indicate that students and tutors ‘fill in’ their lunch breaks by entering the courseroom. Furthermore, it may not be regarded as so important an activity that it warrants prime working time. The differing comments about availability of computers could also reflect the time of day when students tried to gain access: those who chose lunchtime, usually a busy time for open-access facilities, may well have had difficulties, whilst those who did not find access a problem may have been choosing less busy periods of the day. Although the flexibility offered by VLEs featured in responses to the questionnaire, very few contributions were made after 5.00 p.m., calling into question the traditional notion that students will work on-line during the evening. There was no evidence that advantage had been taken of the possibility of accessing the VLE from home.
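The time-of-day binning behind Figure 4 can be sketched as follows, using invented timestamps and only the Python standard library.

```python
from collections import Counter
from datetime import datetime

# Invented message timestamps; real data would come from courseroom logs.
stamps = ["1999-03-01 12:15", "1999-03-01 13:05",
          "1999-03-02 09:40", "1999-03-02 13:30"]

hours = [datetime.strptime(s, "%Y-%m-%d %H:%M").hour for s in stamps]

# Two-hour bins across the day, echoing the summary in Figure 4.
bins = Counter(hour - hour % 2 for hour in hours)
for start in sorted(bins):
    print(f"{start:02d}:00-{start + 2:02d}:00  {bins[start]} contributions")
```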

 

How have tutors facilitated discussions?

The use of the courseroom across modules varied considerably and to different effect. Two researchers independently judged the overall approaches of the modules, to enhance reliability. It was agreed that the approaches could be described in three groups. Table 9 provides a simple comparison between the approaches.

 

|  | Approach 1 (module #9) | Approach 2 (module #8) | Approach 3 (module #4) |
| --- | --- | --- | --- |
| Expository | 0 | 0 | 34 |
| Self-statement | 0 | 9 | 4 |
| Question | 10 | 7 | 2 |
| Procedural | 10 | 10 | 40 |
| General feedback | 6 | 13 | 0 |
| Positive feedback | 8 | 1 | 0 |
| Negative feedback | 0 | 8 | 0 |
| Assumption | 0 | 7 | 8 |
| Cognitive support | 16 | 2 | 0 |
| Reflective statement | 0 | 0 | 0 |
| Reconstruction | 18 | 2 | 0 |

Table 9. Comparison of tutor inputs on 3 different modules

 

Combining face-to-face group discussion with on-line presentation/discussion: Tutors in module #9 adopted this method in particular. Throughout the eight-week period students were placed in discussion teams. Via the courseroom, the tutor gave the teams questions to read and discuss ‘outside’ of the courseroom. As part of their discussion, they were required to produce a summary of their discussion and conclusions and present it to the rest of the module participants via the courseroom. The other students then had the opportunity to ask questions and take the discussions forward.  Out of all methods for facilitating discussions, this received the most positive feedback from both tutors and students. For example, one student commented,

We’re getting the best of both worlds… small groups for us to discuss ideas and then we feel more confident about presenting them to the other students.

The course tutor said:

My aim was to replicate what should be happening in seminars... and they know where I am if they need me.

Table 9 shows how this tutor provided the most constructive feedback on students’ contributions, as well as a significant amount of cognitive support.

Group on-line discussions: This method was very similar to the first, except that the face-to-face part of the discussions was replaced by on-line discussion. Each team of students had their own area of the courseroom in which to discuss a topic. The tutors had access to each area, to monitor discussion and provide help if necessary. Students were then required to come together and discuss their ideas further. The tutor of module #8 adopted this approach and commented that he found it:

...very time-consuming.... it’s difficult because you have to keep on at them to contribute to their team’s discussions.

A student from the courseroom explained:

It’s hard because you’ve always got to wait for the responses... and you don’t know when they are going to come... it’s better with a cup of coffee in the coffee bar.

The tutor also made an attempt to provide feedback and cognitive support.

On-line individual presentation/discussion: This method was adopted by the majority of modules and involved the tutor asking a question of some kind to which students then responded. This method appears to have had the greatest variance in effectiveness, depending upon the number and types of tutor inputs and the types of student responses. For example, the tutor of module #4 made only one contribution to the courseroom, at the start of the 8-week period, when students were asked to provide their project proposals for the module. This resulted in individual students merely presenting a summary of what they intended to do, with neither tutor nor peers offering feedback or discussion. By contrast, in module #3, two tutors posed topics for discussion to which they expected individual students to respond as thoughtfully as they could. As students did so, the tutors provided feedback and encouragement and attempted to take the discussions beyond the superficial by pointing them towards useful texts.

 

Summary

Of the different strategies for on-line discussion outlined above, the first approach appears to be the most effective. Students felt able to make quality contributions without the frustration of waiting for replies, as they would have had to had the discussion been entirely on-line. However, the success of any of the three approaches increases when tutors devote time and energy to stimulating discussions and keeping them going.

 

The evaluation model: moving forward

The model designed for the evaluation has proved an effective tool for formative evaluation. It has provided insights both into the learning processes involved in working through VLEs and into their effective evaluation. A strength of the study has been its ability to support an illuminative and holistic methodology with summative and quantitative data, thus allowing the move from a description of what has happened to an explanation. For example, the relationships and factors that influence students’ experiences reported in the interviews were then explored further in terms of their motivation. We are now able to say that a range of individual differences influences students’ experiences, and, to a certain extent, these can now play a ‘back-seat’ role in the model. The focus can now be upon developing appropriate strategies to assist tutors in widening the ‘audience’ of their materials, and on offering methods of incorporating the model into the module development and evaluation processes.

After carrying out content analyses on materials and evaluating their effectiveness, it is now possible to add to the model more specific criteria and guidance on the types and range of ESDs which should be incorporated. Other changes include further exploration of why students preferred to work from ‘hard copies’ of their materials. Different explanations for this emerged during the interviews, e.g. an uncomfortable physical environment and difficulties in reading from a screen. An investigation into these areas is currently being designed and incorporated into the model.

 

Conclusion

Evaluating the quality of student experiences has been a central constituent of the process of introducing distributed learning through virtual learning environments at Staffordshire University. This paper has provided an overview of an evaluation of this initiative. As emphasised throughout, the power of the model has been its attempt to place the learner at the centre of the evaluative process. Evaluating the ‘quality’ of learning experiences began from the student and worked outwards. Clearly, the holistic nature of the study has generated a substantial amount of data, which could not all be fully analysed here. However, from the data presented in this paper, several key conclusions can be drawn:

  • When VLEs are designed so that they focus on passive strategies (content-driven, poor support and little use of ESDs), students are critical and highly instrumental in their use of them. When elements of the VLE support active learning (e.g. successful use of the courseroom), these are much better received.
  • A balance between learning on-line and face-to-face learning is required. There may be a need for a transitional period for students and tutors. This would allow tutors to receive greater feedback regarding the VLE materials they have produced, and students would feel they had the opportunity to bring their problems and ideas together with other learners and their tutors.
  • Students need to have more opportunities to discuss their learning with other students non-electronically. The most successful model for encouraging asynchronous discussions involved student teams who met in person to discuss ideas and then presented their findings to the other students for comments. This reduced the frustration that students felt with asynchronous methods.
  • Tutors need to have a greater awareness of how to facilitate electronic discussions.
  • There is no right or wrong way to present the content on a VLE. A linear approach is a particularly good starting point for tutors, as it allows them to manage the student experience more easily. Students can then be guided towards more independent learning. Resource-based learning approaches give students much more freedom to define their own short-term learning goals and manage their own time. However, it was found that students need well-developed time-management practices and the skills of independent learning to benefit from this.
  • A range of individual differences influences the quality of the learning experience.
  • The majority of students interviewed expressed a preference for downloading their materials to hard copy. Subjects that were more ‘content-based’ provided more opportunity for students to do this. Other reasons for this preference include the physical discomfort and inconvenience of computer rooms and the difficulty of reading from a screen for long periods.
  • Students report feeling less a part of a learning community. Strategies to combat this, such as more successful use of the courserooms, need to be explored.

 

References

  • Dweck, C. S. & Elliott, E. (1983). Achievement motivation. In E. M. Hetherington (Ed.) Handbook of Child Psychology, Vol. IV: Social and Personality Development, New York: Wiley, 643-692.
  • Fransella, F. (1981). Personal construct psychology, and Repertory grid technique. In Fransella, F. (Ed.) Personality-Theory, Measurement and Research, London: Methuen, 26-58.
  • Hunt, D. E. (1987). Beginning with ourselves: In practice, theory and human affairs, Cambridge, MA: Brookline.
  • Marton, F. (1981). Phenomenography – describing conceptions of the world around us. Instructional Science, 10, 177-200.
  • Nicholls, J. G., Patashnick, M. & Nolen, S. B. (1985). Adolescents’ theories of education. Journal of Educational Psychology, 77, 683-692.
  • Oliver, M. (1998). Innovation in the Evaluation of Learning Technology, London: University of North London Press.
  • Richardson, J. A. (1999). A large-scale ‘local’ evaluation of the quality of students’ learning experiences using virtual learning environments: guidelines, Staffordshire University, England: Learning Development Centre.
  • Richardson, J. A. & Turner, A. E. (2000a). Tutors experiences of learning and teaching through VLEs, Learning Development Centre: Staffordshire University, England.
  • Richardson, J. A. & Turner, A. E. (2000b). Encouraging students to participate in on-line discussions, Learning Development Centre: Staffordshire University, England.
  • Riding, R. J. (1991). Cognitive Styles Analysis, Birmingham, UK: Birmingham Learning and Training Technology Centre.
  • Svensson, L. & Theman, J. (1983). The relation between categories of description and an interview protocol in a case of phenomenographic research. Paper presented at the Second Annual Human Science Research Conference, Duquesne University, Pittsburgh, PA.
  • Valcke, M. M. A., Martens, R. L., Poelmans, P. H. A. G. & Daal, M. M. (1993). The actual use of embedded support devices in self-study materials by students in a distance education setting. Distance Education, 14 (1), 55-84.
  • Vernon, P. E. (1976). Strategies of learning. British Journal of Educational Psychology, 46 (12), 8-48.
