Educational Technology & Society 3(4) 2000
ISSN 1436-4522

W3LS: Evaluation framework for World Wide Web learning

Jan van der Veen
DINKEL Educational Centre
University of Twente
p.o.box 217, 7500AE Enschede
The Netherlands
Tel: +31 53 4893273
Fax: +31 53 4893183
j.t.vanderveen@dinkel.utwente.nl

Wim de Boer
Faculty of Educational Science and Technology
University of Twente
p.o.box 217, 7500AE Enschede
The Netherlands
Tel:  +31 53 4893092
w.f.deboer@edte.utwente.nl

Maarten van de Ven
Centre for Didactics and Education Development
Faculty of Technology and Society
Delft University of Technology
Kanaalweg 2B, 2628 EB Delft
The Netherlands
Tel: +31 15 2784686
Fax: +31 15 2784627
m.j.j.m.vandeven@tbm.tudelft.nl

 

ABSTRACT

An evaluation framework for World Wide Web learning environments has been developed. The W3LS (WWW Learning Support) evaluation framework presented in this article is meant to support the evaluation of the actual use of Web learning environments. It indicates how the evaluation can be set up using questionnaires and interviews among other methods. The major evaluation aspects and relevant 'stakeholders' are identified. First results of cases using the W3LS evaluation framework are reported from different Higher Education institutes in the Netherlands. The usability of the framework is evaluated, and future developments in the evaluation of Web learning in Higher Education in the Netherlands are discussed.

Keywords: Web learning environment, Tele-learning, Evaluation framework, Case-studies


Introduction

Web learning environments connect instructors, students and learning resources by means of the World Wide Web. Many institutes for Higher Education have shifted their attention from investing in educational courseware to investing in Web learning environments. Traditional students as well as distance learning students are expected to benefit from this approach. Several studies (e.g. Bauer, 1998; Landon, 1999) have suggested methods for selecting the best Web learning environment from the growing list of available tools (such as Lotus LearningSpace, WebCT, CourseInfo, etc.). Following on from this work, researchers have now shifted their focus to optimising the use of these environments in education (e.g. Collis, 1999). A large number of evaluations of particular Web learning environments appear in the current literature. However, comparing evaluations and applying their results in other situations is difficult, because each of these evaluations focuses on different aspects of learning in a Web environment. The W3LS project (van der Veen, 1999) aims to stimulate standardised collection and analysis of evaluation results from the use of Web learning environments in the practice of Higher Education. A consortium of Dutch institutes for Higher Education initiated the project. The results of this project (an evaluation framework, questionnaires, case-study reports and a Web site with an overview of all this) are intended to act as resources for instructors, Web developers and policy makers who are involved in the implementation and use of Web learning environments in Higher Education. They can use these results in two ways: the evaluation framework can be used to evaluate their own practice, and the case studies can be used to build or improve their courses.

The first three case studies have been performed to test the evaluation framework. Here we present the main results by comparing the outcomes of the three case studies. We also use these to assess the usability of the evaluation framework. In 2000-2001, we hope to collect more data from case studies, primarily in the field of Higher Education.

 

Evaluation aspects

The success or failure of an experiment depends on more than one aspect. In this study we distinguish five aspects: Education, Ease of use, Techniques & maintenance, Organisation, and Costs & benefits. Each aspect of the framework is addressed below, together with its major question.

 

Evaluation aspect 1: Education

Main question: Did the learning environment help reach the educational goals?

Overall learning goals like quality and productivity are important, but flexibility and motivational aspects can also play an important role. In the design phase of courses, learning activities are scheduled that can help reach these goals. A Web learning environment consists of a set of tools that can be utilised for a range of activities. For each combination of ‘activity & tool’, the success can be determined from the gathered evaluation data. The results of this ‘task-medium fit analysis’ are summarised in an activity-tool matrix (see Table 1). Experiences with, for instance, ‘online discussion’ can thus be compared across case studies. By looking at this level of detail, we can draw conclusions about which parts of the Web learning environment have been successful.

Activity \ Tool         | Search engines | Web file archive | Web discussion group | Chat     | Video-conference | Email
Looking for information | positive       | neutral          |                      |          |                  |
Online discussion       |                | negative         | negative             | positive | positive         | neutral
Planning                |                |                  |                      | neutral  | positive         |
Publishing              |                | positive         |                      |          |                  |
Giving feedback         |                | negative         |                      |          |                  | positive
Progress monitoring     |                | negative         |                      |          |                  |

Table 1. "Activity & tool" matrix for an arbitrary course.
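To make the idea of the task-medium fit analysis concrete, the sketch below shows one way in which individual ratings per ‘activity & tool’ combination could be aggregated into a summary matrix like Table 1. This is an illustration of ours, not part of the W3LS materials; the rating scale, the example data and the threshold are hypothetical.

```python
# Illustrative sketch (not part of the W3LS framework): aggregating individual
# ratings per (activity, tool) combination into the verbal judgements of Table 1.
# The -1..+1 rating scale, the example data and the threshold are assumptions.
from collections import defaultdict

ratings = {
    ("Online discussion", "Chat"): [1, 1, 0, 1],
    ("Online discussion", "Web file archive"): [-1, -1, 0],
    ("Publishing", "Web file archive"): [1, 0, 1],
}

def summarise(scores, threshold=0.25):
    """Map the mean rating of a cell to 'positive', 'neutral' or 'negative'."""
    mean = sum(scores) / len(scores)
    if mean >= threshold:
        return "positive"
    if mean <= -threshold:
        return "negative"
    return "neutral"

matrix = defaultdict(dict)
for (activity, tool), scores in ratings.items():
    matrix[activity][tool] = summarise(scores)

for activity, row in matrix.items():
    print(activity, row)
```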

 

Evaluation aspect 2: Ease of use

Main question: Is the Web learning environment easy to use?

With respect to usability we focus on aspects of the user interface. Is the user interface intuitive and easy to understand? Can major components be reached efficiently? Can novices get started easily? And does it take much time to become an expert? Sweeney, Maguire and Shackel (1993) outline how usability research can be utilised in the starting phase of projects as a means to test whether users perceive the Web learning environment as intended.

 

Evaluation aspect 3: Techniques & maintenance

Main question: Did the Web learning environment technically function correctly?

Limited availability of Internet-connected computers, network problems, poor performance and server downtime can frustrate not only students but also instructors. A sound infrastructure and a well-prepared introduction of the Web learning environment do pay off. For technical staff the installation, maintenance, user administration and security are serious issues that need to be part of the evaluation. Under this aspect we also address the flexibility and openness of the learning environment software: can it work together with other programs involved in education or administration?

 

Evaluation aspect 4: Organisation

Main question: Was all the required expertise available?

It is important to check whether all organisational conditions are fulfilled to make the use of a Web learning environment a success. At the management level an important condition is the presence of an institutional strategy into which the introduction of the Web learning environment fits. For instructors, the availability of scheduled time, training, support staff, necessary equipment and resources are important factors. In research-oriented universities, it is important to check whether there is a payoff for instructors who put their best effort into education. With respect to the students, the organisation should be clear about how it expects them to make use of the Web learning environment. Can they find sufficient Internet-connected computers at the department or library? The alternative is that students are expected to purchase their own up-to-date computer and get connected to the Internet. Finally, sufficient technical and support staff should be trained and prepared.

 

Evaluation aspect 5: Costs & benefits

Main question: Are the costs reasonable, both at the start and after wide-scale implementation?

When considering costs and benefits, it is important to take into account non-financial as well as financial elements. Quite often non-financial economic aspects are key factors in the decision to invest in the introduction of Web learning environments. Many institutes are not even aware of the costs of their ICT projects (Davis, 1997). For large-scale implementations, however, an inventory of costs is necessary. Bartolic (1998) gives some Canadian examples of cost-benefit analyses. Alexander & McKenzie (1998) have performed a cost-benefit analysis of educational ICT projects in Australia, addressing non-financial costs and benefits as well.

 

Data collection

Apart from student appreciation and instructor opinions, important information should be gathered from development staff, technical support staff and the management level. For each of the evaluation aspects the evaluation framework offers a set of questions, to be answered by all educational stakeholders involved. An example question is shown in Table 2.

 

How do you rate the following components of the Web learning environment?

Component                        | Completely useless | Useless | Neutral | Useful | Very useful
Email for submissions            |         o          |    o    |    o    |   o    |      o
Email for commenting             |         o          |    o    |    o    |   o    |      o
Email for asking questions       |         o          |    o    |    o    |   o    |      o
Online discussion                |         o          |    o    |    o    |   o    |      o
Maillist for all group members   |         o          |    o    |    o    |   o    |      o
File archive                     |         o          |    o    |    o    |   o    |      o
Group products publishing area   |         o          |    o    |    o    |   o    |      o
Questions & answers              |         o          |    o    |    o    |   o    |      o
Chat                             |         o          |    o    |    o    |   o    |      o
Whiteboard                       |         o          |    o    |    o    |   o    |      o
Agenda                           |         o          |    o    |    o    |   o    |      o
Search course resources          |         o          |    o    |    o    |   o    |      o
Links to interesting Web sites   |         o          |    o    |    o    |   o    |      o
Results overview                 |         o          |    o    |    o    |   o    |      o

Table 2. Example question from the student questionnaire of a W3LS case study.
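As an aside on how such ratings can be processed, the sketch below tallies five-point answers into a mean score per component. The numeric coding of the scale and the example responses are our own assumptions for illustration; they are not prescribed by the W3LS framework.

```python
# Illustrative sketch: turning five-point ratings such as those in Table 2 into
# a mean score per component. The 1-5 coding and the example data are assumptions.
SCALE = {
    "completely useless": 1,
    "useless": 2,
    "neutral": 3,
    "useful": 4,
    "very useful": 5,
}

# Hypothetical responses: one dict per returned questionnaire.
responses = [
    {"Email for submissions": "useful", "Chat": "neutral"},
    {"Email for submissions": "very useful", "Chat": "useless"},
]

totals, counts = {}, {}
for response in responses:
    for component, answer in response.items():
        totals[component] = totals.get(component, 0) + SCALE[answer]
        counts[component] = counts.get(component, 0) + 1

for component, total in totals.items():
    print(f"{component}: mean rating {total / counts[component]:.1f}")
```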

 

The questions can be used either in questionnaires, which may be specified for each of the stakeholders, or in checklists if one or more stakeholders are interviewed. The questions needed to create W3LS-questionnaires or interview checklists are available online. In Table 3 we show which 'stakeholder' sources we expect to contribute most to the evaluation of each of the identified aspects.

Stakeholder \ Aspect | Education      | Ease of use       | Techniques & maintenance | Organisation      | Costs & benefits
Instructor           | primary source | primary source    | additional source        | additional source | additional source
Student              | primary source | primary source    | additional source        |                   |
Support staff        |                | additional source | primary source           | additional source |
Manager              |                |                   |                          | additional source | primary source

Table 3. Evaluation aspects and the stakeholders expected to provide information.
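The mapping in Table 3 can also be read as a recipe for assembling stakeholder-specific questionnaires or interview checklists. The sketch below encodes that mapping and selects the aspects on which a given stakeholder is questioned; the data structure and the helper function are our own illustration, not part of the online W3LS materials.

```python
# Illustrative sketch: selecting the evaluation aspects for a stakeholder's
# questionnaire or interview checklist, based on the Table 3 mapping.
SOURCES = {
    "Education": {"Instructor": "primary", "Student": "primary"},
    "Ease of use": {"Instructor": "primary", "Student": "primary",
                    "Support staff": "additional"},
    "Techniques & maintenance": {"Instructor": "additional", "Student": "additional",
                                 "Support staff": "primary"},
    "Organisation": {"Instructor": "additional", "Support staff": "additional",
                     "Manager": "additional"},
    "Costs & benefits": {"Instructor": "additional", "Manager": "primary"},
}

def aspects_for(stakeholder, role=None):
    """Return the aspects a stakeholder is asked about, optionally
    restricted to the 'primary' or 'additional' source role."""
    return [aspect for aspect, sources in SOURCES.items()
            if stakeholder in sources and (role is None or sources[stakeholder] == role)]

print(aspects_for("Student"))             # aspects covered by the student questionnaire
print(aspects_for("Manager", "primary"))  # -> ['Costs & benefits']
```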

 

Case studies using the evaluation framework

Three institutes of Higher Education in the Netherlands have volunteered to perform an evaluation of Web learning using the W3LS evaluation framework. At the Christelijke Hogeschool Noord Nederland in Leeuwarden (CHN) and at the Hogeschool Enschede (HE) Lotus LearningSpace was used; the Delft University of Technology used WebCT for its courses. In each case the research focussed on one course, and the stakeholders concerned were questioned. First a summary of the evaluation results is presented using the five evaluation aspects mentioned earlier, followed by a review of the usability of the evaluation framework.

Educational aspects

The results indicate that the institutes’ and instructors’ expectations with respect to Web learning environments are high. Instructors, for instance, expect added didactic value as a result of online discussions. The reasons mentioned for using the Web environments are mostly not didactical but are aimed at flexibility for students. Regular students, however, report limited added value. This can partly be explained by technical problems and a lack of Internet-connected computers. The highest appreciation was found in the case study of a course taken by professionals who combine a job with taking courses, a setting in which the practical benefits for students are clear. In general, students are mildly positive about this new way of learning. The actual use of the environments gives an indication of the take-up of Web learning by the learners. Table 4 gives an overview of the elements that were used most in the learning environments.

Part of the environment          | Leeuwarden n=6 (27) | Enschede n=7 (20) | Delft n=20 (70)
Email for submitting work        |                     | X                 |
Discussion / announcements       | X                   | X                 |
Schedule / roster                | X                   |                   | X
Downloading and uploading files  |                     |                   | X
View pages (of groups)           |                     |                   | X
Search in course materials       | X                   | X                 |
Overview of own results          | X                   | X                 |

Table 4. Most used elements of the Web learning environment.

 

The elements in the table were used at least once a week. Other options (such as Question & Answer, Chat, Whiteboard, WWW-Links, glossary, Video, etc.) were not used as much. Figure 1 shows the start page of one of the courses, with a variety of options.

Figure 1. Homepage of the Delft course using WebCT

 

Ease of use, technical & maintenance aspects

Students and instructors sometimes had problems with access to their environments. Users require high reliability from the Web learning environment. Most system administrators were satisfied with the performance of the systems (Lotus LearningSpace and WebCT), while some users complained about long waiting times. This can be due either to the performance of the environment server or to bandwidth limitations. Apart from these problems, both WebCT and LearningSpace were considered to be user-friendly.

 

Organisation aspects

There were many differences in the (technical and educational) support available in the three cases studied. In one case an instructor had to do everything by himself; in another case, a permanent support group trained the instructor and took care of technical issues.

 

Costs & benefits

The costs and benefits issue is an interesting one. None of the institutions regarded saving money (or time) as one of their goals. They found it more important to use new Web technologies in their education as a way to distinguish themselves from other educational institutes. No real savings are reported, whereas investments are high. Instructors report that they have to put more time into teaching after the introduction of the Web learning environment.

 

Reflections on the evaluation framework

The evaluation of the framework approach shows that all general concerns were adequately addressed. Instructors indicated that the list of questions on general issues was complete. However, they would like to have the opportunity to add questions specifically aimed at particular elements of their courses, such as specific exercises or assessment procedures. If such local issues are included, the outcomes of the evaluation procedure will be of greater help to the instructors. The task-medium approach gives a detailed insight into the parts of the Web learning environment that are most (and least) appreciated.

Comparing our experience with our project plan, we underestimated the effort needed to collect and analyse all information. Three examples may clarify this remark. Firstly, it was difficult to get hold of student addresses and to stimulate students to return questionnaires. Secondly, it was difficult to get hold of the appropriate stakeholders within the institute, such as the educational manager or the computer support specialist. Thirdly, after entering the data, it took us some time to present the information per case study in a standardised way. It is clear to us that in future case studies the evaluation procedure should be organised as early as possible, preferably well before the actual course starts. One of the first activities must be to identify stakeholders and involve them in the evaluation procedure.

 

Discussion and future developments

The general framework presented here can help structure evaluations of the use of Web learning environments if it is extended to address local concerns. The results of these evaluations may be consulted by people involved in educational change, for example instructors, educational designers and policy makers, but also by people involved in educational research. The first three case studies indicate that adding a Web learning environment to traditional education is not necessarily appreciated by the students. With a larger number of case-study reports in the near future, the results may be used to indicate which learning activities can be implemented successfully in Web learning environments and under which conditions.

The follow-up of the research presented here is twofold. On the one hand, more detailed research will be performed in two areas. The first area involves using and evaluating the evaluation procedure in large-scale projects. This will result in adjustments to the questionnaire and to the evaluation procedures. The second area consists of comparative studies in which the W3LS evaluation procedures and questionnaire will be compared to those used in other countries, e.g. Canada (Bartolic, 1998), Australia (Alexander, 1999), the United States (Ehrmann, 1998) and the United Kingdom (Harvey, 1999). On the other hand, the outcome of the W3LS project is already being used in practice. The questionnaire, which consists of five categories of questions, is used as a toolbox to develop particular questionnaires. These questionnaires are completed by adding specific questions aimed at particular topics of interest in the situation in which they are to be used. In addition, the W3LS project members will offer their expertise to potential users in order to improve their evaluation procedures.

The outcome of the W3LS project and other evaluation expertise will be offered to potential users via the Edusite, the portal of SURF-Educatief (SURF, 2000) for the field of ICT in Higher Education. This portal is aimed at employees of organisations operating in Dutch Higher Education. Furthermore, the portal will function as a means to distribute evaluation reports made by users of the Edusite, so that other users can use these reports to improve their own evaluation procedures. Hopefully, the spin-off of these activities will be that evaluation becomes an issue right from the start of initiatives to improve education.

 

Acknowledgements

This project is made possible by a grant from SURF-education, and by the efforts of the universities of Twente, Delft and Maastricht. We thank Wiebe Nijlunsing (van Hall institute, Leeuwarden), Jan Wijbenga (CHN, Leeuwarden) and Co Braspenning (HE, Enschede) for their participation in the case studies.

 

References

Alexander, S. & McKenzie, J. (1998). An Evaluation of Information Technology Projects for University Learning, Committee for University Teaching and Staff Development. Australia,
http://www.canberra.edu.au/CUTSD

Bartolic-Zlomislic, S. (1998). The Costs & Benefits of Telelearning: Two Case Studies, Distance Education & Technology, The University of British Columbia,
http://research.cstudies.ubc.ca/

Bauer, C. & Glasson, B. (1998). A Case Study Evaluation of Two Web-Based Courseware Tools. Proceedings of TeleTeach98, 99-108.

Collis, B. A. (1999). Systems for WWW-Based Course Support: Technical, Pedagogical, and Institutional Options. International Journal of Educational Telecommunications, 5 (4),
http://www.aace.org/pubs/ijet/v5n4.htm.

Davis, N. (1997). Information Technology Assisted Teaching and Learning in Higher Education, HEFCE Research Series, M 11/97. HEFCE, UK,
http://www.hefce.ac.uk.

Ehrmann, S. C. (1998). Studying teaching, learning and technology: a tool kit from the Flashlight Program,
http://www.cti.ac.uk/publ/actlea/al9pdf/ehrmann.pdf

Harvey, J. (1999). Evaluation Cookbook, Learning Technology Dissemination Initiative. Heriot Watt University Edinburgh,
http://www.icbl.hw.ac.uk/ltdi/cookbook.

Landon, B. (1998). Online educational delivery applications: a web tool for comparative analysis,
http://www.ctt.bc.ca/landonline.

SURF (2000). SURF Edusite, the Netherlands,
http://www.surf.nl/edusite.

Sweeney, M., Maguire, M. & Shackel, B.  (1993). Evaluating user-computer interaction: a framework. International Journal of Man-Machine Studies, 38, 689-711. 

Veen, J. T. van der (1999). World Wide Web Learning Support. Project Web-site,
http://www.oc.utwente.nl/w3ls.

