Automated Tutorial and Assignment Assessment
Roger F. Browne
Three developments are significant for the context described in this paper. Firstly, intelligent tutoring systems (ITS), which originated from the computer-assisted learning systems of the 1970s, have the ability to respond to the needs of individual students. Secondly, powerful computer-based modelling packages such as Matlab are available that provide immediate feedback when applied to relatively simple problems (that is, at a level appropriate to undergraduate engineering students). Thirdly, the Internet provides an attractive means of managing tutorials and assessments, especially for large classes.
Real-world systems in engineering and technology are typically complicated, so direct experiments on entire systems may not be practical. Examples include:
Clearly these experiments cannot be performed on real-world systems owing to safety and health concerns. In some cases the use of physical models may be appropriate, although computer models now dominate in the engineering industry.
Within the context of a university engineering course, computer modelling has two aims:
Thus, mathematical modelling is an important tool, both within universities and in industry. An introduction to mathematical modelling is provided in “Technological Mathematics” (Massey University Calendar, 2001), a core paper for all options of the Bachelor of Engineering (BE) and Bachelor of Technology (BTech) degrees at Massey University. This paper is available to internal students only, and in recent years the class roll has varied between 150 and 280. Within the Technological Mathematics course, the objectives are to provide students with the ability to formulate mathematical models, code the resulting equations in Matlab, and interpret the outputs of the simulations. The use of simulations as a teaching tool is not considered here, as it is the assessment process that is relevant to this discussion. A Matlab-based assignment, used both as a tutorial and as part of the course assessment, is reviewed below, followed by an outline of some proposed improvements to this process.
Computer simulations are important in both education and industry. According to Alessi & Trollip (1991), “In a simulation the student learns by actually performing the activities to be learned in a context that is similar to the real world” (p. 119). The authors observed that, compared to conventional tutorials, simulations provide better student motivation, offer a better transfer of learning, and are more efficient as a learning experience.
The choice of simulation language is important. Various special-purpose languages have been developed for learning through simulation; for example, SMISLE, a System for Multimedia Integrated Simulation Learning Environments (see de Jong et al., 1994). However, Matlab has the advantage that it is fully industrial-strength software and provides a rich library of built-in functions.
Peña & Alessi (1999) investigated the effects of three different presentation formats, Microcomputer-based Laboratory (MBL), simulation, and computer-based text, on individuals’ ability to understand concepts in physics. They concluded that the MBL and the simulation presentation formats were of equal effectiveness, and both were more effective than computer-based text in the context of the desired learning actions.
Basis of the mathematical modelling assignment
The objective of the mathematical modelling section of the Technological Mathematics course is that, after completing the course, the students will be able to:
The original modelling exercises had been in existence for over ten years and, given this established usage, were not re-evaluated. The motivation for the automated process described in this paper was:
In its present form the assignment involves the input of numerical parameters from which, through computer-based experimentation, appropriate solutions are derived. When the assignment was first introduced, plagiarism was a significant problem. Subsequently a scheme was implemented in which a portion of each student's ID (identification number) was used to generate a selection of the input parameters. However, various constraints must be imposed on the parameter values, and simple procedures were introduced to allow students to derive the values of their own parameters.
The first assignment question involves simulating the flight of a golf ball (which is subject to the forces of gravity and air friction) such that it travels 100 metres before striking the ground (achieving a hole in one on a 100 metre fairway). An infinite number of combinations of initial angle (relative to the horizontal) and initial velocity will achieve this result, but by constraining the angle to be a function of the ID, a velocity unique to that ID results. The technique initially adopted for determining the value of the initial angle was as follows. Let the last two digits of the ID be ‘pq’. Then the initial angle of the ball is ‘4p.q’ degrees. For instance, an ID of 12345678 will result in an initial angle of 47.8 degrees. The task for the students is then to estimate the appropriate velocity, perform a computer simulation, and use the result to refine their estimate. Usually an answer within the required bounds is obtained within four or five iterations. A typical simulation result is shown in Figure 1.
Figure 1. A typical golf ball simulation
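The angle derivation and the iterative search for the matching velocity can be sketched as follows. The course implementation is in Matlab and its drag model and parameter values are not given in the text, so this Python sketch uses an assumed drag constant `k` and replaces the student's manual guess-and-refine loop with a bisection search; the resulting velocities will therefore not match those quoted later in the paper.

```python
import math

def angle_from_id(student_id: str) -> float:
    """Last two ID digits 'pq' give an initial angle of '4p.q' degrees."""
    p, q = student_id[-2], student_id[-1]
    return float(f"4{p}.{q}")

def flight_range(v0, angle_deg, k=0.0025, dt=0.001, g=9.81):
    """Euler simulation of a projectile with quadratic air drag.
    k (drag per unit mass) is an assumed illustrative value, not the
    course model's actual parameter."""
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:
        v = math.hypot(vx, vy)
        vx -= k * v * vx * dt
        vy -= (g + k * v * vy) * dt
        x += vx * dt
        y += vy * dt
    return x  # horizontal distance travelled when the ball lands

def velocity_for_range(angle_deg, target=100.0, lo=10.0, hi=150.0):
    """Bisect on launch velocity until the simulated range hits the target
    (standing in for the student's estimate-simulate-refine iterations)."""
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if flight_range(mid, angle_deg) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

angle = angle_from_id("12345678")   # -> 47.8 degrees, as in the text
v = velocity_for_range(angle)
```

A student performing the same search by hand converges in four or five iterations because the range is monotonic in the launch velocity over the region of interest.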
This has worked well in most cases. However, certain input parameters provide fortuitously simple solutions. For instance, an ID ending in ‘80’ will give an initial angle of 48.0 degrees, and the corresponding velocity is 70 (within 0.1 metres/second). If the student uses initial guesses of velocity in multiples of ten then the correct value will be arrived at within very few iterations, possibly on the first guess. In contrast, an ID ending in ‘76’ results in a velocity of 69.5, which is unlikely to be arrived at by guessing. So a more robust parameter-derivation scheme has been developed.
Using a Java applet accessed through an HTML page, the students input their ID numbers and have a suitable parameter set returned to them, based on the full eight digits of the ID. A portion of this process is shown in Figure 2. During the automated tutorial marking, correct answers can then be uniquely associated with a given ID. The process of establishing the table of acceptable parameters is fully automated, and the system uses the ID as a pointer into the parameter table.
Figure 2. Parameters obtained from a Java applet
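The ID-as-pointer lookup can be sketched as below. The real system is a Java applet and the paper does not describe the table layout, so the entries and field names here are hypothetical; the key point is that each entry has already been screened (e.g. to exclude the fortuitously simple cases noted above) and that the full eight-digit ID maps deterministically to one entry.

```python
# Hypothetical layout of the precomputed table of acceptable parameter
# sets; each entry has been screened so that the target velocity is not
# a 'round' value a student could hit on a first guess.
PARAMETER_TABLE = [
    {"angle_deg": 43.7, "target_m": 100.0},
    {"angle_deg": 47.8, "target_m": 100.0},
    {"angle_deg": 51.3, "target_m": 100.0},
    # ... filled in automatically when the table is generated
]

def parameters_for(student_id: str) -> dict:
    """Use the full eight-digit ID as a pointer into the table, so each
    ID maps deterministically to exactly one parameter set."""
    index = int(student_id) % len(PARAMETER_TABLE)
    return PARAMETER_TABLE[index]

params = parameters_for("12345678")
```

Because the mapping is deterministic, the marking system can later regenerate the same parameter set from the ID alone and check the submitted answers against it.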
The web-based assignment submission has been fully implemented. Currently the marking process is semi-automated, but it is hoped that a fully automated marking system will be in place soon. The only marking operation that would not be automated is the checking of graphs, but it has been found that virtually 100% of the computer-generated graphs in the submitted assignments are correct if the simulation results are correct. The production of a graph provides the student with visual confirmation that they are on the right track.
During this first year of electronic submission a number of problems have arisen:
In addition to the observations in the previous section, various improvements are under investigation:
In general, error identification in faulty answers is likely to provide useful information only if the number of probable (or common) faults is constrained within some bound that can, in principle, be estimated. An incorrect answer arising from an erroneous calculation is ambiguous if it is indistinguishable, within some specified accuracy, from the answer arising from a different error in that same calculation. For simulation problems having a random distribution of input parameters, and in which the simulation answers are reported to a specified number of digits, the probability that two different errors would result in the same answer is as shown in Table 1.
Table 1. The probability of coincident errors
This table has been derived on the basis that all errors are equiprobable. Although this is only approximately true for the modelling problem described in this paper, the values in the table do indicate the levels of confidence with which the causes of erroneous answers can be correctly identified. For instance, if simulation results are considered accurate to four significant figures, then the probability of two erroneous answers being the same is relatively small (approximately 1 in 60, or 0.0167) if there are 25 or fewer possible sources of error, but much larger (approximately 1 in 10, or 0.1011) in the presence of 100 possible sources of error.
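One way to model this kind of coincidence is the standard birthday-problem calculation sketched below, which assumes each erroneous answer is uniformly distributed over some number of distinguishable reported values. This is only an illustrative model: the paper does not state the value space underlying Table 1, so the probabilities produced here for any particular choice of `m_values` should not be expected to reproduce the table's figures exactly.

```python
def collision_probability(n_errors: int, m_values: int) -> float:
    """Probability that at least two of n_errors equiprobable erroneous
    answers coincide, assuming each answer is uniform over m_values
    distinguishable reported results (birthday-problem calculation)."""
    p_distinct = 1.0
    for i in range(n_errors):
        # probability the (i+1)-th answer differs from all previous ones
        p_distinct *= (m_values - i) / m_values
    return 1.0 - p_distinct
```

As expected, the collision probability grows with the number of possible error sources and shrinks as answers are reported to more significant figures (i.e. as the value space grows).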
Another consideration is the amount of work involved in identifying the possible errors (this must be coded into the assignment-submission software). In highly regular mathematical procedures (such as matrix manipulation) this would be straightforward, but in the assignment considered here the commonly occurring errors were only identified through the manual marking of assignments. The most common errors were incorrect signs. Finally, error identification would be very difficult to implement in the presence of multiple errors.
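The diagnosis step itself can be sketched as follows: re-run the reference calculation with each known error mode injected and report any mode whose faulty answer matches the submission. The model, error modes, and function names here are hypothetical stand-ins; the actual assignment software's catalogue of error modes was built from the errors observed during manual marking.

```python
def reference_model(v0, g=9.81):
    """Trivial stand-in model: time of flight for a straight-up launch.
    The real system would re-run the Matlab simulation instead."""
    return 2 * v0 / g

# Hypothetical catalogue of error modes observed during manual marking
# (sign errors were the most common); each maps a name to a faulty
# variant of the reference model.
ERROR_MODES = {
    "sign_error_on_g": lambda v0: 2 * v0 / -9.81,
    "halved_velocity": lambda v0: 2 * (v0 / 2) / 9.81,
}

def diagnose(submitted, v0, tol=1e-3):
    """Return the names of error modes whose faulty answer matches the
    submission within tolerance; an empty list means 'unidentified'."""
    return [name for name, faulty in ERROR_MODES.items()
            if abs(faulty(v0) - submitted) <= tol]

# A submission whose sign is flipped is traced back to its error mode.
hits = diagnose(2 * 35.0 / -9.81, v0=35.0)   # -> ["sign_error_on_g"]
```

The ambiguity discussed above appears here directly: if two entries in the catalogue produced answers within `tol` of each other, `diagnose` would return both names and the cause could not be identified uniquely.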
Web-based assignment submission, combined with automated marking, can provide a quick turnaround. It also provides feedback to students (especially when their assignments contain errors) and to tutors (allowing trends and common sources of error to be readily identified). In at least 75% of cases, incorrect answers can be ‘reverse-engineered’ to determine the source of the error. However, the implementation of a fully automated system faces many challenges.