Educational Technology & Society 6 (1) 2003
ISSN 1436-4522

The use of SMIRP for the rapid design and implementation of pedagogical constructs: Case study of a question-answer-reference framework

Jean-Claude Bradley and Donald McEachron
Department of Chemistry, School of Biomedical Engineering
Drexel University
Philadelphia, PA 19104, USA
Jean-Claude.Bradley@drexel.edu

David Dorsey, Benjamin Samuel, Sundar Babu, Jeremy Boecker, Mohammad Haghkar and Jay Bhatt
Hagerty Library
Drexel University
Philadelphia, PA 19104, USA

 

ABSTRACT

The use of SMIRP, a web-based collaborative tool, for an application in an undergraduate and a graduate class is described.  SMIRP was used to rapidly construct a collaborative space where students could work on their assignment, request assistance and view their grades.  The pedagogical construct was based on a question-answer-reference model in which students were required to answer a series of questions based only on the material present in references they selected from the open literature.  The answers and grades of all students were visible to all students in real time, although pseudonyms were used to protect student privacy.  Email alerts were provided to the teacher and teaching assistants and, in the second class, also to the students and a librarian.  Based on the analysis of log files, overall student performance in the class was found to correlate positively with curiosity and negatively with procrastination.  Student expectations of turnaround times for grades and general queries were also analyzed and compared to actual performance.  At the end of both classes a questionnaire module was created, and an analysis of student satisfaction and preferences is reported.  The successful implementation of SMIRP in these two classes supports the contention that this collaborative tool is flexible enough for the rapid design and implementation of relatively complex pedagogical constructs, with the possibility of obtaining detailed metrics.

Keywords: SMIRP, Pedagogical Construct, Web-Based, Procrastination, Curiosity


Introduction

The advent of web-enabled technologies has dramatically expanded the modalities of the educational process (Squires & Preece, 1999; Montelpare & Williams, 2000; Niederhauser & Stoddart, 2001). Content delivery becomes more versatile: class notes, syllabi and multimedia resources can be accessed from virtually anywhere, with better ways to search the information they contain.  The receipt of student work, its evaluation, and the communication of the evaluation to the student can also become more convenient, with more opportunities for automated processing.  Not only can web-based education enhance the individualized learning experience (Martinez, 2001; Virvou & Moundridou, 2000), it can also create interaction spaces for students to collaborate on projects either synchronously or asynchronously (Isenhour et al., 2000; Nachmias et al., 2000; Curran, 2002). Although several commercial and non-commercial packages handle these features, the range of possible pedagogical constructs in off-the-shelf products is generally limited.  The alternative, building a custom application for a given implementation, is often prohibitive because of the time and resources such an undertaking requires.

SMIRP is a flexible modular collaborative tool that was originally designed to track and manage the dynamic environment of a discovery-driven laboratory research operation (Bradley & Samuel, 2000). By enabling not only data entry but also modification of the database structure through a simple browser-based interface, SMIRP can allow selected users to adapt their collaborative environment to changing needs.  In this sense, SMIRP has a self-evolving database structure, in contrast to conventional database designs that are generally constructed top-down.  This flexibility was exploited to rapidly construct an open collaborative environment where student, teacher, teaching assistant and librarian could perform their respective tasks for a research based class assignment.

It is the purpose of this report to detail the implementation of SMIRP in two class applications, a graduate course in Human Physiology and a senior undergraduate course in Chronobiology, both at Drexel University.  The same pedagogical construct, as detailed below, was used in both cases.  Email alerts were provided in both classes to inform the teacher and class assistants of the progress of the course.  In the second implementation, the students and a librarian also received email alerts.  At the end of the class a questionnaire was added as an additional SMIRP module, and feedback was collected from the students about the pedagogical construct and the use of SMIRP to assist in the overall process.  Log files tracking both views and record additions were then used to quantify student behavior and the responsiveness of the teacher and the teaching assistants.  Student behavior was then compared with final class performance to identify correlations.

 

The pedagogical construct

The basic elements of SMIRP consist of modules, parameters and records (Bradley & Samuel, 2000) (see Figure 1). Modules can contain any number of parameters and are interconnected by a special type of parameter that permits navigation between records from different modules with a single click.  A record is an instance of a module in which the parameters are assigned specific values.  The creation of modules, parameters and their interrelationships is done easily and rapidly through a browser interface, generally Internet Explorer 5.  In this particular SMIRP application the modules shown in Figure 2 were created, with the specific functions and permissions shown in Table 1.  A typical screenshot of the SMIRP interface is shown in Figure 3.
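The report describes SMIRP only at this conceptual level, so the following is a minimal Python sketch of the module/parameter/record relationship just outlined; all class and field names are hypothetical illustrations, not SMIRP's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Module:
    """A named collection of parameters, plus link parameters to other modules."""
    name: str
    parameters: list[str] = field(default_factory=list)
    links: list[str] = field(default_factory=list)  # one-click navigation targets

@dataclass
class Record:
    """An instance of a module in which parameters carry specific values."""
    module: str
    values: dict[str, str] = field(default_factory=dict)
    linked: dict[str, int] = field(default_factory=dict)  # module name -> record id

# Recreating part of the structure summarized in Table 1:
answers = Module("Answers",
                 parameters=["Answer Text", "Comment"],
                 links=["Students", "Questions", "References", "Grades"])

# One student's answer, linked to a question record and a reference record:
rec = Record("Answers",
             values={"Answer Text": "Circadian rhythms are..."},
             linked={"Questions": 3, "References": 17})
```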

The construct was based on a question-answer-reference framework.  The students were given a list of questions to answer by a certain date. They were instructed not only to answer the questions but also to link each answer to one to three electronically available peer-reviewed references that support their answer.  Each answer was graded separately by the professor, who verified that the answer was satisfactorily supported by each reference.  Students were given read access to everything except the personal information of other students.  Thus, they had the ability to see in real time how other students, identified only by pseudonym, were doing and how they were being graded.  Since they were required to answer the same questions but could not use the same references, this set up an environment that was both competitive, in the sense that students were competing for the same references from the open literature, and collaborative, in the sense that they could use real-time information about how the professor graded answers and reference selections.

Email alerts were sent on an hourly basis if any activity had taken place within the previous hour.  One type of alert summarized recent modifications or additions and another summarized all activity, including record views.  The teacher and teaching assistants received both types of alerts in both classes.  In the second class the alerts were also sent to the students and to a librarian who was responsible for teaching the students how to search for and select peer-reviewed articles.  Another modification in the second class was the substitution of two teaching assistants for a professor to answer general queries and technical issues.  The first assistant was instructed to respond to questions and comments within 24 hours of posting and the second to audit the process at the 48-hour mark.
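The hourly alert cycle can be pictured as a simple digest job. The sketch below is a hypothetical reconstruction (the report does not publish SMIRP's alerting code, and the log schema is assumed): it builds the two digest types described above and stays silent for a quiet hour.

```python
import datetime

def build_hourly_digests(log, now):
    """Return the two digest bodies, or None if nothing happened in the past hour.

    `log` is assumed to be a list of dicts like
    {"time": datetime, "user": str, "action": "add"|"modify"|"view",
     "module": str, "record": int}.
    """
    recent = [e for e in log if e["time"] >= now - datetime.timedelta(hours=1)]
    if not recent:
        return None  # no alert is sent for a quiet hour

    def fmt(events):
        return "\n".join(f'{e["time"]:%H:%M} {e["user"]} {e["action"]} '
                         f'{e["module"]}#{e["record"]}' for e in events)

    changes = [e for e in recent if e["action"] in ("add", "modify")]
    # Alert type 1: additions/modifications only; type 2: all activity incl. views.
    return {"changes_digest": fmt(changes) or "(views only this hour)",
            "full_digest": fmt(recent)}
```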

 


 


Figure 3. Screenshot of a record in the Answers module

 

| Module Name | Function | Inter-module links | Student Permissions |
|---|---|---|---|
| 1. Students | Mainly used to associate answers and references with a particular student, identified by pseudonym. | Answers, References, Grades, Questionnaire | Read All |
| 2. Questions | A series of open-ended questions generated by the teacher. Students were required to answer 8 of 8 questions in the first class and 6 of 12 questions in the second class. | Answers | Read All |
| 3. Answers | Approx. 500-word answers to the questions in Module 2. Each answer was required to link to at least one and up to three references from the References module. | Students, Questions, References, Grades | Read All; Add Record; Modify Record (self only) |
| 4. Grades | A list of grading categories (A, B, C, D, F). In addition, students selected the category "ANSWER FINISHED-PLEASE GRADE" when ready for grading. | Answers, References, Grades | Read All |
| 5. References | Links to peer-reviewed references. In the first class references were graded; in the second they were not. | Students, Answers, Grades | Read All; Add Record; Modify Record (self only) |
| 6. Comments/Problems | A location for students to ask general questions about the class or technical issues. This is in addition to the comment parameter in the Answers module. | Problem Type | Read All; Add Record; Modify |
| 7. Problem Type | Categories of problems reported by students. | Comments/Problems | Read All |
| 8. Student Information | Contains the real name and contact information of each student. | Students | Read (self only); Modify (self only) |
| 9. Questionnaire | Created and made visible to students at the end of the class. Provides an evaluation of the assignment and satisfaction with response times. | Students | Read (self only); Modify (self only) |

Table 1. Summary of module functions, links and student permissions
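The student permissions in Table 1 reduce to three kinds of rules: read everything, read or modify only one's own records, and permission to add records. A minimal sketch of how such a rule table could be enforced follows; this is hypothetical, as the report does not describe SMIRP's access-control implementation.

```python
# Hypothetical rule table mirroring part of Table 1.
PERMISSIONS = {
    "Answers":             {"read": "all",  "add": True,  "modify": "self"},
    "Questions":           {"read": "all",  "add": False, "modify": None},
    "Student Information": {"read": "self", "add": False, "modify": "self"},
}

def student_may(action, module, student, record_owner=None):
    """Evaluate a student action ("read", "add" or "modify") against the rules."""
    rule = PERMISSIONS[module][action]
    if rule in (True, "all"):
        return True
    if rule == "self":
        return record_owner == student
    return False  # rule is False or None

assert student_may("read", "Answers", "hawk")                        # Read All
assert not student_may("modify", "Answers", "hawk", "raven")         # self only
assert student_may("modify", "Student Information", "hawk", "hawk")
```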

 

Results and Discussion

Assessment of class operation

Two metrics were used to quantitatively assess the way SMIRP was used to manage the class assignment.  The first relates to the grading of answers.  In both classes the majority of students used SMIRP to verify that their answers were graded (see Table 2).  Measured from the time students requested to be graded by selecting “ANSWER FINISHED-PLEASE GRADE” from the grading parameter in the “Answers” module, the average number of days students waited before first checking whether their answer was graded was fairly consistent between the first and second classes, 3.49 and 4.55 days respectively.  The average number of days it actually took to grade ranged from 7.53 to 16.32 days, suggesting that student satisfaction should be improved by reducing grading time to values more consistent with student expectations.

The second metric was the turnaround time to address student comments and questions.  These could be posted in two places: the “Comments/Problems” module, for general questions about technical issues or class policy, and the comment parameter in the “Answers” module.  In the first class general questions were answered mainly by a professor, while in the second class they were answered by one teaching assistant and audited by a second, who ensured that the questions were addressed.  The first teaching assistant aimed to answer questions by the 24-hour mark, while the second ensured that all questions were answered by the 48-hour mark.  Both teaching assistants used SMIRP’s automated email alerting system to be notified when comments were made.  Average turnaround times in both classes were very close to 24 hours, and results from the student satisfaction survey (see Table 3) revealed that satisfaction with both the turnaround time and the quality of response was nearly unchanged, and even slightly better, in the second class, which did not use a professor to address comments.  This demonstrates that SMIRP’s automated alerting system can be used effectively to manage time-sensitive class processes, with quality control, and to provide quantitative assessment.
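As a concrete illustration, the grading-responsiveness figures above can be derived from timestamped log entries. The following is a hypothetical sketch with an assumed log schema and toy data, not SMIRP's actual log format:

```python
import datetime as dt
import statistics

def days_between(events, start_action, end_action):
    """Per-record delay in days between two logged actions.
    `events`: list of {"record": id, "action": str, "time": datetime} (assumed schema)."""
    starts = {e["record"]: e["time"] for e in events if e["action"] == start_action}
    ends = {e["record"]: e["time"] for e in events if e["action"] == end_action}
    return [(ends[r] - starts[r]).total_seconds() / 86400.0
            for r in starts if r in ends]

# Toy log: a student requests grading, then later checks whether a grade was posted.
log = [
    {"record": 1, "action": "request_grade", "time": dt.datetime(2002, 10, 1, 9)},
    {"record": 1, "action": "first_grade_check", "time": dt.datetime(2002, 10, 4, 21)},
    {"record": 2, "action": "request_grade", "time": dt.datetime(2002, 10, 2, 14)},
    {"record": 2, "action": "first_grade_check", "time": dt.datetime(2002, 10, 3, 14)},
]
waits = days_between(log, "request_grade", "first_grade_check")
print(f"average {statistics.mean(waits):.2f} d, median {statistics.median(waits):.2f} d")
```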

 

 

| Metric | Class 1 | Class 2 |
|---|---|---|
| Grading responsiveness | | |
| Days for students to check grades (average) | 3.49 | 4.55 |
| Days to grade (average) | 7.53 | 16.32 |
| Days for students to check grades (median) | 1.05 | 3.15 |
| Days to grade (median) | 6.80 | 11.77 |
| Percentage of answers checked for grading | 76.12% | 88.89% |
| Responsiveness to general queries | | |
| Days for response to student queries (average) | 1.15 | 1.04 |
| Days for response to student queries (median) | 0.78 | 1.00 |

Table 2. Responsiveness of class operations

 

Correlation of student behavior with overall student performance

This particular pedagogical design provided some opportunities to identify correlations between student behavior and overall class performance, in the hope of uncovering leading indicators.  Metrics corresponding to procrastination and curiosity were computed.  Procrastination was measured in two ways: as the delay between the start of the class and the first login, and as the delay to the first record creation.  Since the processes of finding references and composing an answer were separated, and because of the competition to reserve unique references, the first instance of record creation is likely to indicate the actual moment the student first began working on a question.  Overall student performance was measured using the final numerical grade of the student at the end of the class; the SMIRP assignment accounted for only 15% of that grade.  Figure 4 shows that, when both classes are considered, a negative correlation with performance exists for three of the four procrastination measures, the only exception being the delay to initial record creation in the first class.
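For illustration, the procrastination measures can be extracted from the log as the delay to each student's first action of a given type. The sketch below assumes a hypothetical log schema and uses toy data:

```python
import datetime as dt

def first_action_delay(log, class_start, action):
    """Days from class start to each student's first logged `action`
    ("login" or "add_record"); assumed schema: {"user", "action", "time"}."""
    first = {}
    for e in sorted(log, key=lambda e: e["time"]):
        if e["action"] == action:
            first.setdefault(e["user"], e["time"])  # keep only the earliest
    return {u: (t - class_start).total_seconds() / 86400.0
            for u, t in first.items()}

start = dt.datetime(2002, 9, 23)
log = [{"user": "hawk", "action": "add_record", "time": dt.datetime(2002, 9, 25, 12)},
       {"user": "raven", "action": "add_record", "time": dt.datetime(2002, 10, 14, 12)}]
print(first_action_delay(log, start, "add_record"))  # {'hawk': 2.5, 'raven': 21.5}
```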

Curiosity was also measured in several ways.  The first was to count the number of times a student viewed other students’ answers and references.  A second was the percentage of their own answers that a student checked for grading.  In both classes and for all analysis methods there is a general positive correlation between curiosity, whether directed at one’s own work or at others’, and final class performance.  The tendency is particularly significant for the most curious students: the few students who were especially curious, particularly about other students’ references, tended to do better than average.  The correlation is much less significant within the subgroup of students who were far less curious.
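A correlation such as those plotted in Figures 4 and 5 can be computed with the standard Pearson coefficient. The sketch below uses illustrative numbers only, not the study's data:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative values only: views of other students' records vs. final grade.
views = [3, 10, 25, 41, 60, 88]
grade = [71, 75, 80, 83, 87, 93]
print(f"r = {pearson_r(views, grade):+.2f}")  # positive r: more curiosity, higher grade
```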

The open-ended structure of this particular pedagogical design is likely to exacerbate inherent tendencies to procrastinate.  For example, it has been shown that more choices can lead to exacerbated procrastination (O’Donoghue & Rabin, 2001), as in this case where references must be chosen from a virtually infinite collection of articles in the open literature.  This tendency is likely further intensified by the stress of competing for the same articles.  Procrastinators do not do well under these conditions, partially because they shun competitiveness and partially because they have demonstrated a tendency to repeatedly search the same limited set of information instead of broadening their searches (Ferrari & Dovidio, 2001). To illustrate this point, we have found that procrastinators will complain that “all the good articles are taken” while failing to look at the answers and references of other students to see how the problem has been solved by others.  In one study, the most important reason for procrastination was found to be fear of failure, related to the protection of self-image under conditions of low self-esteem (Ferrari et al., 1998). In this context, the use of procrastination as a self-handicapping tool can be interpreted as an effort to protect self-image by attempting to evade evaluation (Ferrari & Tice, 2000). The reasoning is that the act of self-sabotage can serve as an excuse for the poor performance; without such an excuse the procrastinator would have to accept a potentially poor evaluation as valid.  Other forms of self-sabotage were observed, such as attempting to use another student’s reference even though it was made very clear that reference uniqueness was a key component of the assignment and that duplication would affect the evaluation in an extremely negative way.

Just as anxiety has been shown to exacerbate procrastination, it has been demonstrated that it inhibits curiosity and learning by competing for attention (Higgins et al., 1992).  The principal motives for curiosity have been identified, in generally increasing order of importance, as complexity, novelty and uncertainty (Wentworth & Witryol, 1983). It is probably most accurate to characterize the motive behind the present curiosity measurement as uncertainty with respect to the suitability of the student’s answers, selection of references and progress in the class.  Since there was no external reward associated with the display of curious behavior in this context, it is probably safe to assume that our characterization of curiosity is valid.  The use of objective metrics derived from SMIRP use in this application is preferable to teacher assessment of student curiosity.  Such subjective assessments have been shown to be unreliable because teachers were unable to separate their assessments of student curiosity from intellectual capability (Alberti & Witryol, 1994). However, both personality inventory tests (Mills & Parker, 1998) and objective measurements of novelty-based curiosity (Alberti & Witryol, 1994) have demonstrated a positive correlation between curiosity and performance.

The results obtained thus far suggest that it may be worthwhile to intervene in the middle of the class for students flagged as demonstrating low curiosity or high procrastination.  It is possible that the simple act of intervention may have a significant impact on behavior, especially procrastination.  A clear distinction has been made in the literature between naïve and sophisticated procrastinators (O’Donoghue & Rabin, 1999). Naïve procrastinators do not acknowledge the extent of their procrastination and genuinely believe that they will act within a time span that leaves sufficient time to properly complete the assignment.  Sophisticated procrastinators are much more realistic about their tendency to procrastinate and, though they may not initiate work at the earliest possible moment, they will tend to start working before it is too late to properly execute the task.  Though these characteristics are considered intrinsic to the procrastinator, it may be possible to convert naïve procrastinators into sophisticated ones by intervening in time and explaining to them, in the most objective terms possible, that they are behaving in ways that have proven in the past to generate very poor results for this type of assignment.  It may also be productive to ask them to reflect on occasions in the past when they thought they would have enough time to complete an assignment but failed to do so properly.  In addition, explaining the role of self-sabotage in procrastination may help to mitigate the behavior.  It is quite possible that self-sabotage is largely an unconscious process; one may argue that purposeful self-sabotage would severely restrict the protective effect on self-image that the sabotage would normally afford.

 


Figure 4. Correlation between procrastination and performance

 


Figure 5. Correlation between curiosity and performance

 

Feedback from questionnaire

At the end of the class, a “Questionnaire” module was added and made available to the students.  The questionnaire was composed of two parts: a quantitative and a qualitative section.  The quantitative section was linked to a “Questionnaire Category” module, which permitted students to answer each quantitative item by selecting from a menu ranging from “I AGREE STRONGLY” to “I DISAGREE STRONGLY”.  The students were given permission to view and modify only their own questionnaire record.  The questionnaire focused on satisfaction with turnaround time, SMIRP features, issues with the library and other preferences.  A nearly identical questionnaire was provided in both classes, the exception being a question asking whether students would like to be notified by email when they receive a grade.  In the second class this feature was implemented; consequently, students were instead asked whether they found it useful.

Table 3 shows the results of the quantitative portion of the questionnaire, with a weighted average computed by assigning values of 1 to 5 from “I DISAGREE STRONGLY” to “I AGREE STRONGLY”, respectively.  The results from both classes were generally consistent, with some differences that are detailed below.
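The weighted average is straightforward to reproduce; the sketch below shows the 1-5 mapping applied to a hypothetical set of responses for one questionnaire item:

```python
# Likert labels mapped onto 1-5, as used for the averages in Table 3.
SCALE = {"I DISAGREE STRONGLY": 1, "I DISAGREE": 2, "NEUTRAL": 3,
         "I AGREE": 4, "I AGREE STRONGLY": 5}

def likert_average(responses):
    """Average of Likert labels mapped onto the 1-5 scale."""
    return sum(SCALE[r] for r in responses) / len(responses)

# e.g. five hypothetical students answering one item:
print(likert_average(["I AGREE", "NEUTRAL", "I AGREE STRONGLY",
                      "I AGREE", "NEUTRAL"]))  # 3.8
```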

Concerning the reasonableness of the number of questions, an average value of 2.7 was noted in the first class, indicating an answer between “I DISAGREE” and “NEUTRAL”.  In the second class, instead of making the students answer all of the questions, more questions were given and they needed to answer only half of them.  The satisfaction level subsequently increased to 3.7, corresponding to an average rating between “NEUTRAL” and “I AGREE”.

 

| Question | Class 1 | Class 2 |
|---|---|---|
| The number of questions and requested answer length were reasonable. | 2.7 | 3.7 |
| The Drexel databases are easy to use. | 3.0 | 3.1 |
| I was able to find the information I needed for this class using Drexel resources. | 2.6 | 2.6 |
| I would like to be notified by email when my answer is graded. | 4.2 | 4.0 |
| I find looking at other students’ answers and grades helpful in learning. | 3.0 | 3.3 |
| I would like to use SMIRP again in another class. | 2.8 | 2.8 |
| I am genuinely interested in the topic of this class. | 4.2 | 4.6 |
| It is important to me that my answers are anonymous to the other students. | 3.6 | 3.6 |
| My comments and problems were addressed in a timely fashion. | 3.7 | 3.7 |
| My comments and problems were addressed thoroughly. | 3.5 | 3.8 |
| My answers were graded in a timely fashion. | 3.3 | 2.2 |
| My answers were graded fairly. | 3.3 | 4.0 |
| SMIRP worked well with my preferred browser and operating system. | 3.9 | 3.9 |

Table 3. Summary of the quantitative portion of the questionnaire (value range 1-5)

 

The ease of online library database use and the ability to find information were unchanged or nearly unchanged between the first and second classes, falling at or just above “NEUTRAL”.  Reports from the qualitative section of the questionnaire indicate that this represented a major difficulty in completing the assignment.  Based on these results, more intensive training of the students should be provided before running a similar project.

In the first class the students were asked if they would have liked to be notified with email alerts when their answers were graded.  There was a strong preference for this (average rating 4.2, slightly above “I AGREE”) and, subsequent to its implementation in the second class, students reported that it was a useful feature (average rating 4.0, “I AGREE”).

In both classes students were neutral or slightly above neutral on whether looking at other students’ answers was helpful for learning.  However, in both classes students indicated that the anonymity provided by SMIRP was reasonably important (average rating of 3.6 in both cases, between “NEUTRAL” and “I AGREE”).

In order to ensure that these results were not skewed by the nature of the class content, students were asked how interested they were in the subject matter.  In both classes students reported that they were significantly interested in the material (4.2 and 4.6 respectively).

Satisfaction with how reported problems were addressed, both in timeliness and in quality of response, was also investigated.  For both criteria, in both classes, the responses fell between “NEUTRAL” and “I AGREE”.  Thus an average turnaround time of about 24 hours can be considered adequate for student satisfaction with respect to general or technical queries.  The similar responses in the second class also indicate that using two teaching assistants notified by email alerts, with the quality control system described above, does not significantly reduce student satisfaction compared to queries being addressed primarily by a professor.

Finally, when asked whether they would like to use SMIRP again in another class, students gave the same average of 2.8, slightly below “NEUTRAL”, in both classes.  Given the unusual demands placed on the students to learn a new modality to communicate and complete an assignment, it is encouraging that a strongly negative response was not given.  However, it also indicates that factors beyond email grade alerts and a reduced number of questions from a larger pool need to be addressed to make the experience more positive overall for students.  Difficulties in using the software were not serious, since in both classes students reported an average of 3.9 (just under “I AGREE”) regarding how well SMIRP worked with their browser and operating system.  Indications of what the problematic factors might be come from the written portion of the questionnaire.  A recurrent theme concerned the difficulty of finding suitable peer-reviewed references.  Although this was a frequent complaint, after having gone through the experience several students commented that this was the most useful skill acquired from the assignment.  Comments such as “SMIRP also forced me to learn how to use the electronic resources to their full extent” and “It did force me to learn how to use journals very well” are typical of the positive comments obtained.

 

Conclusion

The primary objective of this study was to evaluate the potential of SMIRP, a web-based collaborative tool, for the rapid design and implementation of a pedagogical construct.  The successful implementation of a question-answer-reference based construct in two classes supports the contention that this objective was met.  The ability to modify the database structure rapidly over a browser interface was exploited by adding a “Questionnaire” module well after the start of the class.  The email alerting component of SMIRP was used to provide feedback to students, professor, teaching assistants and librarian.  Furthermore, it was possible to incorporate a quality control component into the use of the email alerting system for addressing student queries and comments.  By breaking down a complex assignment into several components that could be recorded separately in SMIRP, detailed metrics of the activity of all individuals involved could be obtained.  An analysis of this activity permitted an evaluation of class operations and of correlations between overall student performance and behavioral characteristics such as procrastination and curiosity.  In the particular pedagogical design implemented in this report, a positive correlation was found between curiosity and performance, while procrastination correlated negatively with performance.

 

Acknowledgements

Support from NSF grant CHE-9875855 is gratefully acknowledged.

 

References

Alberti, E. T., & Witryol, S. L. (1994). The relationship between curiosity and cognitive ability in 3rd grade and 5th grade children. Journal of Genetic Psychology, 155 (2), 129-145.

Bradley, J.-C., & Samuel, B. (2000). SMIRP - A Systems Approach to Laboratory Automation. Journal of the Association for Laboratory Automation, 5 (3), 48-53.

Curran, K. (2002). An online collaboration environment. Education and Information Technologies, 7 (1), 41-53.

Ferrari, J. R., & Dovidio, J. F. (2001). Behavioral information search by indecisives. Personality and Individual Differences, 30, 1113-1123.

Ferrari, J. R., Keane, S. M., Wolfe, R. N., & Beck, B. L. (1998). The antecedents and consequences of academic excuse-making: Examining individual differences in procrastination. Research in Higher Education, 39 (2), 199-215.

Ferrari, J. R., & Tice, D. M. (2000). Procrastination as a self-handicap for men and women: A task-avoidance strategy in a laboratory setting. Journal of Research in Personality, 34, 73-83.

Higgins, L. F., Qualls, S. H., & Couger, J. D. (1992). The role of emotions in employee creativity. The Journal of Creative Behavior, 26 (2), 119-129.

Isenhour, P., Carroll, J. M., Neale, D. C., Rosson, M. B., & Dunlap, D. R. (2000). The Virtual School: An integrated collaborative environment for the classroom. Educational Technology & Society, 3 (3), 74-86.

Martinez, M. (2001). Key Design Considerations for Personalized Learning on the Web. Educational Technology & Society, 4 (1), 26-40.

Mills, C. J., & Parker, W. D. (1998). Cognitive psychological profiles of gifted adolescents from Ireland and the US: Cross-societal comparisons. International Journal of Intercultural Relations, 22 (1), 1-16.

Montelpare, W. J., & Williams, A. (2000). Web-based learning: Challenges in using the Internet in the undergraduate curriculum. Education and Information Technologies, 5 (2), 85-101.

Nachmias, R., Mioduser, D., Oren, A., & Ram, J. (2000). Web-Supported Emergent-Collaboration in Higher Education Courses. Educational Technology & Society, 3 (3), 94-104.

Niederhauser, D. S., & Stoddart, T. (2001). Teachers' instructional perspectives and use of educational software. Teaching and Teacher Education, 17 (1), 15-31.

O’Donoghue, T., & Rabin, M. (1999). Doing it now or later. The American Economic Review, 89 (1), 103-124.

O’Donoghue, T., & Rabin, M. (2001). Choice and Procrastination. The Quarterly Journal of Economics, 116 (1), 121-160.

Squires, D., & Preece, J. (1999). Predicting quality in educational software: Evaluating for learning, usability and the synergy between them. Interacting with Computers, 11 (5), 467-483.

Steel, P., Brothen, T., & Wambach, C. (2001). Procrastination and personality, performance, and mood. Personality and Individual Differences, 30 (1), 95-106.

Tice, D. M., & Baumeister, R. F. (1997). Longitudinal study of procrastination, performance, stress, and health: The costs and benefits of dawdling. Psychological Science, 8 (6), 454-458.

Virvou, M., & Moundridou, M. (2000). A Web-Based Authoring Tool for Algebra-Related Intelligent Tutoring Systems. Educational Technology & Society, 3 (2), 61-70.

Wentworth, N., & Witryol, S. L. (1983). Is variety the better part of novelty? The Journal of Genetic Psychology, 142, 3-15.

