Educational Technology & Society 5 (3) 2002
ISSN 1436-4522

A rocky journey toward effective assessment of visualization modules for learning enhancement in Engineering Mechanics

Dan Jensen, Brian Self and Don Rhymer
US Air Force Academy
Dept. of Engineering Mechanics
2354 Fairchild Dr., Suite 6H2
USAF Academy, CO 80840 USA
Dan.Jensen@usafa.af.mil

John Wood
Colorado State University
Dept. of Mechanical Engineering
A103N – Engineering Building
Ft Collins, CO 80523 USA

Marty Bowe
Director of Information Technology
Canton Ohio Schools
Canton, OH 44704 USA

 

ABSTRACT

Since the Fall of 1998, an assessment study has been in progress to determine the effectiveness of multimedia instructional modules used in a basic engineering class at the United States Air Force Academy.  This paper discusses the progression and results of this assessment study.  Until recently, there has been a lack of instructional material designed to enhance understanding of basic engineering mechanics through the use of visualization.  Therefore, visualization content in this area, as well as quantitative assessment establishing its effectiveness, has been needed.  The visualization content used in this study consists of web-based and PowerPoint presentations designed to enhance understanding of abstract concepts in the course.  Various assessment techniques have been used to evaluate the visual content’s effectiveness.  The 1998 version of the study attempted to correlate too many variables, resulting predominantly in statistically insignificant data.  Both the visualization modules and the assessment plan were refined based on what was learned in the 1998 study.  In 1999, three professors at the USAF Academy ran simultaneous studies using the refined version of the visualization modules.  The assessment from this study produced two interesting results: 1) the attitude of the professor presenting the visualization module can have a significant impact on the students’ reception of the content and 2) students actually disliked the use of the visualization modules.  Based on additional 1999 assessment data, the hypothesis was formulated that the students’ negative perception was based on three things: 1) they were not aware that this visual content would help them prepare for upcoming exams, 2) the parts of the visual content that gave an overview of an advanced engineering analysis technique called finite elements were intimidating and 3) they appeared to be influenced by one professor’s negative perception of the visualization modules.  In order to test this hypothesis, in the Fall of 2000 the visual content was reused, but the link between this content and the conceptual questions on the exam was emphasized.  In addition, the non-essential content related to finite element analysis was removed, and the professor who had a negative perception of the modules chose not to participate in the study.  The latest assessment results indicate that the students’ perception of the material improved significantly in response to these changes.  In addition, the Fall 2000 assessment shows that the visual modules did enhance understanding when compared to a traditional lecture format.  Overall, this three-year assessment project should provide others developing visualization and similar content with important information about the assessment process.

Keywords: Engineering mechanics, assessment, multimedia instruction modules, visualization


Introduction

The Fundamentals of Mechanics course (Fall semester 1998, 1999, 2000) at the USAF Academy was used as a testing ground for assessing the effectiveness of the visual learning aids. The course combines two basic topics in engineering mechanics (statics and strength of materials) at an introductory level and is mandatory for all students at the USAF Academy regardless of major (this will turn out to be significant when interpreting the assessment results).  Typically, the concepts of stress in objects caused by torsion, bending, and combined loading are difficult for students to grasp. For these topics, “visualization modules” were developed to bring an enhanced learning experience into the classroom. 

The initial study [Borchert 99], completed in Fall 1998, attempted to correlate the effects of these visualization modules with a student’s learning preference or personality type.  Learning preferences were determined from an assessment method known as VARK, while the personality type designation was obtained using the Myers-Briggs Type Indicator (MBTI).  The attempt to correlate too many variables produced statistically insignificant results for this initial experiment.

The follow-on work [Bowe 2000], completed in Fall 1999, expanded the sample size to over 65% of the course’s students (about 500 students) and focused solely on the effect of the multimedia visualization modules. As a first assessment technique, student response to each lesson was collected throughout the semester via quick “30-second surveys”. As a second assessment tool, “quick quizzes” were administered immediately before and after the enhanced learning modules to measure short-term conceptual learning.  Additionally, as a third technique, the results of selected midterm exam questions were used to evaluate the longer-term effectiveness of the enhanced learning modules. These assessments produced two main results: 1) students picked up on one professor’s negative perception of the modules, and 2) students indicated that they disliked the use of these visualization tools.

The most recent study (Fall 2000) was designed to eliminate students’ negative perception of the multimedia visualization modules and to further isolate the modules’ pedagogical effect.  To do so, follow-on research was conducted using the same process, visually reinforcing the same engineering concepts but altering the visualization modules and the assessment plan.  It was hypothesized that the students’ negative response to the multimedia presentations was due to three main factors: 1) the students were not aware that the concepts presented were testable, 2) the visualizations involved too much detail on an advanced engineering analysis technique called the finite element method (FEM) and 3) the students mimicked the negative perception of one professor. Therefore, the Fall 2000 work reflects data resulting from three changes to the Fall 1999 experiment: 1) the professor who had a negative perception of the visualization modules chose not to participate in the Fall 2000 study, 2) students were clearly informed that these concepts would be covered on the next exam and 3) the extraneous finite element analysis details were removed.  Although these changes may appear minor, such subtleties are shown below to have a substantial effect on both student perceptions of the visualization modules and on measured learning.   The results and findings of this research are discussed below.

 

Visualization modules for enhanced learning

Background

There is an increasing emphasis being placed on quality instruction in engineering education. This is exemplified by the emphasis given to quality of teaching in promotion decisions [Boyer 95], by the expanding number of institutions focusing on curriculum development [Incropera 96], by the significant number of publications in this area [Abbanat 94, Brereton 93, Catalano 96, Cooper 96, Crismond 92, Harris 95, Jensen 94, 98 (1&2), 99, 00, Kriz 94, Martin 94, Meyer 94, Oloufa 94, Reamon 97, Regan 96, Rhymer 01, Sheppard 95, Shakerin 01, Tan 95, Wallace 97, 98, Wood 00], by the commitment of the engineering accreditation agency ABET to assessment [ABET 00], and by the continuing funding emphasis of the National Science Foundation and other agencies.  Much of this effort to enhance engineering education is focused in the following areas: learning styles, multimedia visualization/simulation, hands-on experiences, use of real-world problems, and assessment techniques. These components form the foundation for the present work in developing visualization modules to enhance course content.

 

Visualization background information

A wide variety of efforts to use computer-based visualization to enhance education have been reported in the literature.  There are a large number of web sites maintained by universities that contain multimedia features, from simple electronic syllabi to interactive simulation [see URL/CD references at the end of the reference section]. Many book companies have formed multimedia divisions, and a number of smaller multimedia production companies are producing CD-ROMs intended to provide visualization enhancement to technical learning.  In addition, many examples of stand-alone software for specific courses have been reported in the literature [see URL/CD references at the end of the reference section].

Results reported from the use of these tools have been mixed. Of the cases inspected by the authors (approximately fifty cases), about half of the researchers reported that the tools did not significantly increase student performance on tests [Reamon 97, Regan 96], while half did report enhancement of students’ performance [Catalano 96, Meyer 94, Wallace 97]. In the cases where student performance did increase, some common components were found in the multimedia tools; they include: 1) the use of specific learning objectives to guide development of the software; 2) the use of student feedback to create updated software versions; 3) the use of open ended problems; 4) the fact that software needed to be interactive and of high quality; and 5) the fact that hands-on exercises often supplemented the material [Catalano 96, Regan 96, Wallace 97].  In addition, some give suggestions on how to restructure the course content if World Wide Web-based tools are used [Wallace 98].

From an assessment standpoint, the preponderance of what is reported in the engineering education literature is, unfortunately, somewhat lacking in detail.  For example, despite the numerous publications in this area, there appear to be no studies of visualization used to enhance engineering mechanics of materials courses that draw on a large, statistically significant data set to evaluate effectiveness.  The literature cited above refers to assessment strategies which are almost entirely qualitative or have very small sample sizes, and which lack the necessary control groups to isolate the effect on learning derived from the introduction of multimedia.  In addition, while occasionally an effort has been made to reduce the noise and secondary effects a priori, more often in the engineering education literature this appears to be accomplished the hard way: by trial and error.  To our detriment, this has sometimes been the case in the present study, as is documented in this paper.

 

Background: The U.S. Air Force Academy

The Fundamentals of Mechanics course that serves as the testing ground is a mandatory class at the USAF Academy for all cadets, regardless of major.  It is part of a significant group of core classes that the Air Force mandates all USAF Academy graduates pass in an effort to provide a well-rounded, balanced academic exposure.  This means the majority of cadets taking the course are not mechanical engineering majors, or even in a technical major at all.  Therefore, from the cadets’ perspective, the class and the mechanics taught are not viewed as critical to their degrees and/or to their careers and are possibly not even interesting to many of the cadets.  What results, then, is an attitude that can be summed up as  “we are studying simply to survive academically”.  The fact that the students in this study are USAF Academy cadets appears to have an effect on the results.  It is therefore important to note the differences between the cadets’ learning experience and that of other students in order to provide the correct framework for interpreting the results.

The result of such a mentality is that anything that is testable is crucial to cadets while anything that does not have the potential to be on an exam is viewed as extraneous.  Therefore, emphasizing the testability of the concepts in the visualization modules, while sounding minor, may significantly impact student interest in the visual content.

From a faculty standpoint, quality teaching is the first priority at the USAF Academy, with research occupying a significantly reduced role.  Consequently, cadets are constantly being “courted” by their instructors to be involved and interested in the subject matter, particularly in a core class, and they are continually being exposed to PowerPoint presentations and computer-based multimedia.  This routine exposure contributed significantly to the negative perception of the modules held by one of the professors involved in the 1999 study.

 

Module Descriptions and Use

The visualization modules each focus on one or, at most, two fundamental concepts.  The modules highlight conceptual material in the following three areas: 1) torsion; 2) bending; and 3) combined loading.  Real-world examples were used as the context for visualizing the mechanics behind these key areas.  The examples included automobile drive shafts, aircraft wings, and human knee joints (see Figure 1). Visualization content for each module consisted of FEM-based color stress plots illustrating the key concepts chosen for that module (see Figure 2). When a module was used in class, a discussion was held to introduce it and to describe how it fit the current topic; for example, why the drive shaft is subjected to torsion.  In the Fall 1999 study, no explicit mention was made of the fact that this content would be included on the next exam.  In the Fall 2000 study the testability was explicitly stated.

 



Figure 1: F-16 wing model as part of the visualization module



Figure 2: Computer model of the F-16 wing


Assessment

Assessment Strategy Introduction

Throughout the study (i.e., from Fall 1998 through the present), three different assessment techniques have been used to determine the effectiveness of the modules: 1) 30-second surveys taken after each lecture; 2) quick quizzes taken before and after the modules; and 3) specific exam questions designed to measure students’ understanding of the concepts covered in the modules.  The use of three different assessment tools accomplishes two things.  First, the variety of tools reduces the “noise” in the results simply by creating redundant measures.  Second, the different tools allow measurement of different components of effectiveness.  Table 1 shows the different aspects measured by the different assessment tools.

 

ASSESSMENT TOOL | WHAT THE TOOL MEASURES

30-Second Surveys
  1. Did students find the lectures which used modules more interesting than the lectures with no modules?
  2. Did students indicate that the lectures with modules were better learning experiences than the lectures without modules?
  3. Did students find the content explained by modules easier to apply than content with no module?
  4. Were the students more motivated to explore topics further if the topic was presented with a module?

Quick Quizzes
  1. Which type of content helped the students answer a conceptual question the most: a visualization module or a classic lecture style with traditional example problems?
  2. Does having different professors potentially affect the results?

Exam Questions
  Did the modules help the students answer exam questions in the same content area as the module?

Table 1: Use of the assessment tools

 

Neither the use of multiple assessment instruments nor the specific instruments shown above are unique contributions to the assessment literature.  The reason for documenting the specifics of the assessment strategy is to provide context for the various attempts to gain insight into the true potential of the visualization modules.

 

The 30-Second Surveys

The 30-second survey instrument

The 30-Second Survey currently being used has been iteratively developed over the last seven semesters.  The original survey, used for a previous study [Jensen 98(2)], asked only for MBTI type and overall lecture rating (recall previous studies had been done to correlate effectiveness with a student’s personality type designated by MBTI).  In order to gain additional insight into the effectiveness of the modules, the surveys have been refined to obtain information about the students’ perception of interest, learning, applicability, and motivation for future exploration.  In addition, MBTI types have still been recorded for possible future study.  This survey was given after each lecture and took about 30 seconds for students to complete. Figure 3 shows the content and form.

 

30-Second Survey                                                              EM120 - FALL 1999

Lesson #: _____                    

MBTI Type: _______           

Please rate the following statements on a scale from

1 to 10  (1 - very untrue; 10 - very true):

___ 1. Today’s class kept me interested.

___ 2. Today’s class was a good learning experience.

___ 3. This class prepared me well to apply today’s concepts to problems.

___ 4. This class motivated me to further explore today’s concepts.

Figure 3: 30-Second survey form

 

30-second survey assessment results

In order to measure the effect of the module-based content in a generic manner, the data was reduced as follows.  Average values (and standard deviations) were obtained for each question on the survey for every lecture.  The results for the four questions were then averaged to produce an “overall student perception” for each lecture.  The data is plotted for the Fall 1999 and the Fall 2000 studies in Figures 4 and 5, respectively.  It is clear from a visual inspection of Figures 4 and 5 that the perception of the multimedia lectures was much closer to the mean in 2000 than in 1999.
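To make the reduction concrete, the sketch below shows one way survey responses could be averaged into a per-lecture “overall student perception”; the data and variable names are hypothetical and are not drawn from the study.

```python
# A minimal sketch (hypothetical data) of the survey reduction described above:
# per-question means are computed for each lecture, then averaged into a single
# "overall student perception" value per lecture.
from statistics import mean, stdev

# responses[lesson] holds one (Q1, Q2, Q3, Q4) rating tuple per returned survey
responses = {
    12: [(8, 8, 7, 6), (9, 8, 8, 7), (7, 7, 7, 5)],   # lecture-only lesson
    13: [(6, 7, 6, 5), (7, 7, 7, 6), (5, 6, 6, 4)],   # multimedia lesson
}

for lesson, surveys in sorted(responses.items()):
    by_question = list(zip(*surveys))                  # group ratings by question
    question_means = [mean(q) for q in by_question]
    question_stdevs = [stdev(q) for q in by_question]  # per-question spread
    overall = mean(question_means)                     # "overall student perception"
    spread = mean(question_stdevs)
    means_text = ", ".join(f"{m:.2f}" for m in question_means)
    print(f"Lesson {lesson}: question means [{means_text}], overall {overall:.2f} (avg std dev {spread:.2f})")
```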

 

Figure 4: Fall 1999 30-Second survey results (overall student perception) for each lecture

 

Based on these Fall 1999 results (Figure 4), the students were asked for more feedback on the modules to pinpoint the reasons for the more negative responses.  This feedback centered on three major problem areas with the multimedia presentation: 1) the students were not as attentive to the material presented because it was not clear that the concepts were going to be tested, 2) some of the advanced analysis and theory (based on FEM) proved to confuse the students and 3) one of the three professors’ negative perception of the modules affected student perception. As a result of these findings, these problems were addressed in the Fall 2000 study.

Specifically, in the Fall 2000 study students were clearly told, before the visualization modules were presented, that the concepts taught were relevant to the forthcoming exam. The testing would be in the form of multiple-choice questions designed to evaluate students’ conceptual understanding.  As mentioned above, such an emphasis can have an impact on student response and involvement, especially in a USAF Academy core course.  Second, the mathematical and mechanical background to FEM (the advanced analysis technique) was removed from the visualization modules to place more emphasis on the fundamental mechanics concepts.  FEM-developed stress plots were still used to illustrate the mechanics concepts, but without the background and theory that had been labeled by the cadets as counterproductive.  Finally, the professor who had a negative perception of the visualization modules chose not to participate in the Fall 2000 study.  Results of the Fall 2000 study, which reflect the changes just noted, are shown below in Figure 5.

 


Figure 5: Fall 2000 30-Second survey results for each lecture.


Means and standard deviations were then isolated for the lectures containing the multimedia-based enhancement modules.  Next, overall averages were found for the lecture-only lessons and for the multimedia lessons.  Tables 2 and 3 show (for the Fall 1999 and 2000 semesters, respectively) the overall averages for a normal lecture-style lesson compared to those of the multimedia lessons, as well as the number of data points used in the tabulation. The average drop in “satisfaction” for the multimedia lessons is between 0.50 and 0.69 standard deviations for the Fall 1999 study (Table 2), compared to a drop of only 0.19 to 0.39 standard deviations for the Fall 2000 results (Table 3).
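As an illustration of how the last two columns of Tables 2 and 3 can be derived, the sketch below applies the percent-change and standard-deviation-change formulas to the Q1 entries of Table 2; the standard deviation used is an assumed value, since the paper reports only the resulting ratio.

```python
# Illustration of the "% Change" and "# of Standard Deviations Change" columns.
# The lecture standard deviation below is an assumed value, not reported in the paper.
def compare(lecture_mean, multimedia_mean, lecture_std):
    pct_change = (multimedia_mean - lecture_mean) / lecture_mean * 100
    std_change = (multimedia_mean - lecture_mean) / lecture_std
    return pct_change, std_change

# Q1, Fall 1999 (Table 2): normal-lecture mean 7.91, multimedia mean 6.67.
# An assumed standard deviation of about 1.9 roughly reproduces the reported -0.64.
pct, std = compare(7.91, 6.67, 1.9)
print(f"{pct:.1f}% change, {std:.2f} standard deviations")   # about -15.7% and -0.65
```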

 

Survey Question | Normal Lecture (1446 Data Points Used) | Multimedia Lecture (173 Data Points Used) | % Change | # of Standard Deviations Change
Q1: Lecture was interesting? | 7.91 | 6.67 | -15.6% | -0.64
Q2: Lecture helped me learn? | 8.04 | 6.78 | -15.6% | -0.69
Q3: Lecture helped me to apply material? | 7.80 | 6.62 | -15.2% | -0.62
Q4: Lecture motivated me to explore subject further? | 6.97 | 5.68 | -18.5% | -0.50

Table 2: Fall 1999 Means for the 30-second survey.

 

Survey Question | Normal Lecture (564 Data Points Used) | Multimedia Lecture (93 Data Points Used) | % Change | # of Standard Deviations Change
Q1: Lecture was interesting? | 8.11 | 7.38 | -8.9% | -0.39
Q2: Lecture helped me learn? | 8.12 | 7.68 | -5.5% | -0.25
Q3: Lecture helped me to apply material? | 8.15 | 7.68 | -5.8% | -0.27
Q4: Lecture motivated me to explore subject further? | 7.57 | 7.18 | -5.1% | -0.19

Table 3: Fall 2000 Means for the 30-second survey.

 

As evidenced in the tables, although students’ perceptions of the modules rose significantly between 1999 and 2000, they still remained slightly below the mean even in the 2000 study.  A qualitative student assessment was conducted to pinpoint the elements of the multimedia that the students still did not like.  It appears that the primary reason for the remaining negative impression of the modules was that the FEM-based stress plots took significant time and effort to comprehend.  Virtually none of the students had been exposed to FEM, so the multi-colored stress distributions needed significant instructor explanation before the concept was understood.  While the FEM theory and methodology portions had been removed, the students still viewed each module negatively when they saw colors distributed along an object.  So while the students did not despise the modules, they definitely did not prefer them over standard instruction.  Possibly, if students had known that the modules appear to improve exam performance (as shown in Table 6), the difficulty in understanding the stress distributions would have seemed insignificant.

 

The Quick Quizzes

The quick quiz instrument

Immediately before and after the enhanced learning modules were presented, a quick quiz was administered to measure the short-term increase in understanding resulting from the module.  The quizzes focused on conceptual understanding of the material and did not require any significant calculations.  The quick quizzes were also administered during the same lesson before and after a classic lecture-style class (during which the visualization module was NOT used); these lecture-only sections form the control group.  A student could receive a 0, 1, or 2 for a grade on the quiz (2 being the best). The results were normalized to indicate the average score (percentage) achieved with and without the multimedia. The results are tabulated below in Tables 4 and 5 to summarize the quick quiz assessment for Fall 1999 and 2000.  The tables include the number of data points so that statistical significance can be judged.
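The sketch below illustrates this normalization with hypothetical quiz scores: 0-2 scores are converted to percentages of the maximum, and the before/after change is reported for each group.

```python
# Minimal sketch of the quick-quiz reduction (hypothetical data and group names):
# each quiz score is 0, 1, or 2; scores are normalized to a percentage of the
# maximum, and improvement is the after-average minus the before-average.
from statistics import mean

MAX_SCORE = 2

def normalized_average(scores):
    """Average quiz score expressed as a percentage of the maximum score."""
    return mean(scores) / MAX_SCORE * 100

groups = {
    "saw module": {"before": [1, 0, 2, 1, 1], "after": [2, 1, 2, 2, 1]},
    "control":    {"before": [1, 1, 0, 1, 2], "after": [1, 2, 1, 1, 2]},
}

for name, quizzes in groups.items():
    before = normalized_average(quizzes["before"])
    after = normalized_average(quizzes["after"])
    print(f"{name}: {before:.0f}% -> {after:.0f}% (improvement {after - before:.0f} points)")
```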

 

Quick quiz assessment results

Figure 6 gives insight into the issue of the professor in the 1999 study who had a negative perception of the modules.  The difference in professors’ attitudes appears to have greatly affected the “success” of the multimedia presentation.  The figure shows average quiz score improvements during the Fall semester of 1999.  The hollow symbols represent average scores with multimedia, the solid symbols without multimedia.  Each type of symbol represents a different instructor – a circle for Instructor A, a triangle for Instructor B, and a square for Instructor C.  Note that Instructor B did not conduct the Bending quick quiz, while Instructors A and C did not run the Combined Loading control group (i.e., all their groups were given the multimedia presentation). The horizontal axis delineates the three different quick quizzes, while the vertical axis quantifies the difference between the students’ scores after and before their “treatment”.  The two different “treatments” are the multimedia (mm) and a standard lecture (no-mm).

 


Figure 6: Fall 99 Comparison of results from different professors


In examining these results, it is interesting to note that Instructors A and C both saw better quiz score improvement when using the multimedia presentations.  Both of these instructors supported the visual presentations and thought that they would add to the interest level of the students.  In fact, they thought that the Combined Loading biomechanics example was so motivating that they did not want to run the control group without multimedia.  This enthusiasm for the visual material appears to have positively affected the students’ learning.

This can be contrasted to the quiz scores for Instructor B.  Note that the score improvements for the Torsion and Combined Loading modules were noticeably lower when Instructor B presented the visual-multimedia material.  This instructor was not a strong proponent of the modules, and often complained about the overuse of technology.  While there may have been some positive bias towards the modules for Instructors A and C, there was a negative bias for Instructor B.

Clearly, this type of information must be considered when evaluating any new teaching tool.  Even well-constructed, interesting learning modules will fail if they do not fit well with the teaching methods of the instructor.  If the professor has a negative perception of the learning enhancement tool, the students will likely perceive this.  Similarly, if an instructor shows great enthusiasm for a new tool, this may positively bias the learning of the students.  Therefore, these visualization modules should be tested with as many professors as possible to determine their effectiveness (a strategy which we are in the process of implementing), and quick quiz scores must be analyzed along with subjective surveys and correlated exam results to fully evaluate new teaching tools.

 

 

Group | Number of Data Points | Average Quiz Score Before | Average Quiz Score After | % Improvement
Students who saw the module | 152 | 0.89 | 1.16 | 31%
Students who did NOT see the module | 118 | 0.85 | 1.10 | 30%

Table 4: Fall 1999 Quick quiz results.

 

Module Subject | Group | Number of Data Points | Average Quiz Score Before | Average Quiz Score After | % Improvement
Torsion | Students who saw the module | 15 | 80% | 100% | 20%
Torsion | Students who did NOT see the module | 21 | 62% | 71% | 9%
Bending | Students who saw the module | 24 | 27% | 69% | 42%
Bending | Students who did NOT see the module | 15 | 43% | 76% | 33%
Combined Loading | Students who saw the module | 14 | 35% | 93% | 58%
Combined Loading | Students who did NOT see the module | 14 | 21% | 75% | 54%

Table 5: Fall 2000 Quick quiz results.

 

The data for 1999 (as shown in Table 4) is inconclusive in terms of showing any positive effect from the visualization modules.  The Fall 2000 data shows with reasonable significance that the multimedia did increase conceptual understanding over instruction without multimedia.

 

Results of Exam Question

In Fall 2000 an exam question was used to further evaluate the effectiveness of the modules.  This was done in an attempt to get a longer-term assessment of the visualization modules.  As can be seen in Table 6, the percentage of students who correctly answered the exam question was significantly greater (45%) for those who viewed the module than for those who did not (28%).

 

 

Group | Number of Data Points (Students) | % of Students Correctly Answering the Exam Problem
Students Receiving the Module | 40 | 45%
Students NOT Receiving the Module | 635 | 28%
% Difference |  | 23%

Table 6: Fall 2000 Final exam results according to content.
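As a supplementary check that is not part of the original analysis, the counts reported in Table 6 can be fed into a standard two-proportion z-test; the sketch below does so, reconstructing approximate correct-answer counts from the rounded percentages.

```python
# Hedged sanity check of the Table 6 comparison: a simple two-proportion z-test
# using the reported sample sizes and (rounded) percentages of correct answers.
import math

n_module, p_module = 40, 0.45        # students who saw the module
n_control, p_control = 635, 0.28     # students who did not

x_module = round(p_module * n_module)      # ~18 correct answers
x_control = round(p_control * n_control)   # ~178 correct answers

p_pool = (x_module + x_control) / (n_module + n_control)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_module + 1 / n_control))
z = (p_module - p_control) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")   # roughly z = 2.3, p = 0.02
```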

 

Conclusion, wider applicability and future work

Two categories of conclusions can be drawn from this work.  First, conclusions regarding the assessment plan and its implementation can be made.  Second, specific conclusions regarding the effectiveness of the visualization modules can be stated.

Regarding the assessment plan and its implementation, it is clear in retrospect that some critical details were overlooked in the 1998 and 1999 phases of assessment.  Although extensive background work was done to investigate what other engineering educators had learned regarding the assessment of multimedia, implementing their “lessons learned” was not sufficient to avoid significant problems.  Specifically, the 1998 study attempted to encompass too many variables with too small a sample size.  Two critical errors in the 1999 study were the failure to consider the attitudes of the professors involved and the failure to go beyond the professors’ course objectives and consider the students’ objectives as well.  That realization would have provided the insight needed to make a firm connection between the content and the exam (a lesson learned only in retrospect).

In terms of the conclusions related to the visual multimedia itself, three primary conclusions can be drawn.  First, the results of this study indicate that students’ perception of the 2000 version of the visual, multimedia-driven lectures was significantly enhanced over previous versions by: 1) emphasizing that the concepts will be tested on exams, 2) minimizing extraneous FEM theory included in the modules and 3) ensuring that the professors believe that the visual modules will be helpful. Second, the 2000 study showed that the visualization modules improved students’ conceptual understanding compared to a traditional lecture format.  This result was validated through the use of quick quizzes given before and after the visual modules were presented or before and after the traditional lecture.   Third, longer-term retention of the conceptual material was also enhanced through the use of the modules as compared to traditional lectures.  This was substantiated with performance results on a specific exam question.

Had these results been known a priori, a significant amount of work could have been saved; those hours could have been put into further development of the multimedia or more advanced analysis of the assessment results.  If others who are developing and assessing multimedia for education are made aware of these results, they should be able to work far more efficiently than we have.

This project continues to evolve at USAF Academy and has expanded to a number of other universities.  We are in the process of developing more interactive versions of the visualization modules. These will eventually become commercially available for use in mechanics of materials courses.

 

Acknowledgements

The authors wish to acknowledge the support of the MSC Corporation which has funded much of the module development. Also, support is acknowledged from the Institute for Information and Technology Applications (IITA) at the USAF Academy.   In addition, we acknowledge the support of the Department of Engineering Mechanics at the USAF Academy as well as the financial support of the Dean’s Assessment Funding Program.

 

References

  • Abbanat, R., Gramoll, K., & Craig, J. (1994). Use of Multimedia Development Software for Engineering Courseware. Proceeding of the ASEE Annual Conference, 1217-1222.
  • ABET (2000). ABET accreditation document, http://www.abet.org/eac/eac2000.htm.
  • Borchert, R., Jensen, D., & Yates, D. (1999). Development and Assessment of Hands-on and Visualization Modules for Enhancement of Learning in Mechanics. Paper presented at the ASEE Annual Conference, June, Charlotte, NC.
  • Bowe, M., Jensen, D., Feland, J., & Self, B. (2000). When Multimedia Doesn’t Work:  An Assessment of Visualization Modules for Learning Enhancement in Mechanics. Paper presented at the ASEE Annual Conference, June, St. Louis, MO.
  • Boyer, E. L. (1995). Assessing Scholarship. ASEE Prism, 4 (7), 22-26.
  • Brereton, M. F., Greeno, J., Lewis, J., Linde, C., & Leifer, L. (1993). An Exploration of Engineering Learning. Paper presented at the ASME Design Theory and Methodology Conference, September, Albuquerque, NM.
  • Catalano, G. D., & Tonso, K. L. (1996). The Sunrayce ’95 Idea: Adding Hands-on Design to an Engineering Curriculum. Journal of Engineering Education, 85 (3), 193-199.
  • Cooper, S. C., & Miller, G. R. (1996). A Suite of Computer-Based Tools for Teaching Mechanics of Materials. Computer Applications in Engineering Education, 4 (1), 41-49.
  • Crismond, D., & Wilson, D. G. (1992). Design and Evaluation of Multimedia Program: Assess MIT’s EDICS Program. Proceeding of the ASEE Frontiers in Education Conference, 656-661.
  • Harris, T. A., & Jacobs, H. R. (1995). On Effective Methods to Teach Mechanical Design. Journal of Engineering Education, 84 (4), 343-349.
  • Incropera, F. P., & Fox, R. W. (1996). Revising a Mechanical Engineering Curriculum: The Implementation Process. Journal of Engineering Education, 85 (3), 233-238.
  • Jensen, D. D. (1994). Using MSC-PATRAN for Pre and Post Processing for Specialized FEM Codes which are not in the Standard MSC-PATRAN Library. Paper presented at the MSC World Conference, June, New Port Beach, CA.
  • Jensen, D. D., & Pramono, E. (1998). A Method for Teaching Finite Elements Which Combines the Advantages of Commercial Pre- and Post-Processing with Student Written Software. Computer Applications in Engineering Education, 6 (2), 105-114.
  • Jensen, D. D., Murphy, M. D., & Wood, K. L. (1998). Evaluation and Refinement of a Restructured Introduction to Engineering Design Course Using Student Surveys and MBTI Data. Paper presented at the ASEE Annual Conference, June, Seattle, WA.
  • Jensen, D., & Borchert, R. (1999). MSC-Patran Used to Improve Education by Providing Visualization of Stress Concepts. MSC World, February.
  • Jensen, D. D., & Bowe, M. J. (1999). Hands-on Experiences to Enhance Learning of Design: Effectiveness in a Redesign Context When Correlated with MBTI and VARK Types. Paper presented at the ASEE Annual Conference, Charlotte, NC.
  • Jensen, D., & Wood, K. (2000). Incorporating Learning Styles to Enhance Mechanical Engineering Curricula by Restructuring Courses, Increasing Hands-on Activities, & Improving Team Dynamics. Presented at the ASME Annual Conference, November, Orlando, FL.
  • Kriz, R. (1994). Data Visualization and its role in Multimedia-Based Design Education. Paper presented at the ASME Design Theory and Methodology Conference, Minneapolis, MN.
  • Martin, P. T. (1994). An Overview of Multimedia for the Teaching of Engineering Education. Proceeding of the ASEE Annual Conference, 988-991.
  • Meyer, D. G., & Krzyzkowski, R. A. (1994). Experience Using the Video Jockey System for the Instructional Multimedia Delivery. Proceeding of the ASEE Frontiers in Education Conference, 262-266.
  • Oloufa, A.A. (1994). Bringing the Real World to the Classroom with Multimedia. Proceeding of the ASEE Annual Conference, 2742-2745.
  • Reamon, D., & Sheppard, S. (1997). The Role of Simulation Software in an Ideal Learning Environment. Paper presented at the ASME Design Engineering Technical Conferences, September 14-17, Sacramento, CA.
  • Regan, M., & Sheppard, S. (1996). Interactive Multimedia Courseware and the Hands-on Learning Experience: An Assessment. Journal of Engineering Education, 85 (2), 123-131.
  • Rhymer, D., & Jensen, D. (2001). An Assessment of Visualization Modules for Learning Enhancement in Mechanics. Paper presented at the ASEE Annual Conference, June, Albuquerque NM.
  • Shakerin, S., & Jensen, D. (2001). Enhancement of Mechanics Education by Means of Photoelasticity and the Finite Element Method. International Journal of Mechanical Engineering Education, 29 (4).
  • Sheppard, S., & Regan, D. (1995). Bicycle Multimedia Courseware: Formative In-depth assessment Report, Center for Design Research Internal Report, Stanford University.
  • Tan, F. L., & Fok, S. C. (1995). Development of Engineering Courseware for University Undergraduate Teaching Using Computer Animation. Computer Applications in Engineering Education, 3 (2), 121-126.
  • Wallace, D. R., & Mutooni, P. (1997). A Comparative Evaluation of World Wide Web-Based and Classroom Teaching. Journal of Engineering Education, 86 (3), 211-219.
  • Wallace, D. R., & Weiner, S. T. (1998). How Might Classroom Time Be Used Given WWW-Based Lectures. Journal of Engineering Education, 87 (3), 237-248.
  • Wood, K., Jensen, D., Bezdek, J., & Otto, K. (2001). Reverse Engineering and Redesign: Courses to Incrementally and Systematically Teach Design. Journal of Engineering Education, 90 (3).

 

Samples of URLs and CDs for University and other Multimedia Projects

  • ndsu (North Dakota State Univ.), The WWW Instructional Project, http://www.ndsu.nodak.edu/~wwwinstr/home.html
  • RPI (Rensselaer Polytechnic Institute), The Rensselaer Studio Courses, http://ciue.rpi.edu/studio/studio.htm
  • MSU (Mississippi State Univ.) Aerospace Structural Analysis, http://www.ae.msstate.edu/~masoud/Teaching/SA2/Course.html
  • Swafford, M., Brown, D., (The Univ of Illinois), The Mallard Project, http://www.cen.uiuc.edu/Mallard
  • MIT (Massachusetts Institute of Technology), Mechanical Engineering Hypermedia Project, http://hyperweb.mit.edu:800/curhyp.html
  • UT (Univ of Texas, Austin), The World Lecture Hall, http://www.utexas.edu/world/lecture
  • UCB (University of California at Berkeley), Integrating Calculus, Chemistry, Physics and Engineering Education through Technology Enhanced Visualization, Simulation and Design Cases and Outcomes Assessment, http://hart.berkeley.edu/~aagogino/GE.fund/GE.final.html#section6
  • The MacNeal Schwendler Corp, Exploring MSC/Patran, part # P3V7.5 ZZZ SM-Pat301-CD, Created by Engineering Multimedia Inc., MSC Corp., 2975 Red Hill Ave., Costa Mesa, CA 92626, 1997.
  • Sheppard, S.D., Regan, M., Tan, S., “Drill Stack and Bike Dissection CD-ROM version 4.3.1,” http://www.needs.org
  • Yu, D., Agogino, A.M., “Virtual Disk Drive Design Studio CD-ROM version 1.1,” http://www.needs.org
  • Gramoll, K., Charlton, J., Raharja, K., Weaver, M., Tenisci, J., Verigan, C., “Mars Navigator CD version 1.0.1,” http://www.needs.org
  • Crown, Stephen W., “Engineering Graphics”, 1999 Premier Award Winner, John Wiley & AutoDesk Sponsored, http://www.needs.org
  • Polaha, Megann, Ingraffea, A.R., “Cracking Dams”, 1999 Premier Award Winner, John Wiley & AutoDesk Sponsored, http://www.needs.org
  • Philpot, Timothy, “MDSolids”, 1998 Premier Award Winner, John Wiley & AutoDesk Sponsored, http://www.needs.org
  • Raju, P.K., Sankar, C.S., “Della Steam Plant”, 1998 Premier Award Winner, John Wiley & AutoDesk Sponsored, http://www.needs.org
  • Hanry, Robert, “Seve-unh”, 1998 Premier Award Winner, John Wiley & AutoDesk Sponsored, http://www.needs.org
  • Hibbler, R.C., Schiavone, P., Guarino, J., “Statics Study Pack”, CD included with Engineering Mechanics, Statics, John Wiley & Sons, 2001.
  • Miller, G.R., Cooper, S.C., “Visual Mechanics, Beams & Stress States”, PWS Publishing Company, 1998.
  • Craig, R. R., Philpot, T., “MDSolids CD to accompany Mechanics of Materials”, John Wiley and Sons, 2000.



Copyright message

Copyright by the International Forum of Educational Technology & Society (IFETS). The authors and the forum jointly retain the copyright of the articles. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear the full citation on the first page. Copyrights for components of this work owned by others than IFETS must be honoured. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from the authors of the articles you wish to copy or kinshuk@massey.ac.nz.