SAUM Additional Online Case Studies & Appendices


The Evolution of an Assessment System

 

By Barbara M. Moskal

Mathematical and Computer Sciences Department

Colorado School of Mines

Golden, CO 80401

bmoskal@mines.edu

Abstract

In the late 1990s, the Colorado School of Mines (CSM) began to prepare for a visit by the Accreditation Board for Engineering and Technology (ABET).  Eight of CSM's engineering departments are ABET accredited.  The pending visit stimulated assessment reform across the university.  Both the accredited departments and their support departments needed to have an assessment plan in place by the academic year 2000-2001.  This paper describes the efforts of the Mathematical and Computer Sciences (MCS) Department at CSM in the development of a departmental assessment plan.  Since a natural link exists between the MCS assessment plan and the university assessment plan, some discussion will also be dedicated to a description of the university assessment system.

I. Background

In the late 1990s, CSM began to prepare for a visit by the Accreditation Board for Engineering and Technology (ABET) that was to take place in the academic year 2000-2001.  The eight engineering departments were to be reviewed using ABET's newly revised engineering criteria [1].  A major difference between the old and the new criteria was that the new criteria required that accredited departments directly demonstrate what students know and can do.  All of the engineering departments and the departments that support core engineering courses needed to develop and implement an assessment plan that measured student outcomes.

In response to the approaching visit, the university's assessment committee (which was originally established in the late 1980s) began to meet on a biweekly basis [2].  This committee consisted of faculty representatives from each of the accredited departments and each of the support departments.  The goal of this committee was to establish an internally consistent university assessment system.  This system was to consist of a common set of university goals and objectives, individualized departmental goals and objectives, and plans for assessing both institutional and departmental goals and objectives.  Each of these components was to be constructed in a manner that complemented the other components.

The primary purpose of this paper is to describe the efforts of the Mathematical and Computer Sciences Department (MCS) in the assessment development process.  Throughout this discussion, an effort will be made to link the MCS efforts to the broader university assessment system.  Before beginning this discussion, a brief description of CSM, the MCS department's role within CSM and the student population is provided.

II. University Statistics

CSM is a public research institution in applied science and engineering.  In the academic year 2001-2002, entering freshmen averaged 1230 on the SAT, 27 on the ACT and 3.7 for their high school grade point average.  The student body consists of approximately 2500 undergraduate and 800 graduate students.  Colorado residents comprise approximately 72% of the student population while foreign-born students comprise approximately 12% of the student population.  Females and minorities make up approximately 25% and 12% of the student population, respectively.

Eight academic departments, Chemical Engineering, Engineering, Engineering Physics, Geological Engineering, Geophysical Engineering, Metallurgical and Materials Engineering, Mining Engineering, and Petroleum Engineering, are accredited by the Accreditation Board for Engineering and Technology (ABET).  Approximately 67% of the undergraduate students are completing degrees within these departments.  All undergraduate majors within the school have a minimum of 12 required credit hours in courses that are offered through the Mathematical and Computer Sciences (MCS) department.  Additionally, approximately 12% of the undergraduate students have declared a major in MCS.  In terms of majors, MCS is the second largest department on campus.

III. Assessment Process

According to the Mathematical Association of America Subcommittee on Assessment, assessment is a cycle that consists of the following five phases [3]: 1) articulation of goals and objectives, 2) development of strategies for reaching goals and objectives, 3) selection of instruments to evaluate the attainment of goals and objectives, 4) gathering, analyzing and interpreting data to determine the extent to which goals and objectives have been reached, and 5) using the results of assessment for program improvement.  When the final phase is reached, the assessment cycle begins again.  This conceptualization of the assessment process is consistent with other literature on assessment and is applicable at the classroom, departmental or university level [4, 5, 6].

The majority of departments within CSM, including MCS, used the Olds and Miller Assessment Matrix (available at http://www.mines.edu/fs_home/rlmiller/matrix.htm) in the creation of their departmental assessment plan [7, 8, 9].  The adoption of this common matrix across campus has provided consistency across the departments and programs.  An example of a portion of the MCS department's assessment matrix is displayed in Table 1.

This matrix is consistent with the MAA recommendations and provides a framework for both constructing and documenting the assessment process.  The first step in the process is the establishment of departmental goals and objectives.  The goals are recorded above the matrix, while the objectives comprise the first column of the matrix.  This is followed by the "Performance Criteria," statements of observable student performance by which it can be determined whether a given objective has been reached.  "Implementation Strategy" refers to the student learning activities that support the attainment of a given criterion.  "Evaluation Methods" specifies the measurement instruments used to collect the evidence as to whether the performance criteria have been reached.  "Timeline" indicates when each evaluation method will be implemented, and "Feedback" indicates how the acquired information will be disseminated and used.
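For readers who find a concrete representation helpful, the sketch below (a minimal illustration in Python; it is not part of the department's plan or tooling) models one row of such a matrix as a simple data structure.  The field names mirror the column headings described above, and the sample values are abbreviated from Table 1.

```python
# Illustrative sketch only: one row of an Olds/Miller-style assessment matrix
# represented as a data structure.  Field names follow the column headings;
# the sample values are abbreviated from Table 1.

from dataclasses import dataclass
from typing import List

@dataclass
class MatrixRow:
    objective: str                      # what students should be able to do
    performance_criteria: List[str]     # observable evidence that the objective is met
    implementation_strategy: List[str]  # learning activities that support the criteria
    evaluation_methods: List[str]       # instruments used to collect the evidence
    timeline: List[str]                 # when each evaluation method is administered
    feedback: List[str]                 # how results are disseminated and used

row_O1 = MatrixRow(
    objective="Design and implement solutions to practical problems",
    performance_criteria=["PC1: common calculus exams", "PC4: senior survey item"],
    implementation_strategy=["Core coursework", "Field session"],
    evaluation_methods=["EM1: calculus instructors", "EM4: senior survey"],
    timeline=["TL1: Fall 1997", "TL4: Spring 1999"],
    feedback=["FB1: verbal report to the undergraduate committee each semester"],
)
```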

IV. Mathematical and Computer Sciences Departmental Assessment Plan

This section is divided into several subsections.  The first subsection discusses the importance of having someone, or some group, who is responsible for the development and implementation of the departmental assessment plan, and describes the preferred qualifications of that individual.  The second subsection explains the process that was used in MCS to create a common set of goals and objectives that were consistent with the goals of the broader university and ABET's engineering criteria.  The third subsection describes the original assessment plan, which was implemented in the fall of 1998.  The section concludes with a discussion of how and why the MCS department's assessment plan was revised in the academic year 2001-2002.

Assessment Specialist(s).  Although a departmental assessment plan should result from the collaborative efforts of the entire faculty, someone or some group needs to be responsible for ensuring that the given plan is implemented.  In the MCS department, this person is the departmental assessment specialist.  The MCS departmental assessment specialist is a tenure-track assistant professor who works closely with the department's director of undergraduate studies.  The MCS departmental assessment specialist's background includes a doctorate in Mathematics Education with a minor in Quantitative Research Methodology and a Master's degree in Mathematics.  The Quantitative Research Methodology portion of her degree provides her with the necessary measurement knowledge to develop and implement a departmental assessment plan, while her background in mathematics and mathematics education provides her with the appropriate knowledge to teach courses within the department.


Table 1

Portion of the Mathematical and Computer Sciences Department's Student Assessment Plan

(Complete plan available at: http://www.mines.edu/Academic/assess/Plan.html)

G1: Students will demonstrate technical expertise within mathematics/computer science by:

Objectives

O1: Designing and implementing solutions to practical problems in science and engineering.

Performance Criteria (PC)

PC1: Students in Calculus for Scientists and Engineers (CSE) I, II and III will complete common exams that assess this objective.  All students will pass the calculus sequence prior to graduation.

PC2: Students in Programming Concepts and Data Structures will learn to use computer programs to solve problems.  All majors in MACS will pass these courses prior to graduation.

PC3: All MACS majors will pass field session prior to graduation.  Field session requires that the student apply mathematics/computer science to the solution of original complex problems in the field.

PC4: At least 80% of graduating seniors will agree with the statement, "My MACS degree prepared me well to solve problems that I am likely to encounter at work".

Implementation Strategy

Core Coursework, Major Coursework, Field Session

Evaluation Method (EM)

EM1: PC1 will be evaluated by instructors of the calculus sequence.

EM2: PC2 will be evaluated by instructors of Programming Concepts and Data Structures.

EM3: PC3 will be evaluated by the Field Session instructors.

EM4: PC4 will be evaluated through the senior survey.

Timeline (TL)

TL1: EM1 implemented in F'97.

TL2: EM2 implemented in F'97.

TL3: EM3 implemented in F'97.

TL4: EM4 implemented in S'99.

Feedback (FB)

FB1: Verbal reports will be given to the undergraduate committee and the department head concerning student achievements within the respective courses at the end of each semester.

FB2: Degree audit completed prior to graduation to ensure that all students completed requirements of degree.

FB3: A written summary of the results of the senior survey will be given to the department head at the end of each semester.



Since an individual with similar qualifications is not easily found, this approach may not be possible at other universities.  An alternative approach would be to hire a measurement consultant from a School of Education and have this individual advise the department's assessment specialist.  This external consultant should not be responsible for the development and implementation of the department's assessment plan.  The creation of an appropriate assessment plan requires knowledge of the department, the department's faculty, the department's students and the subject area.  The individual who becomes the departmental assessment specialist should be someone whom the department's faculty trust and believe would make fair and appropriate interpretations of the assessment results.  This will increase the likelihood that the results will be used for curriculum improvement.

Since the "evaluation" component of the assessment process is not always well received, it is also highly desirable that the assessment specialist be a tenured member of the faculty.  Occasionally, the assessment specialist needs to deliver bad news, placing an untenured faculty member in a precarious position [4].  Another problem that a departmental assessment specialist may face is pressure to make an undesirable result look better.  This pressure is less likely to be problem for a tenured faculty member who is not worried about future tenure decisions.  This difficulty is avoided in the MCS department by having the assessment specialist report to the department head and the director of undergraduate studies.  Both of these individuals have tenure and the appropriate influence to ensure that the assessment results are appropriately used. 

Release time should also be considered when appointing a departmental assessment specialist.  As will be illustrated in the sections that follow, the establishment and implementation of a departmental assessment plan is time intensive.  Balancing this process with a full class load and a research agenda is not realistic.

Goals and Objectives.  The first step in the development of any assessment plan is the establishment of goals and objectives.  "Goals" are broad statements of expected student outcomes, and "objectives" divide a goal into observable circumstances that indicate whether the goal has been reached [10].  In the construction of the MCS department's assessment plan, careful consideration was given to the University Mission Statement, the university's goals, and the ABET engineering criteria [7].  Another essential component of establishing an appropriate set of goals and objectives is acquiring input from the faculty.  Faculty will not work to assist students in reaching goals and objectives that they do not believe are important.

Departments may be tempted to use the current curriculum to motivate the development of student goals and objectives.  Although this method will result in the perfect alignment of the goals and objectives with the curriculum, it will also result in a missed opportunity for improving the curriculum.  Clearly defining what students should know and be able to do provides a framework against which the appropriateness of the existing curriculum can be examined.  The process of determining what is important should not be constrained by what currently exists.  If the current curriculum does not support the development of the desired knowledge and skills, recognition of this inconsistency is likely to suggest ways in which the curriculum may be improved.

In order to support the development of a set of goals and objectives and to acquire faculty support for the goals and objectives, each member of the full-time MCS faculty was interviewed in the fall of 1997.  They were asked: 1) What competencies do you think students should have after completing the mathematics core courses?, 2) What competencies do you think students should have after completing their major courses in mathematics?, and 3) What competencies do you think students should have after completing their major courses in computer science?  The reader will notice that each of these questions refers to student competencies rather than student goals and objectives.  This phrasing stimulated the faculty to identify specific knowledge and skills.  Using the specific information that the faculty provided, the departmental assessment specialist created broader statements of goals that captured these competencies.  A feature of this interview was that the faculty were not directed to consider the current curriculum, but rather they were asked to indicate the competencies that students should have upon completion of their course work.

Following this interview, a departmental sub-committee was formed that consisted of the head of the department, a mathematician, a computer scientist, a mathematics education expert and an assessment specialist.  Based on the faculty responses to the interview, the requirements of ABET and the University Mission Statement, the sub-committee drafted four sets of departmental goals and objectives: 1) a general statement for all students of mathematics and computer science, 2) a statement for the core mathematics courses, 3) a statement for the major mathematics courses, and 4) a statement for the computer science major courses.  The full-time faculty approved a version of these goals and objectives later that year.

Development and Implementation.  Once a program has goals and objectives, the next step is the development of a plan that supports the attainment of those goals and objectives.  This plan should clearly address the five phases of the assessment cycle (i.e., articulating goals and objectives, developing strategies for reaching them, selecting instruments to evaluate their attainment, gathering, analyzing and interpreting data to determine the extent to which they have been reached, and using the results of assessment for program improvement).  As was stated previously, the MCS department has used the Olds and Miller Assessment Matrix [9] to develop and document the assessment process.

The original MCS departmental assessment plan (which is not displayed in this document) relied primarily upon three evaluation instruments: the Course Evaluation, the Senior Survey and the Faculty Survey.  The Course Evaluation is a traditional end-of-semester instrument in which students report the extent to which they found classroom instruction effective for learning [11].  The Senior Survey asks graduating seniors to reflect on their undergraduate experiences and report the extent to which they felt that these experiences had helped them in reaching the departmental goals.  The Faculty Survey was designed to elicit information from instructors concerning the classroom opportunities that they provided to support students' attainment of the departmental goals.  An assumption that underlay the original plan was that the faculty were already providing the appropriate opportunities to students and that the students were already achieving the departmental goals.  As will be discussed in the section that follows, this was not necessarily the case.

The performance criteria specified in the original plan consisted of cut-off values indicating the percentage of individuals who needed to respond to a given question in a particular manner for the objective to be considered reached.  For example, one of the department's goals is that all students will develop effective oral communication skills.  A performance criterion for this goal was that 65% of the mathematics instructors would indicate that they had evaluated this outcome in their classrooms and had judged that more than 75% of their students were proficient with respect to it.
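To make the cut-off logic concrete, the following minimal sketch (in Python, with hypothetical names and made-up survey data; it is not part of the department's plan) shows how such a criterion might be checked from instructor responses.

```python
# Hypothetical sketch only: checking the oral-communication performance
# criterion described above from instructor survey responses.

from dataclasses import dataclass
from typing import List

@dataclass
class InstructorReport:
    evaluated_objective: bool    # did the instructor assess oral communication?
    fraction_proficient: float   # fraction of that instructor's students judged proficient

def criterion_met(reports: List[InstructorReport],
                  instructor_cutoff: float = 0.65,
                  student_cutoff: float = 0.75) -> bool:
    """True if at least 65% of instructors evaluated the objective and judged
    more than 75% of their students proficient (the cut-offs in the example)."""
    if not reports:
        return False
    satisfied = sum(1 for r in reports
                    if r.evaluated_objective and r.fraction_proficient > student_cutoff)
    return satisfied / len(reports) >= instructor_cutoff

# Made-up responses from four instructors: only two satisfy the criterion,
# so 50% < 65% and the criterion is judged not met.
sample = [InstructorReport(True, 0.80), InstructorReport(True, 0.90),
          InstructorReport(False, 0.00), InstructorReport(True, 0.70)]
print(criterion_met(sample))  # False
```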

The original assessment plan was introduced in 1997 and was designed to be implemented over a five-year period.  The introduction of new instruments was spaced throughout this period.  Some of the instruments were not administered every semester.  For example, the Faculty Survey was scheduled to be administered every three semesters.  Since CSM is on a two-semester academic calendar, the three-semester schedule ensured that data was regularly collected during both the spring and the fall semesters.  Spacing the introduction and use of new techniques allowed the department to focus upon the implementation of specific methodology within a given year or semester.  Had the department introduced all of the new measurements in the first year, the department would have been overwhelmed and the assessment plan would have failed.

A primary concern that has been expressed by the MAA [3] and others [4, 5] with regard to assessment is ensuring that the information acquired through assessment is used for program improvement.  At the end of each semester, the department's assessment specialist reviews the collected assessment information and submits a written report to the department head.  This report contains recommendations on how the department may improve its programs and its assessment system. 

To assist the MCS department in documenting how the information that is acquired through assessment is used, a feedback matrix was developed [2].  This matrix indicates the semester in which the information was collected, the source of the information, the concern that this information raised, the department's response to that concern and the efforts to follow up on whether the response was successful in addressing the concern.  A portion of how the Mathematical and Computer Sciences Department has used assessment information for improvement purposes is shown in Table 2.  The complete feedback matrix can be found at: http://www.mines.edu/Academic/assess/Feedback.html.

Revision.  Periodically, the department's assessment plan should be reviewed to ensure that it continues to be consistent with departmental needs.  In the academic year 2001-2002, the department's Undergraduate Curriculum Committee reviewed the department's assessment plan.  Based on this review, several changes have been made.

Although the committee felt that the current list of departmental goals and objectives continued to capture the desired student outcomes, questions were raised with respect to the phrasing of the goals and objectives.  The original list consisted of short phrases that implied desired student outcomes.  This list was revised to directly indicate what students needed to demonstrate in order to suggest that a given goal had been reached.  Both the original and the revised list of general student goals and objectives are displayed in Table 3. 

After reviewing the revised list of goals and objectives, a member of the undergraduate curriculum committee raised a critical question: "What is the faculty's responsibility in assisting students in reaching these goals and objectives?"  This question stimulated a discussion that led to the re-examination of the assumption that the MCS courses were already addressing the department's goals and objectives.  Results of the senior survey and the faculty survey indicated that this assumption was not correct.  In response to these concerns, two decisions were made.  First, a list of faculty goals and objectives was developed to parallel the student goals and objectives.  This list, which indicates the faculty's responsibility in assisting students in reaching the student goals and objectives, is displayed in Table 4.  The department's assessment plan was also updated to include the evaluation of the faculty goals and objectives.

The second decision was to specify the courses in which specific goals and objectives would be assessed.  This process was greatly simplified by the existence of "coordinated courses" within the department.  Coordinated courses have multiple sections, which are taught by different instructors.  A lead faculty member coordinates these sections and holds regular meetings at which instructors have the opportunity to share instructional strategies and to create common assignments and/or exams.  Under the revised assessment plan, the lead faculty member has the additional responsibility of ensuring that the designated program objectives are assessed through common assignments and/or exams.


Table 2

A Portion of the MCS Feedback Matrix

(Complete matrix available at: http://www.mines.edu/Academic/assess/Feedback.html)

Semester: Spring '00
Source: Course evaluations
Concern: A set of open-ended questions was added to the course evaluations in 1997.  The average faculty rating on each question in 1997 was compared to the average faculty rating in 2000.  The faculty ratings have increased since the changes were implemented.
Response: The current faculty evaluation system will be maintained.
Follow-up: Open-ended questions continue to be used as part of the faculty evaluation.

Semester: Spring '00
Source: Senior Survey
Concern: The senior survey indicated that many of the graduating seniors felt that they had inadequate skills in written communication.
Response: The Acting Department Head and the Coordinator for the Probability and Statistics course attended a summer workshop on how to introduce writing into their courses.
Follow-up: Writing assignments have been added to the Probability and Statistics course.

Semester: Fall '00
Source: Course evaluations
Concern: Questions were raised as to the extent to which faculty use the information that is provided by students in response to the faculty evaluations.
Response: A set of questions was developed that asks faculty to examine the student evaluations and to write a response as to how they would use the information to improve the course.
Follow-up: Faculty continue to respond in writing to the student evaluations each semester.  The department head reviews the responses.

Semester: Spring '01
Source: Course evaluations; feedback from the Engineering Division
Concern: Questions were raised about the appropriateness of the content of the Probability and Statistics for Engineers course.
Response: A new book was selected that better meets the needs of engineers.  Additionally, new labs were created with the same purpose in mind.
Follow-up: Internal funds were sought and acquired to support the improvement of this course in an appropriate manner.  As part of this effort, a survey has been developed and is administered each year concerning students' experiences in the course.

Semester: Fall '01
Source: General review of the assessment system
Concern: A general review of our assessment system indicated that a number of our instruments and methods are out of date.
Response: An effort was begun to revise the goals, objectives and overall system in a manner that is appropriate to our current needs.
Follow-up: The revised goals, objectives and overall system were reviewed and approved by the Undergraduate Committee in Spring '02.

 


Table 3

General Statement of Student Goals and Objectives.

Original Statement

G1: Develop technical expertise within mathematics/computer science

O1: Design and implement solutions to practical problems in science and engineering

O2: Use appropriate technology as a tool to solve problems in mathematics/computer science

O3: Create efficient algorithms and well structured programs

G2: Develop breadth and depth of knowledge within mathematics/computer science

O4: Extend course material to solve original problems

O5: Apply knowledge of mathematics/computer science

O6: Identify, formulate and solve mathematics/computer science problems

O7: Analyze and interpret data

G3: Develop an understanding and appreciation for the relationship of mathematics/computer science to other fields

O8: Apply mathematics/computer science to solve problems in other fields

O9: Work cooperatively in multi-disciplinary teams

O10: Choose appropriate technology to solve problems in other disciplines

G4: Communicate mathematics/computer science effectively

O11: Communicate orally

O12: Communicate in writing

O13: Work cooperatively in teams

O14: Create well documented programs

O15: Understand and interpret written material in mathematics/computer science

Revised Statement

G1: Students will demonstrate technical expertise within mathematics/computer science by:

O1: Designing and implementing solutions to practical problems in science and engineering,

O2: Using appropriate technology as a tool to solve problems in mathematics/computer science, and

O3: Creating efficient algorithms and well structured computer programs.

G2: Students will demonstrate a breadth and depth of knowledge within mathematics/computer science by:

O4: Extending course material to solve original problems,

O5: Applying knowledge of mathematics/computer science to the solution of problems,

O6: Identifying, formulating and solving mathematics/computer science problems, and

O7: Analyzing and interpreting statistical data.

G3: Students will demonstrate an understanding and appreciation for the relationship of mathematics/computer science to other fields by:

O8: Applying mathematics/computer science to solve problems in other fields,

O9: Working in cooperative multi-disciplinary teams, and

O10: Choosing appropriate technology to solve problems in other disciplines.

G4: Students will demonstrate an ability to communicate mathematics/computer science effectively by:

O11: Giving oral presentations,

O12: Completing written explanations,

O13: Interacting effectively in cooperative teams,

O14: Creating well documented programs, and

O15: Understanding and interpreting written material in mathematics/computer science.

G: Goals, O: Objective


Table 4

Faculty Goals and Objectives

G1: Faculty will demonstrate technical expertise within mathematics/computer science by:

O1: Providing clear, technical explanations of mathematics/computer science concepts to students,

O2: Using appropriate technology as a tool to illustrate to students how to solve mathematics/computer science problems, and

O3: Providing examples of how mathematics/computer science can be applied to the solution of problems in other fields.

G2: Faculty will support the students' attainment of the goals and objectives outlined above by providing the students the opportunity to:

O4: Solve original problems, some of which are drawn from other fields,

O5: Use technology as a tool in solution of mathematics/computer science problems, 

O6:  Design algorithms and structured programs,

O7: Identify, formulate and solve  mathematics/computer science problems,

O8: Interact in cooperative teams,

O9: Give oral presentations,

O10: Communicate in writing, and

O11: Interpret written material in mathematics/computer science.

G3: Faculty will evaluate the students' attainment of the goals and objectives outlined above by creating assessments that evaluate students' ability to:

O12: Solve original problems, some of which are drawn from other fields,  

O13: Use technology as a tool in solution of mathematics/computer science problems,

O14: Design algorithms and well structured programs,

O15: Identify, formulate and solve  mathematics/computer science problems,

O16: Interact in cooperative teams,

O17: Give oral presentations,

O18: Communicate in writing, and

O19: Interpret written material in mathematics/computer science.

G: Goals, O: Objectives


The final concern that was raised with respect to the current assessment plan was the overuse of self-report surveys.  Although self-report is an easy way to acquire information, it is not necessarily the most reliable method of data collection.  The decision was made to shift the focus to measurement techniques that directly assess student performances.  This resulted in the current student assessment plan, which depends upon assessments that are completed in coordinated courses.  Self-report instruments were not completely eliminated from the revised plan.  Students are still asked to complete course evaluations and the senior survey.  This information is used to supplement the direct measurement of the student outcomes.  The faculty survey has been replaced by a direct review of classroom assessment materials.  Given that this is the first year of implementation for the revised plan, the new methodology has not yet been fully tested.

V. Timeline

Developing, implementing and revising an assessment system requires a great deal of time.  Table 5 displays a timeline that summarizes the major activities of the MCS department within a given semester.

 

VI. Summary

As the previous discussion illustrates, a great deal of time and effort has been dedicated to the development and implementation of the MCS departmental assessment plan.  Revision of this plan is ongoing.  As was discussed earlier, continual improvement of an assessment plan is a natural part of the assessment process.  The following list summarizes the main points that were presented in this paper:

·         Someone within the department needs to take primary responsibility for the implementation of the departmental assessment plan.  Preferably, this individual has tenure and has a background in Educational Measurement.  Since this person will be dedicating time to the development and implementation of the plan, a course release is essential. 

·         Departmental goals and objectives should reflect the collective beliefs of the departmental faculty concerning what students should know and be able to do.  Additionally, there needs to be some consistency among the resultant departmental goals and objectives and the University Mission Statement, the University Goals, and the requirements of accrediting agencies.

·         New methods of assessment should be introduced within the department gradually.  Quality assessment requires time, and a rapid introduction of new tools could result in an overload that eventually leads to the breakdown of the system.


Table 5

Timeline Describing the Development, Implementation and Revision of the Assessment Process

Date

Activity

Fall '97

·   University Assessment Committee begins to meet on a biweekly basis.  Workshops on assessment practices are offered to departments across the school.

·   Members of the MCS faculty are interviewed concerning student goals and objectives. 

·   Department Assessment Committee is formed to create an initial set of goals and objectives based on the interviews, the University Mission Statement, the University goals, and ABET's engineering criteria.

Spring '98

·   Workshops on assessment practices continue to be offered by the University Assessment Committee.

·   Initial set of goals and objectives are approved by MCS faculty. 

·   Assessment Specialist begins to draft the departmental assessment plan.

Summer '98

·   On-going efforts in the development of the departmental assessment plan.

Fall '98

·   Faculty reviews and approves a preliminary departmental assessment plan. 

·   Implementation process begins. 

·   Faculty Survey is pilot tested.

·   Senior Survey is pilot tested.

·   Report concerning the progress of the departmental assessment plan is submitted to department head by the assessment specialist.

Spring '99

·   Revisions are made to the departmental assessment plan.

·   Paperwork involved in the implementation of the departmental assessment plan is becoming overwhelming. The decision is made to make many of the survey instruments electronic.

Summer '99

·   Assessment report based on results from prior semester is written and submitted to Department Head.

·   The Senior Survey and the Faculty Survey are redesigned to be administered via the web.

·   A departmental assessment website is established.

Fall '99

·   Updated version of assessment plan is approved by faculty.

·   New versions of the on-line surveys are tested.  Revisions are made.

·   Continued efforts to collect and analyze data.

·   Assessment report submitted to the department head.

Spring '00

·   Continued efforts to collect and analyze data.

·   Preparation begins for ABET visit in the fall.

Summer '00

·   Assessment report submitted to the department head.

·   Continued efforts to prepare for ABET visit.

Fall '00

·   Continued efforts to collect and analyze data.

·   ABET visit takes place.

·   Assessment report submitted to the department head.

Spring '01

·   Continued efforts to collect and analyze data.

Summer '01

·   Assessment report submitted to the department head.

Fall '01

·   Continued efforts to collect and analyze data.

·   Review of Departmental Assessment Program begins.  Recommendations are made for changing the current system. 

·   Assessment report submitted to the department head.

Spring '02

·   Continued efforts to collect and analyze data.

·   New goals and objectives and a new assessment plan are written.  Major changes include the introduction of faculty goals and objectives and a reduced dependency on self-report instruments.

·   Revised plan is approved by the faculty. 

 


·         Periodically, a department's assessment plan should be reviewed to ensure that it continues to be consistent with departmental needs.  This type of review is likely to result in the improvement of the assessment plan.  Information that has been gathered through the assessment process should be used to inform this review.

 

Many of the techniques that have been presented here can be easily adapted to the needs of other departments and disciplines.  As was discussed earlier, the Olds and Miller assessment matrix is already used across the CSM campus [2].  Further information concerning the MCS department's assessment efforts can be found at the department's assessment webpage: http://www.mines.edu/Academic/assess/.  Additional information concerning the broader CSM effort with regard to assessment can be found at: http://www.mines.edu/Academic/assess/Resource.htm.

VII. Acknowledgement

The plan discussed here reflects the collaborative effort of the Mathematical and Computer Sciences Department (MCS) at the Colorado School of Mines (CSM).  The author gives special recognition to Dr. Graeme Fairweather, MCS Department Head, and Dr. Barbara Bath, Director of Undergraduate Studies in MCS.  Additionally, special thanks go to Dr. Barbara Olds and Dr. Ron Miller, who designed the assessment matrix that was used to organize the presented assessment plan.


VIII. References

1.       Engineering Criteria, Accreditation Board for Engineering and Technology, 6th ed., 2000, available at http://www.abet.org/accreditation/accreditation.htm.

2.       Moskal, B., Olds, B. & Miller, R.L., "Scholarship in a University Assessment System," Academic Exchange Quarterly, 6 (1), 2002, pp. 32-37.

3.       Assessment of Student Learning for Improving the Undergraduate Major in Mathematics, Mathematical Association of America, Subcommittee on Assessment, Committee on Undergraduate Program Mathematics, 1995.

4.       Steen, L., "Assessing Assessment," in Gold, B., Keith, S.Z., and Marion, W., eds., Assessment Practices in Undergraduate Mathematics, 1999, pp. 1-6.

5.       Assessment Standards for School Mathematics, National Council of Teachers of Mathematics (NCTM), Reston, Virginia, 1995.

6.       Moskal, B. "An Assessment Model for the Mathematics Classroom," Mathematics Teaching in the Middle School, 6 (3), 2000, pp. 192-194.

7.       Moskal, B. M. & Bath, B.B., "Developing a Departmental Assessment Plan: Issues and Concerns," The Department Chair: A Newsletter for Academic Administrators, 11 (1), 2000, pp. 23-25.

8.       Olds, B. M. & Miller, R., “Assessing a Course or Project,” How Do You Measure Success?  (Designing effective processes for assessing engineering education), American Society for Engineering Education, 1996, pp. 135-44.

9.       Olds, B. M. & Miller, R., "An Assessment Matrix for Evaluating Engineering Programs," Journal of Engineering Education, 87 (2), 1998, pp. 173- 178.

10.   Rogers, G. & Sando, J., Stepping Ahead: An Assessment Plan Development Guide, Rose-Hulman Institute of Technology, Terre Haute, Indiana, 1996. 

11.   Moskal, B., "Student Voices: Improving the Quality of Course Evaluations," Academic Exchange Quarterly, 5 (1), 2001, pp. 72-78.


Barbara M. Moskal

Dr. Barbara M. Moskal received her Ed.D. in Mathematics Education with a minor in Quantitative Research Methodology and her M.A. in Mathematics from the University of Pittsburgh.  Currently, she is an Assistant Professor in the Mathematical and Computer Sciences Department at the Colorado School of Mines.  Her research interests include student assessment, K-12 outreach and equity issues.  She has taught Engineering Calculus I, II and III and Probability and Statistics for Engineers.  In 2000, she received one of the New Faculty Fellowships at the Frontiers in Education Conference.

            Address: Mathematical and Computer Sciences Department, Colorado School of Mines, Golden, CO, 80401; telephone: 303-273-3867; fax: 303-273-3875; e-mail: bmoskal@mines.edu.