The Evolution of an Assessment System

By Barbara M. Moskal
Mathematical and Computer Sciences Department
Colorado School of Mines
Golden, CO 80401
bmoskal@mines.edu

Abstract
In the late 1990s, the Colorado School of Mines (CSM) began to prepare for a visit by the Accreditation Board for Engineering and Technology (ABET). Eight of CSM's engineering departments are ABET accredited. The pending visit stimulated assessment reform across the university: both the accredited departments and their support departments needed to have an assessment plan in place by the academic year 2000-2001. This paper describes the efforts of the Mathematical and Computer Sciences (MCS) Department at CSM in the development of a departmental assessment plan. Since a natural link exists between the MCS assessment plan and the university assessment plan, some discussion will also be dedicated to a description of the university assessment system.

I. Background
In the late 1990s, CSM began to prepare for a visit by the Accreditation Board for Engineering and Technology (ABET) that was to take place in the academic year 2000-2001. The eight engineering departments were to be reviewed using ABET's newly revised engineering criteria [1]. A major difference between the old and the new criteria was that the new criteria required accredited departments to demonstrate directly what students know and can do. All of the engineering departments, and the departments that support core engineering courses, needed to develop and implement an assessment plan that measured student outcomes. In response to the approaching visit, the university's assessment committee (originally established in the late 1980s) began to meet on a biweekly basis [2]. This committee consisted of faculty representatives from each of the accredited departments and each of the support departments. The goal of this committee was to establish an internally consistent university assessment system. This system was to consist of a common set of university goals and objectives, individualized departmental goals and objectives, and plans for assessing both institutional and departmental goals and objectives. Each of these components was to be constructed in a manner that complemented the other components. The primary purpose of this paper is to describe the efforts of the Mathematical and Computer Sciences (MCS) Department in the assessment development process. Throughout this discussion, an effort will be made to link the MCS efforts to the broader university assessment system. Before beginning this discussion, a brief description of CSM, the MCS department's role within CSM, and the student population is provided.

II. University Statistics
CSM is a public research institution in applied science and engineering. In the academic year 2001-2002, entering freshmen averaged 1230 on the SAT, 27 on the ACT and a 3.7 high school grade point average. The student body consists of approximately 2500 undergraduate and 800 graduate students. Colorado residents comprise approximately 72% of the student population, while foreign-born students comprise approximately 12%. Females and minorities make up approximately 25% and 12% of the student population, respectively.
Eight academic departments, Chemical Engineering, Engineering, Engineering Physics, Geological Engineering, Geophysical Engineering, Metallurgical and Materials Engineering, Mining Engineering, and Petroleum Engineering, are accredited by the Accreditation Board for Engineering and Technology (ABET). Approximately 67% of the undergraduate students are completing degrees within these departments. All undergraduate majors within the school have a minimum of 12 required credit hours in courses that are offered through the Mathematical and Computer Sciences (MCS) department. Additionally, approximately 12% of the undergraduate students have declared a major in MCS. In terms of majors, MCS is the second largest department on campus.

III. Assessment Process
According to the Mathematical Association of America Subcommittee on Assessment, assessment is a cycle that consists of the following five phases [3]: 1) articulating goals and objectives, 2) developing strategies for reaching the goals and objectives, 3) selecting instruments to evaluate the attainment of the goals and objectives, 4) gathering, analyzing and interpreting data to determine the extent to which the goals and objectives have been reached, and 5) using the results of assessment for program improvement. When the final phase is reached, the assessment cycle begins again. This conceptualization of the assessment process is consistent with other literature on assessment and is applicable at the classroom, departmental or university level [4, 5, 6].

The majority of departments within CSM, including MCS, used the Olds and Miller Assessment Matrix (available at http://www.mines.edu/fs_home/rlmiller/matrix.htm) in the creation of their departmental assessment plans [7, 8, 9]. The adoption of this common matrix across campus has provided consistency across the departments and programs. An example of a portion of the MCS department's assessment matrix is displayed in Table 1. This matrix is consistent with the MAA recommendations and provides a framework for both constructing and documenting the assessment process. The first step in the process is the establishment of departmental goals and objectives. The goals are recorded above the matrix, while the objectives comprise the first column of the matrix. This is followed by the "Performance Criteria," statements of observable student performances by which it can be determined whether a given objective has been reached. "Implementation Strategy" refers to the student learning activities that support the attainment of the given criteria. "Evaluation Methods" specifies the measurement instruments used to collect evidence as to whether the performance criteria have been reached. "Timeline" indicates when each evaluation method will be implemented, and "Feedback" indicates how the acquired information will be disseminated and used.

IV. Mathematical and Computer Sciences Departmental Assessment Plan
This section is divided into several subsections. The first subsection discusses the importance of having someone or some group that is responsible for the development and implementation of the departmental assessment plan, and describes the preferred qualifications of that individual. The second subsection explains the process that was used in MCS to create a common set of goals and objectives that were consistent with the goals of the broader university and ABET's engineering criteria.
The third subsection describes the original assessment plan that was implemented in the fall of 1998. The section concludes with a discussion of how and why the MCS department's assessment plan was revised in the academic year 2001-2002.

Assessment Specialist(s). Although a departmental assessment plan should result from the collaborative efforts of the entire faculty, someone or some group needs to be responsible for ensuring that the plan is implemented. In the MCS department, this person is the departmental assessment specialist. The MCS departmental assessment specialist is a tenure-track assistant professor who works closely with the department's director of undergraduate studies. Her background includes a doctorate in Mathematics Education with a minor in Quantitative Research Methodology and a Master's degree in Mathematics. The Quantitative Research Methodology portion of her degree provides her with the necessary measurement knowledge to develop and implement a departmental assessment plan, while her background in mathematics and mathematics education provides her with the appropriate knowledge to teach courses within the department.

Table 1 Portion of the Mathematical and Computer Sciences Department's Student Assessment Plan (complete plan available at: http://www.mines.edu/Academic/assess/Plan.html)
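To make the structure of such a matrix concrete, the sketch below shows one way a single row might be represented. The field names mirror the column headings described in Section III; the sample entries are purely illustrative assumptions and are not drawn from the department's actual plan.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MatrixRow:
    """One row of an Olds/Miller-style assessment matrix (illustrative fields only)."""
    objective: str                 # departmental objective being assessed
    performance_criteria: str      # observable evidence that the objective has been reached
    implementation_strategy: str   # learning activities that support the criteria
    evaluation_methods: List[str] = field(default_factory=list)  # measurement instruments used
    timeline: str = ""             # when each evaluation method is administered
    feedback: str = ""             # how the results are disseminated and used

# Hypothetical example row -- not an entry from the MCS department's actual plan.
example_row = MatrixRow(
    objective="Students develop effective oral communication skills",
    performance_criteria="Most students earn a satisfactory rating on an oral presentation",
    implementation_strategy="In-class presentations in coordinated courses",
    evaluation_methods=["Common presentation assignment", "Senior Survey item"],
    timeline="Every fall semester",
    feedback="Summarized in the assessment specialist's end-of-semester report",
)
```

Representing each row as a structured record of this kind makes it easy to confirm that every objective has at least one evaluation method, a timeline, and a feedback path attached to it.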
Since an individual with similar qualifications is not easily found, this approach may not be possible at other universities. An alternative approach would be to hire a measurement consultant from a School of Education and have this individual advise the department's assessment specialist. This external consultant should not be responsible for the development and implementation of the department's assessment plan. The creation of an appropriate assessment plan requires knowledge of the department, the department's faculty, the department's students and the subject area. The individual who becomes the departmental assessment specialist should be someone whom the department's faculty trusts to make fair and appropriate interpretations of the assessment results. This will increase the likelihood that the results will be used for curriculum improvement. Since the "evaluation" component of the assessment process is not always well received, it is also highly desirable that the assessment specialist be a tenured member of the faculty. Occasionally, the assessment specialist needs to deliver bad news, which places an untenured faculty member in a precarious position [4]. Another problem that a departmental assessment specialist may face is pressure to make an undesirable result look better. This pressure is less likely to be a problem for a tenured faculty member who is not worried about future tenure decisions. This difficulty is avoided in the MCS department by having the assessment specialist report to the department head and the director of undergraduate studies. Both of these individuals have tenure and the appropriate influence to ensure that the assessment results are appropriately used. Release time should also be considered when appointing a departmental assessment specialist. As will be illustrated in the sections that follow, the establishment and implementation of a departmental assessment plan is time intensive. Balancing this process with a full class load and a research agenda is not realistic.

Goals and Objectives. The first step in the development of any assessment plan is the establishment of goals and objectives. "Goals" are broad statements of expected student outcomes, and "objectives" divide a goal into circumstances that suggest whether the goal has been reached [10]. In the construction of the MCS department's assessment plan, careful consideration was given to the University Mission Statement, the university's goals, and the ABET engineering criteria [7]. Another essential component of establishing an appropriate set of goals and objectives is acquiring input from the faculty. Faculty will not work to assist students in reaching goals and objectives that they do not believe are important. Departments may be tempted to use the current curriculum to motivate the development of student goals and objectives. Although this method will result in the perfect alignment of the goals and objectives with the curriculum, it will also result in a missed opportunity for improving the curriculum. Clearly defining what students should know and be able to do provides a framework upon which the appropriateness of the existing curriculum can be examined. The process of determining what is important should not be constrained by what currently exists.
If the current curriculum does not support the development of the desired knowledge and skills, recognition of this inconsistency is likely to suggest ways in which the curriculum may be improved.

In order to support the development of a set of goals and objectives, and to acquire faculty support for them, each member of the full-time MCS faculty was interviewed in the fall of 1997. They were asked: 1) What competencies do you think students should have after completing the mathematics core courses? 2) What competencies do you think students should have after completing their major courses in mathematics? 3) What competencies do you think students should have after completing their major courses in computer science? The reader will notice that each of these questions refers to student competencies rather than student goals and objectives. This phrasing stimulated the faculty to identify specific knowledge and skills. Using the specific information that the faculty provided, the departmental assessment specialist created broader statements of goals that captured these competencies. A feature of this interview was that the faculty were not directed to consider the current curriculum; rather, they were asked to indicate the competencies that students should have upon completion of their course work. Following the interviews, a departmental sub-committee was formed that consisted of the head of the department, a mathematician, a computer scientist, a mathematics education expert and an assessment specialist. Based on the faculty responses to the interview, the requirements of ABET and the University Mission Statement, the sub-committee drafted four sets of departmental goals and objectives: 1) a general statement for all students of mathematics and computer science, 2) a statement for the core mathematics courses, 3) a statement for the major mathematics courses, and 4) a statement for the computer science major courses. The full-time faculty approved a version of these goals and objectives later that year.

Development and Implementation. Once a program has goals and objectives, the next step is the development of a plan that supports the attainment of those goals and objectives. This plan should clearly address the five phases of the assessment cycle (i.e., articulating goals and objectives, developing strategies for reaching them, selecting instruments to evaluate their attainment, gathering, analyzing and interpreting data to determine the extent to which they have been reached, and using the results of assessment for program improvement). As was stated previously, the MCS department has used the Olds and Miller Assessment Matrix [9] to develop and document the assessment process. The original MCS departmental assessment plan (which is not displayed in this document) relied primarily upon three evaluation instruments: the Course Evaluation, the Senior Survey and the Faculty Survey. The Course Evaluation is a traditional end-of-semester instrument in which students report the extent to which they found classroom instruction effective for learning [11]. The Senior Survey asks graduating seniors to reflect on their undergraduate experiences and report the extent to which they felt that these experiences had helped them in reaching the departmental goals.
The Faculty Survey was designed to elicit information from instructors concerning the classroom opportunities that they provided students in support of the departmental goals. An assumption that underlay the original plan was that the faculty were already providing the appropriate opportunities to students and that the students were already achieving the departmental goals. As will be discussed in the section that follows, this was not necessarily the case. The performance criteria specified in the original plan consisted of cut-off values that indicated the percentage of individuals who needed to respond to a given question in a particular manner in order to indicate that the objective had been reached. For example, one of the department's goals is that all students will develop effective oral communication skills. A performance criterion for this objective was that 65% of the mathematics instructors would indicate that they had evaluated this objective in their classroom and had judged that more than 75% of their students were proficient with respect to this objective.

The original assessment plan was introduced in 1997 and was designed to be implemented over a five-year period. The introduction of new instruments was spaced throughout this period, and some of the instruments were not administered every semester. For example, the Faculty Survey was scheduled to be administered every three semesters. Since CSM is on a two-semester academic calendar, the three-semester schedule ensured that data were regularly collected during both the spring and the fall semesters. Spacing the introduction and use of new techniques allowed the department to focus upon the implementation of specific methodology within a given year or semester. Had the department introduced all of the new measurements in the first year, the department would have been overwhelmed and the assessment plan would have failed.

A primary concern that has been expressed by the MAA [3] and others [4, 5] with regard to assessment is ensuring that the information acquired through assessment is used for program improvement. At the end of each semester, the department's assessment specialist reviews the collected assessment information and submits a written report to the department head. This report contains recommendations on how the department may improve its programs and its assessment system. To assist the MCS department in documenting how the information acquired through assessment is used, a feedback matrix was developed [2]. This matrix indicates the semester in which the information was collected, the source of the information, the concern that the information raised, the department's response to that concern, and the efforts to follow up on whether the response was successful in addressing the concern. A portion of how the Mathematical and Computer Sciences Department has used assessment information for improvement purposes is shown in Table 2. The complete feedback matrix can be found at: http://www.mines.edu/Academic/assess/Feedback.html.

Revision. Periodically, the department's assessment plan should be reviewed to ensure that it continues to be consistent with departmental needs. In the academic year 2001-2002, the department's Undergraduate Curriculum Committee reviewed the department's assessment plan. Based on this review, several changes have been made.
Although the committee felt that the current list of departmental goals and objectives continued to capture the desired student outcomes, questions were raised with respect to the phrasing of the goals and objectives. The original list consisted of short phrases that implied desired student outcomes. This list was revised to indicate directly what students needed to demonstrate in order to suggest that a given goal had been reached. Both the original and the revised list of general student goals and objectives are displayed in Table 3.

After reviewing the revised list of goals and objectives, a member of the Undergraduate Curriculum Committee raised a critical question: "What is the faculty's responsibility in assisting students in reaching these goals and objectives?" This question stimulated a discussion that led to the re-examination of the assumption that the MCS courses were already addressing the department's goals and objectives. Results of the Senior Survey and the Faculty Survey indicated that this assumption was not correct. In response to these concerns, two decisions were made. First, a list of faculty goals and objectives was developed to parallel the student goals and objectives. This list, which indicates the faculty's responsibility in assisting students in reaching the student goals and objectives, is displayed in Table 4. The department's assessment plan was also updated to include the evaluation of the faculty goals and objectives. The second decision was to specify the courses in which specific goals and objectives would be assessed. This process was greatly simplified by the existence of "coordinated courses" within the department. Coordinated courses have multiple sections, which are taught by different instructors. A lead faculty member coordinates these sections and holds regular meetings at which instructors have the opportunity to share instructional strategies and to create common assignments and/or exams. Under the revised assessment plan, the lead faculty member has the additional responsibility of ensuring that the designated program objectives are assessed through common assignments and/or exams.

Table 2 A Portion of the MCS Feedback Matrix (complete matrix available at: http://www.mines.edu/Academic/assess/Feedback.html)
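As an illustration of the kind of record a feedback matrix holds, the sketch below builds one hypothetical entry with the fields described above (semester, source, concern, response, and follow-up) and writes the matrix to a CSV file. The field names, file name, and entry content are illustrative assumptions, not taken from the department's actual matrix.

```python
import csv

# Field names mirroring the feedback-matrix columns described in the text (assumed names).
FIELDS = ["semester", "source", "concern", "response", "follow_up"]

# Hypothetical entry -- not a row from the actual MCS feedback matrix.
entries = [
    {
        "semester": "Fall 2001",
        "source": "Senior Survey",
        "concern": "Seniors reported few opportunities to practice oral presentations",
        "response": "Added a presentation assignment to a coordinated course",
        "follow_up": "Re-examine Senior Survey responses on this item next year",
    },
]

# Keeping the matrix in a simple machine-readable file makes it easy to append a
# row each semester and to share the running record with the department head.
with open("feedback_matrix.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(entries)
```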
Table 3 General Statement of Student Goals and Objectives.
G: Goals, O: Objectives

Table 4 Faculty Goals and Objectives
G: Goals, O: Objectives

The final concern that was raised with respect to the current assessment plan was the overuse of self-report surveys. Although self-report is an easy way to acquire information, it is not necessarily the most reliable method of data collection. The decision was made to shift the focus to measurement techniques that directly assess student performances. This resulted in the current student assessment plan, which depends upon assessments that are completed in coordinated courses. Self-report instruments were not completely eliminated from the revised plan: students are still asked to complete course evaluations and the Senior Survey, and this information is used to supplement the direct measurement of the student outcomes. The Faculty Survey has been replaced by a direct review of classroom assessment materials. Given that this is the first year of implementation for the revised plan, the new methodology has not yet been fully tested.

V. Timeline
Developing, implementing and revising an assessment system requires a great deal of time. Table 5 displays a timeline that summarizes the major activities of the MCS department within a given semester.

Table 5 Timeline Describing the Development, Implementation and Revision of the Assessment Process

VI. Summary
As the previous discussion illustrates, a great deal of time and effort has been dedicated to the development and implementation of the MCS departmental assessment plan. Revision of this plan is ongoing; as was discussed earlier, continual improvement of an assessment plan is a natural part of the assessment process. The following list summarizes the main points that were presented in this paper:

· Someone within the department needs to take primary responsibility for the implementation of the departmental assessment plan. Preferably, this individual has tenure and a background in Educational Measurement. Since this person will be dedicating time to the development and implementation of the plan, a course release is essential.

· Departmental goals and objectives should reflect the collective beliefs of the departmental faculty concerning what students should know and be able to do. Additionally, there needs to be some consistency among the resultant departmental goals and objectives, the University Mission Statement, the University Goals, and the requirements of accrediting agencies.

· New methods of assessment should be introduced within the department gradually. Quality assessment requires time, and a rapid introduction of new tools could result in an overload that eventually leads to the breakdown of the system.
· Periodically, a department's assessment plan should be reviewed to ensure that it continues to be consistent with departmental needs. This type of review is likely to result in the improvement of the assessment plan. Information that has been gathered through the assessment process should be used to inform this review.

Many of the techniques that have been presented here can easily be adapted to the needs of other departments and disciplines. As was discussed earlier, the Olds and Miller assessment matrix is already used across the CSM campus [2]. Further information concerning the MCS department's assessment efforts can be found at the department's assessment webpage: http://www.mines.edu/Academic/assess/. Additional information concerning the broader CSM effort with regard to assessment can be found at: http://www.mines.edu/Academic/assess/Resource.htm.

VII. Acknowledgement
The plan discussed here reflects the collaborative effort of the Mathematical and Computer Sciences Department (MCS) at the Colorado School of Mines (CSM). The author gives special recognition to Dr. Graeme Fairweather, MCS Department Head, and Dr. Barbara Bath, Director of Undergraduate Studies in MCS. Additionally, special thanks are given to Dr. Barbara Olds and Dr. Ron Miller, who designed the assessment matrix that was used to organize the presented assessment plan.

VIII. References
1. Engineering Criteria, Accreditation Board for Engineering and Technology, 6th ed., 2000, available at http://www.abet.org/accreditation/accreditation.htm.
2. Moskal, B., Olds, B. & Miller, R. L., "Scholarship in a University Assessment System," Academic Exchange Quarterly, 6 (1), 2002, pp. 32-37.
3. Assessment of Student Learning for Improving the Undergraduate Major in Mathematics, Mathematical Association of America, Subcommittee on Assessment, Committee on the Undergraduate Program in Mathematics, 1995.
4. Steen, L., "Assessing Assessment," in Gold, B., Keith, S. Z., and Marion, W., eds., Assessment Practices in Undergraduate Mathematics, 1999, pp. 1-6.
5. Assessment Standards for School Mathematics, National Council of Teachers of Mathematics (NCTM), Reston, Virginia, 1995.
6. Moskal, B., "An Assessment Model for the Mathematics Classroom," Mathematics Teaching in the Middle School, 6 (3), 2000, pp. 192-194.
7. Moskal, B. M. & Bath, B. B., "Developing a Departmental Assessment Plan: Issues and Concerns," The Department Chair: A Newsletter for Academic Administrators, 11 (1), 2000, pp. 23-25.
8. Olds, B. M. & Miller, R., "Assessing a Course or Project," How Do You Measure Success? (Designing Effective Processes for Assessing Engineering Education), American Society for Engineering Education, 1996, pp. 135-144.
9. Olds, B. M. & Miller, R., "An Assessment Matrix for Evaluating Engineering Programs," Journal of Engineering Education, 87 (2), 1998, pp. 173-178.
10. Rogers, G. & Sando, J., Stepping Ahead: An Assessment Plan Development Guide, Rose-Hulman Institute of Technology, Terre Haute, Indiana, 1996.
11. Moskal, B., "Student Voices: Improving the Quality of Course Evaluations," Academic Exchange Quarterly, 5 (1), 2001, pp. 72-78.
Barbara M. Moskal
Dr. Barbara M. Moskal received her Ed.D. in Mathematics Education with a minor in Quantitative Research Methodology and her M.A. in Mathematics from the University of Pittsburgh. Currently, she is an Assistant Professor in the Mathematical and Computer Sciences Department at the Colorado School of Mines. Her research interests include student assessment, K-12 outreach and equity issues. She has taught Engineering Calculus I, II and III and Probability and Statistics for Engineers. In 2000, she received one of the New Faculty Fellowships at the Frontiers in Education Conference. Address: Mathematical and Computer Sciences Department, Colorado School of Mines, Golden, CO 80401; telephone: 303-273-3867; fax: 303-273-3875; e-mail: bmoskal@mines.edu.