Assessing Essential Academic Skills from the Perspective of the Mathematics Major

Mark Michael

King's College

At a private, church-related liberal arts college in the East, a crucial point for assessing student learning occurs midway through a student's four-year program. A sophomore-junior diagnostic project, part of the Discrete Mathematics course taken by all majors, is the vehicle for the assessment. Each student in the course must complete a substantial, semester-long expository paper on a subject related to the course.

Background and Purpose

King's College, a church-related, liberal arts college with about 1800 full-time undergraduate students, has had an active assessment program for more than a decade. The program is founded on two key principles: (1) Assessment should be embedded in courses. (2) Assessment should measure and foster academic growth throughout a student's four years in college. The entire program is described in [1] and [2], while [4] details one component as it relates to a liberal arts mathematics course. The remaining components of the program are aimed at meeting principle (2).

Of these, the key is the Four-Year Competency Growth Plans, blueprints designed by each department to direct the total academic maturation of its majors. These plans outline expectations of students, criteria for judging student competence, and assessment strategies with regard to several basic academic skills, e.g., effective writing. Expectations of students increase from the freshman year to the senior year. More importantly, the continued growth and assessment of competencies are the responsibilities of a student's major department. In the freshman year, all departments have similar expectations of their majors. In subsequent years, competencies are nurtured and evaluated in discipline-specific ways: mathematics majors, for example, are expected to know the conventions for using mathematical symbols in prose, while computer science majors must be able to give a presentation using a computer.

In addition to continual assessment charted by the growth plans, there are two comprehensive, one-time assessment events that take a wide-angle view of a student's development from the perspective of the student's major: the Sophomore-Junior Diagnostic Project and the Senior Integrated Assessment. The former is the subject of this report. It is a graded component of a required major course taken in the sophomore or junior year. Coming roughly midway in a student's college career, the Sophomore-Junior Diagnostic Project is a "checkpoint" in that it identifies anyone whose command of fundamental academic skills indicates a need for remediation. However, the Project is not a "filter"; it is, in fact, a "pump" that impels all students toward greater facility in gathering and communicating information. The variety of forms the Project takes in various departments can be glimpsed in [5].

Method

For mathematics majors and computer science majors, the Project's format does not appear particularly radical. It resides in the Discrete Mathematics course since that course is required in both programs, is primarily taken by sophomores, and is a plentiful source of interesting, accessible topics. The Project revolves around a substantial expository paper on an assigned subject related to the course. It counts as one-fourth of the course grade. But the way the Project is evaluated is a radical departure from the traditional term papers our faculty have long assigned in Geometry and Differential Equations courses. Previously, content was the overriding concern. Now the emphasis is on giving the student detailed feedback on all facets of his/her development as an educated person.

Another departure from past departmental practice is that the Project spans the entire semester. It begins with a memo from the instructor to the student indicating the topic chosen for him/her. After the student has begun researching the topic, he/she describes his/her progress in a memo to the instructor. (This sometimes provides advance warning that a student has difficulty writing and needs to be referred to the King's Writing Center for help.) The memo is accompanied by a proposed outline of the report and a bibliography annotated to indicate the role each reference will play in the report. This ensures that the student has gathered resources and examined them sufficiently to begin assimilating information. It also serves to check that the student has not bitten off too much or too little for a report that should be about 10 double-spaced pages.

Requiring that work be revised and resubmitted is common in composition courses, less so in mathematics courses. It is crucial in this exercise. The "initial formal draft" is supposed to meet the same specifications as the final draft, and students are expected to aim for a "finished" product. Nonetheless, a few students have misunderstandings about their subject matter, and all have much to learn about communicating the subject. To supplement the instructor's annotations on the first draft, each student is given detailed, personalized guidance in a conference at which the draft is discussed. This conference is the most powerful learning experience to which I have ever been a party; it is where "assessment as learning" truly comes alive. Recognizing that all students need a second chance, I count the final draft twice as much toward the project grade as the first draft.
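As a purely illustrative sketch of that 2:1 weighting (the source does not spell out how the drafts are combined with the other graded pieces of the Project, so the expression below is an assumption, not the author's stated formula), the written-report component of the Project grade could be computed as

\[
  \text{report component} \;=\; \frac{(\text{first draft score}) + 2\,(\text{final draft score})}{3},
\]

so that a weak first draft can still be substantially offset by a strong, revised final draft.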

Also contributing to the project grade is a formal oral presentation using an overhead projector. As with the written report, students make a first attempt, receive feedback to help them grow, and then give their final presentation. Unlike the first draft of the report, the practice presentation has no impact on a student's grade. In fact, each student chooses whether or not the instructor will be present at the trial run. Most students prefer to speak initially in front of only their classmates, who then make suggestions on how the presentation could be improved.

While we mathematics faculty are increasingly recognizing the importance of fostering and evaluating our students' nonmathematical skills, there is still the problem of having to make professional judgments about writing and speaking skills without ever having been trained to do so. The solution in our department was to create "crutches": an evaluation form for the written report and another for the oral presentation. Each form has sections on content, organization, and mechanics. Each section has a list of questions, intended not as a true-false checklist but as a way to help the reader or listener focus on certain aspects and then write comments in the space provided. Some of the questions are designed for this particular project ("Are the overhead projector and any additional audio-visual materials used effectively?") while others are not ("Does the speaker articulate clearly?").

Findings

All students — even the best — find writing the Project report to be more challenging than any writing they have done previously. Student Projects have revealed weaknesses not seen in students' proofs or lab reports, which tend to have a more restricted, predefined structure. In explaining an entire topic, students have many more options and typically have much difficulty organizing the material, both globally and at the paragraph level. They also have trouble saying what they really mean when sophisticated concepts are involved.

Peer evaluations of the oral presentations have proved to be surprisingly valuable in several ways. First, by being involved in the evaluation process, students are more conscious of what is expected of them. Second, their perspectives provide a wealth of second opinions; often students catch flaws in presentation mechanics that the instructor might overlook while busy taking notes on content and technical accuracy. Third, students' informal responses to their peers' practice presentations have improved the overall quality of final presentations — partly by putting students more at ease!

Perhaps the most surprising finding, however, is how widely sets of peer evaluations vary in their homogeneity; some presentations elicit similar responses from everyone in the audience, while in other cases it is hard to believe that everyone witnessed the same presentation. Peer evaluations therefore present several challenges to the instructor. What do you tell a student when there is no consensus among the audience? How does one judge the "beauty" of a presentation when different beholders see it differently? What are the implications for one's own style of presentation?

Use of Findings

Since the institution of the Sophomore-Junior Diagnostic Project, the mathematics faculty have become more aware of how well each student communicates in the major. As the Projects are the most demanding tests of those skills, they have alerted instructors in other classes to scrutinize students' written work. Before, instructors had discussed with each other only their students' mathematical performance. Moreover, there has been an increased faculty appreciation for writing and for the special nature of mathematical writing. While national movements have elevated the prominence of writing as a tool for learning mathematics, this activity promotes good mathematical writing, something my generation learned by osmosis, if at all.

Furthermore, there is an increased realization that a student's ability to obtain information from a variety of sources — some of which did not exist when I was a student — is an essential skill in a world where the curve of human knowledge is concave up. Previously, some, believing that isolated contemplation of a set of axioms was the only route to mathematical maturity, had said, "Our students shouldn't be going to the library!"

Collectively, student performances on Projects have motivated changes in the administration of the Project. One such change was the addition of the trial run for oral presentations. A more significant change was suggested by students: replace the generic evaluation form used in the general education speech class (a form I tried to use the first time through) with a form specifically designed for these presentations.

While students have always been provided with detailed guidelines for the Projects, I have continually refined those handouts in response to student performances. Other forms of aid have also evolved over time. Exemplary models of exposition relevant to the course can be found in MAA journals and other MAA publications. For abundant good advice on mathematical writing and numerous miniature case studies, [3] is an outstanding reference. For advice on the use of an overhead projector, the MAA's pamphlet for presenters is useful. In addition, I now make it a point to use an overhead projector in several lectures expressly to demonstrate "How (not) to."

Success Factors

The primary key to the success of the Sophomore-Junior Diagnostic Project is in giving feedback to students rather than a mere grade. When penalties are attached to various errors, two students may earn very similar scores for very different reasons; a traditional grade or score might not distinguish between them, and it will help neither of them.

Another contributor to the Project's success as a learning exercise is the guidance students are given throughout the semester. Students need to know what is expected of them, how to meet our expectations, and how they will be evaluated. The instructor needs to provide specifications, suggestions, examples, and demonstrations, as well as the evaluation forms to be used.

The third factor in the success of the Projects is the follow-up. Best efforts — both ours and theirs — notwithstanding, some students will not be able to remedy their weaknesses within the semester of the Project. In light of this (and contrary to its standard practice), King's allows a course grade of "Incomplete" to be given to a passing student whose Project performance reveals an inadequate mastery of writing skills; the instructor converts the grade to a letter grade the following semester when satisfied that the student, having worked with the College's Writing Center, has remedied his/her deficiencies. This use of the "Incomplete" grade provides the leverage needed to ensure that remediation beyond the duration of the course actually occurs.

The many benefits the Sophomore-Junior Diagnostic Projects bring to students, faculty, and the mathematics program come at a cost, however. The course in which the Project is administered is distinguished from other courses by the amount of time and energy both instructor and student must expend to bring each Project to a successful conclusion. This fact is acknowledged by the College: the course earns four credit-hours while meeting only three hours per week for lectures. (Extra meetings are scheduled for oral presentations.)

For the instructor, this translates into a modicum of overload pay that is not commensurate with the duties beyond those of teaching an ordinary Discrete Mathematics course. The primary reward for the extra effort comes from being part of a uniquely intensive, important learning experience.

References

[1] Farmer, D.W. Enhancing Student Learning: Emphasizing Essential Competencies in Academic Programs, King's College Press, Wilkes-Barre, PA, 1988.

[2] Farmer, D.W. "Course-Embedded Assessment: A Teaching Strategy to Improve Student Learning," Assessment Update, 5 (1), 1993, pp. 8, 10-11.

[3] Gillman, L. Writing Mathematics Well, Mathematical Association of America, Washington, DC, 1987.

[4] Michael, M. "Using Pre- and Post-Testing in a Liberal Arts Mathematics Course to Improve Teaching and Learning," in this volume, p. 195.

[5] O'Brien, J.P., Bressler, S.L., Ennis, J.F., and Michael, M. "The Sophomore-Junior Diagnostic Project," in Banta, T.W., et al., eds., Assessment in Practice: Putting Principles to Work on College Campuses, Jossey-Bass, San Francisco, 1996, pp. 89-99.
