At a small, private women's liberal arts college in the South, student portfolios have become the principal means of assessing the major. A feature unique to this program is that certain courses are designated as portfolio development courses; in these courses, students are asked to reflect in writing on the connection between the material included in the portfolio and the department's goals.
Background and Purpose
Columbia College is a small private women's college in Columbia, South Carolina, founded by the United Methodist Church. Although the college has a small graduate program in education, by far the major focus of the institution is excellence in teaching undergraduates. The college is a liberal arts institution, but we do offer business and education majors. Students who major in elementary education, early childhood education, and special education form a significant percentage of the total student body. In addition, many of the more traditional liberal arts majors offer secondary certification alongside the major requirements. In mathematics, the majority of our majors also choose to certify to teach. This large commitment to pre-service teachers reinforces the institutional commitment to excellence in the classroom.
Our mathematics department has been experimenting with assessment of the major for the improvement of student learning for about five years. Initially, the primary impetus for assessment was pleasing our accrediting body. As the college has embraced the strategic planning process, more and more departments are using planning and assessment for improvement purposes. Our first attempts centered on efforts to locate a standardized instrument. This search was unsuccessful for several reasons. First, the particular set of courses our majors take was not represented on any test we examined. Moreover, without the expense and inconvenience of a pre-test and post-test administration, we felt that such a measure would tell us more about ability than about learning. Finally, even if we did the dual testing, we would obtain answers only to questions about content knowledge in various mathematical areas. Although there would clearly be value in these data, many questions would remain unanswered. In fact, many of the kinds of classroom experiences we consider most valuable are not easily evaluated. Consequently, we chose student portfolios as the best type of assessment for our major. It seemed to us that this particular technique had value in appraising outcomes in mathematical communication, technological competence, and collaborative skills that are not measurable through standardized tests.
To assess the major, we first split our departmental goals into two sections, a general section for the department and a specific section for majors. We articulated one goal for math majors, with nine objectives. For each of the objectives we looked for indicators that these objectives had been met by asking the question "How would we recognize success in this objective?" Then we used these indicators as the assessment criteria. Some of our objectives are quite generic and would be appropriate for many mathematics departments, while others are unique to our program.
The goal, objectives, and assessment criteria follow.
Goal: The Mathematics Department of Columbia College is committed to providing mathematics majors the coursework and experiences which will enable them to attend graduate school, teach at the secondary level, or enter a mathematics-related career in business, industry, or government. It is expected that each mathematics major will be able to:
Our initial effort to require the portfolio for each student was an informal one. We relied on a thorough orientation to the process of creating a portfolio, complete with checklists, required entries and cover sheets for each entry on which the student would write reflectively about the choice of the entry. We included this orientation in the advisement process of each student and stressed the value of the final collection to the student and to the department. Faculty agreed to emphasize the importance of collecting products for the portfolio in classes and to alert students to products which might make good portfolio entries.
We now designate a series of courses required for all majors as portfolio development courses. In these courses products are recommended as potential candidates for inclusion in the collection. Partially completed portfolios are submitted and graded as part of the courses. Perhaps most importantly, some class time in these courses is devoted to the reflective writing about ways in which specific products reflect the achievement of specific goals. Completing this writing in the context of classroom experience has had a very positive impact on the depth of the reflective work.
Our early efforts were a complete failure: no portfolios were produced. We probably should have realized that the portfolios would not actually be created under the informal guidelines, but in our enthusiasm for the process we failed to acknowledge two basic realities of student life: if it is not required, it will not be done; and, equally important, if it is not graded, it will not be done well. After two years during which no student portfolios were ever actually collected, we made submission of the completed portfolio a requirement for passing the mandatory senior capstone course. The first semester this rule was in place, we did indeed collect complete portfolios from all seniors. However, it was obvious that these portfolios were the result of last-minute scrounging rather than the culmination of four years of collecting and documenting mathematical learning.
The second major problem was related to the first. Although the department was unanimous in its endorsement of the portfolio in concept, the faculty found it difficult to remember to include discussions related to the portfolio in advisement sessions or in classes. Incorporating the portfolio into specific mathematics classes was an attempt to remedy both situations.
The most striking problem surfaced when we finally collected thoughtfully completed portfolios and began to try to read them to judge the success of our program. We had never articulated our goals for math majors in terms of specific student outcomes. The department had spent several months and many meetings formulating departmental goals, but the references to student behaviors were too general to be useful. The two relevant goals were "Continue to provide a challenging and thorough curriculum for math majors" and "Encourage the development of student portfolios to document mathematical development." In our inexperience we assumed that weaknesses in the major would be apparent as we perused each student's collection of "best work." Unfortunately, the reality was quite different: in the absence of stated expectations for this work, no conclusions could be reached. A careful revision of the departmental goals for a mathematics major was crafted in response to this finding; the resulting document is included in the first section. Examination of the portfolios also pointed to a more difficult potential problem. In our small department, each student is so well known that it is difficult to inject any objectivity into program evaluation based on particular student work. We were forced to acknowledge that an objective instrument of some sort would be necessary to balance the subjective findings in the portfolios.
Use of Findings
One of the most important measures of the usefulness of a departmental assessment strategy is whether the information gained can drive changes in the major, the department, or the process. The changes made in the process include a revision of the goals and objectives, the addition of portfolio development to several required classes, and the use of a post-graduation survey to determine graduates' perspectives on the adequacy of their mathematical preparation for their post-graduation experiences. With the modified process now in place, we have been able to recognize other changes that need to be made.
Examination of statistical entries in the portfolios indicated that students did not have sufficient depth of understanding in this area. As a result, we have split the single course our students took in probability and statistics into a required two-course sequence in an effort to cover the related topics in greater depth and to spend more time on applications.
On the questionnaire, one graduate expressed her lack of confidence about entering graduate school in mathematics. We have consequently added a course in Advanced Calculus to our offerings in an effort to help students in our program who are considering graduate school as an option. Comments by several graduates that they did not understand their career options well have caused us to add a special section of the freshman orientation course strictly for mathematics majors. This course incorporates new majors into the department as well as the college and has several sessions designed to help them understand the options that mathematics majors have at our college and in their careers.
The assessment process is not a static one. Not only should change come about as a result of specific assessment outcomes, but the process itself is always subject to change. The first series of changes made in our plan was the result of deficiencies in the plan; the next changes will be attempts to gain more information and to continue to improve our practices. We have recently added objective instruments to the data we are collecting: a test to measure each student's ability in formal operations and abstract logical thought, and a test to determine each student's stage of intellectual development. We have given these measures to each freshman math major for the past two years and plan to administer them again in the capstone course. After an initial evaluation of the first complete data set, we will probably add an objective about intellectual development to our plan, with these instruments as the potential means of assessment.
We have also recently begun another search for a content test to add to the objective data gathered at the beginning and the end of a student's college experience. Many more of these types of measures are now available, and we hope to identify one that closely matches our curriculum. With the consideration of a standardized content exam as an assessment tool, there is a sense in which we have come full circle. However, spiral imagery is a better analogy: as we continue to refine and improve the process, standardized tests are being examined from a different perspective. Rather than abdicating our assessment obligations to a standardized test, we have now reached the point where the information that such a tool can provide has a useful place in our overall assessment plan.