SAUM Sessions at National Meetings

Assessment of Student Learning in Undergraduate Mathematics: Works in Progress

MAA Poster Session
Joint Mathematics Meetings in Phoenix, Arizona
January 9, 2004

 

Organizers: Bill Haver and Bernie Madison
  1. Assessing Allegheny College's Lower Level Mathematics Courses
    Presenters: Ronald Harrell and Tamara Lakins
    Allegheny College

  2. Expanding Assessment to Program and Discipline Level
    Presenters: Jeffrey Berg and David Heddens
    Arapahoe Community College

  3. Assessing Developmental Mathematics
    Presenters: Roy Cavanaugh, Brian Karasek, and Daniel Russow
    Arizona Western College

  4. Departmental Assessment: A Continuous Process
    Presenters: Barbara M. Moskal and Alyn Rockwood
    Colorado School of Mines

  5. Assessment at the Department Level
    Presenters: John George, Tom Brown, and Regina Aragon
    Eastern New Mexico University

  6. Using Assessment in Program Evaluation
    Presenters: David Carothers and J. Robert Hanson
    James Madison University

  7. Assessing the Mathematics Major
    Presenters: Tony D. Weathers, Carolyn Yackel, and Curtis Herink
    Mercer University

  8. Jump-starting Program Assessment
    Presenter: Jeff Scroggs
    North Carolina State University

  9. Assessment of Quantitative Literacy
    Presenters: Jorge Calvo, Dogan Comez, and William Martin
    North Dakota State University

  10. Program Assessment for Accreditation
    Presenters: Jesus Jimenez and Maria Zack
    Point Loma Nazarene University

  11. Assessment of the Calculus I, II and Linear Algebra courses
    Presenters: Mohamed Lotfy, Eleanor Storey, and Jennifer Mauldin
    School for Professional Studies, Regis University

  12. Revamping Assessment
    Presenters: Mary Ann Hovis and Judy Giffin
    Rhodes State College

  13. Assessment for Accreditation
    Presenter: James Schaefer
    Rhode Island College

  14. Precalculus in Transition: A Preliminary Report
    Presenters: Trisha Bergthold and Ho Kuen Ng
    San Jose State University

  15. Year Two: Assessing Notebook Computer Use in Two Freshman Core Mathematics Programs
    Presenters: Michael Huber and Alex Heidenberg
    United States Military Academy

  16. Using a Practice Test in Assessment of a Teacher Preparation Program
    Presenters: Jerzy Mogilski, Jorge E. Navarro, and Zhong L. Xu
    University of Texas at Brownsville

  17. Assessment of the Mathematics Major: UW Oshkosh Starts from Scratch
    Presenters: John Koker, Jennifer Szydlik, and Steve Szydlik
    University of Wisconsin Oshkosh

  18. Assessment in Freshman and Sophomore Courses
    Presenters: James Miller and Eddie Fuller
    West Virginia University


Abstracts:

Assessing Allegheny College's Lower Level Mathematics Courses
Ronald Harrell and Tamara Lakins
Allegheny College
rharrell@allegheny.edu, tlakins@allegheny.edu

As a result of a thorough study of the effectiveness of its introductory calculus and precalculus courses, and with the goal of improving student learning, Allegheny College has recently designed and implemented a new set of courses: an algebra-based modeling course, a standard precalculus course, and a two-course calculus sequence for the social and life sciences. The effectiveness of these courses is assessed each semester through writing projects, designated questions on final exams, and a discussion of the findings, followed by a report prepared by the instructors of those courses. Results of the first semester's assessment will be presented.




Expanding Assessment to Program and Discipline Level
Presenters: Jeffrey Berg and David Heddens
Arapahoe Community College
jeff.berg@arapahoe.edu and david.heddens@arapahoe.edu

During the last two years, the department has participated in a college-wide effort to expand assessment of student academic achievement to the program and discipline level. At the poster session, ACC representatives will present an overview of the departmental assessment activities from this period. These activities have centered on three studies using an entrance exam and a common final exam in College Algebra, and they fall most directly into the "general education courses" focus area identified by SAUM. Representatives will provide information on departmental mission, student learning outcomes, methods used to assess student learning outcomes, results produced by assessment methods, insights from assessment results, use of results to improve student learning, future direction based on lessons learned, and involvement in the second SAUM assessment workshop series.

Representatives also seek guidance and suggestions from other poster session presenters and participants in similar assessment environments, in order to learn from their experiences, identify successful alternative methods, identify potential pitfalls, and exchange and generate ideas.

So far, assessment results have mainly indicated ways to improve the assessment process. The department is particularly interested in advice from session presenters and participants on how to create and strengthen methods for using results to improve student learning.



Assessing Developmental Mathematics
Presenters: Roy Cavanaugh, Brian Karasek, and Daniel Russow
Arizona Western College
daniel.russow@azwestern.edu


Our team at Arizona Western College decided to assess our developmental mathematics program, through which approximately 900 students pass every semester. The focus of our assessment is to determine whether students enter these courses adequately prepared to succeed. We drew up a list of objectives that we felt students should have mastered to be successful in each of our developmental courses, and then developed an assessment tool for each course to determine whether students had mastered those objectives. Assessment took place during the second class meeting of the semester. Besides assessing our objectives, we also asked how the students got into the course (placement, prerequisite, or instructor permission) and how much time had elapsed since their last mathematics course. Our poster will show our assessment tool, explain the purpose of our assessment, summarize our results, and share the conclusions we have drawn from the assessment.



Departmental Assessment: A Continuous Process
Presenters: Barbara M. Moskal and Alyn Rockwood
Colorado School of Mines

bmoskal@mines.edu, alynrock@mines.edu


According to the Mathematical Association of America's (MAA) Committee on the Undergraduate Program in Mathematics (CUPM), in collaboration with the MAA's Assessment Subcommittee, assessment is a cycle consisting of five phases: 1) articulating goals and objectives, 2) developing strategies for reaching goals and objectives, 3) selecting instruments to evaluate the attainment of goals and objectives, 4) gathering, analyzing, and interpreting data to determine the extent to which goals and objectives have been reached, and 5) using the results of assessment for program improvement. When the final phase is reached, the assessment cycle begins again. The phases within this cycle provide a framework for developing a departmental assessment plan.

The Mathematical and Computer Sciences Department (MCS) at the Colorado School of Mines (CSM) has developed and implemented a departmental assessment plan based on the framework described above. The plan was first implemented in the 1998-1999 academic year and has since been tested through several assessment cycles. As a result of MCS's departmental assessment efforts, a great deal of information has been collected concerning our students' knowledge, and this information has been used to improve the department's curriculum and programs.

This poster presentation will provide information on the following:

  1. A review of the framework that has been developed for undergraduate assessment of mathematics by the CUPM and the MAA;
  2. An illustration of how this framework has been used by MCS to develop a departmental assessment plan;
  3. Examples of the assessment tools and resources that have been developed and used by MCS for departmental assessment purposes.



Assessment at the Department Level
Eastern New Mexico University
Department of Mathematical Sciences

Presenters: John George (John.George@enmu.edu), Tom Brown (Tom.Brown@enmu.edu), and Regina Aragon (Regina.Aragon@enmu.edu)

In an effort to improve student learning, the Department of Mathematical Sciences at Eastern New Mexico University has developed a matrix of goals, objectives and competencies for mathematics majors. The purpose of this study is to assess the learning goal of communication with an emphasis on the students' ability to write mathematically in different contexts.

Ideally, there should be a progression of writing skills as students move through their mathematics courses. The writing skills of mathematics majors will be assessed at three different course levels: Calculus I, Foundations of Higher Mathematics, and Abstract Algebra. These core courses were chosen because they are taken by all students majoring in mathematics or in mathematics with secondary licensure.

During the spring 2003 semester, three problems from the Abstract Algebra final were used to assess students' ability to communicate in writing. This assessment measured the students' ability to write a proof or explanation that clearly demonstrates (1) an understanding of the logic of mathematics; (2) an understanding of interrelationships amongst ideas; (3) an appreciation of concise, thorough exposition; (4) insight into the subject; and (5) an ability to work well with definitions.

In the fall 2003 semester, questions will be included in the final exams of both sections of Calculus I and the final exam of Foundations of Higher Mathematics to assess students' ability to write well in mathematics. For Calculus I the student should be able to (1) use notation correctly; (2) use proper vocabulary; and (3) paraphrase key concepts in their own words. For Foundations of Higher Mathematics, a student should be able to (1) use proper vocabulary; (2) paraphrase key concepts in their own words; (3) critique the writing of others; and (4) write a proof.

This presentation will include the assessment instrument, the list of competencies, the data that was collected, the revised list of competencies that are easier to assess, an analysis of the data in reference to the original list of competencies and the revised list, and a discussion about how this study will be used to begin a dialogue in the department for pedagogical and ancillary changes.



Using Assessment in Program Evaluation
James Madison University

J. Robert Hanson, hansonjr@jmu.edu
David Carothers, carothdc@jmu.edu

The Department of Mathematics and Statistics at James Madison University makes use of several assessment tools in evaluating its program, including analysis of embedded questions on course examinations and an extensive placement examination program. We will present a summary of past experience as well as ongoing plans for expanding assessment methods in introductory courses. These will include the use of placement subscores and other techniques in an effort to identify and enhance key factors related to persistence in sequenced mathematics courses.



Assessing the Mathematics Major
Mercer University

Tony D. Weathers (weathers_td@mercer.edu), Carolyn Yackel (yackel_ca@mercer.edu), and Curtis Herink (herink_cd@mercer.edu)

In preparation for a SACS accreditation visit in 2005, the Mercer University mathematics department has been charged with assessing our major program, with a distinction made between the BA and BS options. In response, we have followed the guidelines established by the Mathematical Association of America's (MAA) Committee on the Undergraduate Program in Mathematics (CUPM) in collaboration with the MAA's Assessment Subcommittee. Thus, we approached the program as follows: formulate goals, select a proper subset of the goals to assess, select assessment methods, implement the methods, interpret the results, and make program changes based on the results. On this poster, we present a carefully crafted list of goals for students graduating with a major in mathematics. This list contains genuine distinctions between goals for those earning a BA and those earning a BS degree, though inevitably there is some overlap. In addition, we document our first assessment cycle, including our means of assessment. We face several difficulties arising from the fact that our department has fewer than ten graduating majors each year. The most perplexing of these is how to extract useful information from a very small sample size. We present one possible solution.




Jump-starting Program Assessment
North Carolina State University
Jeff Scroggs
scroggs@math.ncsu.edu

NCSU has made major improvements to its undergraduate program assessment over the last several years. These changes are institution-wide and have strongly affected undergraduate programs in all departments. To complicate matters, the university is in the process of re-accreditation, so there is a tendency to focus on reporting for the near term. However, faculty ownership of this process develops best when the emphasis is on the continuous use of results, not on reporting (which happens only once every 7-10 years).

This presentation will focus on the assessment and evaluation processes for the mathematics programs in the context of the institution-wide process.




Assessment of Quantitative Literacy
North Dakota State University
Jorge Calvo, Dogan Comez, and William Martin
william.martin@ndsu.nodak.edu

One focus of the assessment activities at North Dakota State University (NDSU) is quantitative literacy in mathematics. This reflects our view that quantitative literacy should not be separated from general literacy in mathematics and that it should be addressed as a normal part of the teaching of mathematics at all levels.

We start by asking which skills are seen as important to individuals in specific settings, then examine the extent to which our programs seem to have developed these essential capabilities. In our delivery of mathematics courses and the subsequent assessment activities, we focus on the following skills:

  1. Analyzing and interpreting data (in various forms or contexts) and reasoning carefully and logically to reach sound decisions;
  2. Using mathematical concepts in real-world settings to model a problem, place it in mathematical context, develop strategies to solve it, and interpret the outcome meaningfully;
  3. Thinking critically and logically.

The instruments used in the general assessment of student learning are prepared to reflect these goals. The process is designed to be faculty-driven; that is, it is collaborative, collegial, and less threatening than external methods, being both supportive of and dependent on faculty involvement. The feedback loop is flexible and is based on communication and discussion with all involved parties: the faculty teaching the course, the department, the administration (when necessary), and the students themselves.



Program Assessment for Accreditation
Department of Mathematics and Computer Science
Point Loma Nazarene University
Jesus Jimenez: jjimenez@ptloma.edu and Maria Zack: mzack@ptloma.edu

Our department has just completed a five-year department review. This review is part of our institution's "assessment process" for accreditation. Our assessment process is organized around the institution's mission of teaching, shaping, and sending.

The assessment process involved several instruments and produced an evaluation document of roughly 100 pages. The assessment instruments included:

  • The ETS Major Field Exams in Mathematics and Computer Science
  • An Alumni Survey
  • A comparison with national curriculum standards (MAA, ACM, AIS, AITP, ABET)
  • An external review by colleagues in Mathematics and Computer Science Departments in institutions whose mission is similar to ours (Calvin College in Grand Rapids, MI and Westmont College in Santa Barbara, CA)
  • An evaluation of our student placement exam data
  • A conversation with our "client" departments for the Science and Business calculus course.

Findings from the assessment process were used in several ways:

  • The ETS exam results identified weaknesses in our Mathematics and Computer Science curricula. We have modified our curricula to compensate for these weaknesses.
  • The alumni survey confirmed that the changes we introduced in our last department review (1999) have accomplished our desired outcome. The alumni input also confirmed that our intended curriculum changes conform to what they believe is necessary for success in their disciplines.
  • The comparison with national curriculum standards helped us to fine-tune our course offerings as we introduce BS degrees in Mathematics and in Computer Science. These standards were also used to craft our new BS in Information Systems.
  • The external review helped refine the curriculum for our majors and provided us with an outside set of eyes to double-check that there was nothing significant that we had overlooked.
  • The analysis of five years of longitudinal placement data indicates that our placement exam is placing students accurately in the corresponding courses.
  • The conversation with "client" departments has produced a modification in the service calculus course structure (a five unit course was split into a four unit lecture course and a one unit Maple laboratory, with only some of the students being required to take the lab).



Assessment of the Calculus I, II and Linear Algebra courses
School for Professional Studies, Regis University
Mohamed Lotfy, mlotfy@regis.edu, Eleanor Storey, Eleanor.Storey@frontrange.edu,
and Jennifer Mauldin, jmauldin@regis.edu

Background

The School for Professional Studies at Regis University provides accelerated classroom and online courses in an 8-week format. The Computer Science and Mathematics department provides eleven courses that are used in the core studies program and in the mathematics minor within the computer science major. Because most of the students are adults who transfer previous mathematics credits from other institutions, many struggle in the calculus and linear algebra courses. These courses are currently offered only in classroom format, with defined goals and competencies for the entire course and for each learning outcome. At the end of each course, evaluation forms are collected from students and faculty to evaluate the course.

Goals:

  • Identify the learning topics that are not easily comprehended.
  • Embed assessment mechanisms to verify competencies and ensure that the student is doing his/her own work.
  • Develop online solutions to aid the student's comprehension.
  • Expand the mathematics curriculum to include online courses that incorporate the assessment measures and online solutions described above.

Plan:

The department decided to assess learning in the three mathematics courses Calculus I, Calculus II, and Linear Algebra. A pilot classroom assessment will be performed for these courses during the current and next academic years (six semesters). The following steps define the assessment process:

  • Use a pre/post assessment (a scoring sketch follows this list).
  • Pre- and post-tests for the Calculus I course have been developed and were first administered in the fall 2003 semester. The first set of data has been gathered, but it is too early to draw conclusions.
  • The pre/post quizzes will be timed, will have about 10-20 questions, and will not count toward the final grade for the class.
  • Students will take the pretest as part of the first night's assignment.
  • Students must take the posttest in order to receive a final grade.
  • The tests will be based on the prerequisites of the current courses, to see whether students are prepared for the course they are about to start.
  • Develop a rubric for evaluating student performance.
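
A natural summary of such pre/post data is the mean score before and after, together with a per-student normalized gain. The Python sketch below only illustrates that arithmetic; it is not the department's actual analysis, and the scores and maximum-score constant are made up.

    # Illustrative sketch: summarizing pre/post quiz scores for one section.
    # All data are hypothetical; 20 is an assumed maximum quiz score.
    MAX_SCORE = 20

    # Paired (pretest, posttest) scores, one tuple per student.
    scores = [(8, 15), (12, 18), (5, 11), (14, 17), (9, 16)]

    pre_mean = sum(pre for pre, _ in scores) / len(scores)
    post_mean = sum(post for _, post in scores) / len(scores)

    # Normalized gain: the fraction of possible improvement actually achieved.
    gains = [(post - pre) / (MAX_SCORE - pre)
             for pre, post in scores if pre < MAX_SCORE]
    avg_gain = sum(gains) / len(gains)

    print(f"Pretest mean:  {pre_mean:.1f} / {MAX_SCORE}")
    print(f"Posttest mean: {post_mean:.1f} / {MAX_SCORE}")
    print(f"Average normalized gain: {avg_gain:.2f}")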

Outcomes:

The assessment process will help the department understand how well we teach students who take algebra and other math courses at Regis University and will enable us to develop opportunities for transfer students to refresh and rebuild their math skills.

As the department develops online versions of these three courses, the assessment process will aid in identifying the most effective sequence of topics and assignments for the online courses. It will also help define which areas to emphasize in the online courses to ensure that students comprehend and master the desired competencies.

A final outcome will be to make the assessment process an integral part of all of the remaining math courses.



Assessment for Accreditation
Rhode Island College
Department of Mathematics and Computer Science
Presenter: James A. Schaefer, jschaefer@ric.edu

Background and Goals
Rhode Island College (RIC) had an accreditation visit from the New England Association of Schools & Colleges, Inc. (NEASC) in 2000, which included in its report of 1 May 2001 (http://www.ric.edu/neasc_report/Neascreport.html) this statement about assessment in undergraduate education:

Like all dimensions of the College, undergraduate programs and instruction would benefit from a more systematic and on-going process of review, evaluation, and planning, beginning on the departmental level but moving up to the College level as well. Faculty must be engaged fully in this planning process. Assessment must also be an important factor in this process. While pockets of assessment currently exist, more work needs to be done in the area of student outcomes. One of the greatest benefits of comprehensive planning is that some of the academic benefits and achievements that have already been put in place would be assured for all students.

So, RIC and, thus, the Mathematics and Computer Science Department ("Department") were given a mandate to improve planning and assessment.

Description: What Did We Do?

  • During 2001-2002, under the leadership of Dr. Edward McDowell, the department Committee on Computer Offerings (COCO) wrote learning objectives for almost all of the courses in the computer science major. During spring 2003, under the leadership of Dr. Kathryn Sanders, the committee prepared more general objectives for the program, including both the B.A. program and the new B.S. program.
  • During 2001-2002, under the leadership of Dr. Vivian LaFerla, the Mathematics Education Committee planned the content portfolios that would be required of all mathematics secondary education students. The portfolios include specially assigned problems from parts of exams, and extra out-of-class assignments, as described below, as well as an algebra and trigonometry exam. We are planning to extend these ideas as part of our assessment activities.
  • The Course Coordination Committee (CCC) under the leadership of Dr. Raimundo Kovac wrote learning objectives for individual Mathematics courses in [2002].
  • Two assessment committees were formed, one for the secondary education mathematics major in 2001, and a second, created in fall 2003, for the standard mathematics major and for the B.A. and B.S. in computer science.
  • Goals and objectives are still in flux, but something must be implemented for spring 2004, so even tentative goals and objectives will be included in the philosophy statements used by faculty. It is not yet clear whether these will also be included in the syllabi given to students.

Insights: What Did We Learn?

  • The Department has accepted that assessment will not go away, but does not openly admit that assessment can be used to improve education. Assessment is a task to be done.



Revamping Assessment
Rhodes State College
Presenters: Mary Ann Hovis and Judy Giffin
Email: Hovis.MA@rhodesstate.edu

Background and goals:
Rhodes State College decided to revamp its assessment process. The initial goal was to determine and assess student learning outcomes.

Description: What did we do?
The first step in our assessment process was to construct a list of professional attributes, drawn from the cognitive, psychomotor, and affective domains of learning, that a graduate will have achieved as a result of mathematical learning experiences. We then completed a table of the top fifteen student learning outcomes the Mathematics Department currently produces, each tagged with the appropriate learning domain and type of learning outcome. Next, we defined the appropriate learning scope (boundaries) of the program. We then identified strengths (and the reasons for them), opportunities for improvement, and ways to change those opportunities into strengths. Next, we created a standards table with clear performance criteria that account for the quality of the program's learning outcomes. We identified measurable attributes (factors) for each criterion, determined the vehicles (where and when data will be collected) for capturing the data associated with each factor, and identified the key instruments or tools (rubrics) to measure each factor, listing these in the table.

Two of the four performance criteria that we are currently assessing are that students are able to:

  1. Apply mathematics in a problem solving situation demonstrating critical thinking; and
  2. Utilize appropriate technologies to perform mathematics effectively.

Baseline data were gathered during the winter and spring quarters of 2003 to identify the current standards for two of the performance criteria. From the baseline data, future standards were established.

The course of action to aid in achieving the future standards involves developing an example of a perfectly solved problem and communicating it to the instructors of all mathematics courses.

Insights: What did we learn?
In identifying the current standards for these performance criteria, the Mathematics Department developed common finals for several of the mathematics courses. Many inconsistencies between instructors emerged. To alleviate some of these, a joint meeting of the instructors for each course will be held at the beginning of the quarter to review expectations for the course.



Precalculus in Transition: A Preliminary Report
San Jose State University

Trisha Bergthold
bergthold@math.sjsu.edu
Ho Kuen Ng
ng@math.sjsu.edu

 

Background and Goals
For several semesters, the SJSU Mathematics Department has been concerned about extraordinarily low student achievement in its precalculus course, Math 19. In each of the past several semesters, 40-45% of the 400-500 students who took this five-unit course earned Ds or Fs. All of these students must repeat the course if they wish to take calculus or another course for which Math 19 is a prerequisite. The financial implications for the university are significant: it costs money, time, and space to accommodate such a large number of repeat attempts to earn at least a C- in the course.

Two main questions arose. Are students being inappropriately placed in this class? Are the scope and sequence of topics appropriate? Our assessment of factors influencing low student achievement in Math 19 began by addressing these questions.

Description: What Did We Do?
We investigated placement practices by analyzing Math 19 course grades versus our current entry-level mathematics (ELM) exam scores and (for those students who took the calculus placement exam) versus calculus placement exam (CPE) scores. In addition, we conducted a beginning-of-semester diagnostic assessment and survey in each Math 19 section during Fall 2003.
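
As an illustration of this kind of grades-versus-scores analysis (not the department's actual code or data), the short Python sketch below computes a Pearson correlation; the paired values and the 0-4 grade scale are assumptions.

    # Illustrative sketch of a placement-score vs. course-grade correlation.
    # The data are made up, not SJSU records; grades use an assumed 0-4 scale.
    import numpy as np

    exam_scores = np.array([38, 52, 41, 60, 47, 55, 33, 49])            # placement scores
    course_grades = np.array([1.0, 2.7, 1.3, 3.3, 2.0, 3.0, 0.0, 2.3])  # A=4.0 ... F=0.0

    # Pearson correlation coefficient between the two variables.
    r = np.corrcoef(exam_scores, course_grades)[0, 1]
    print(f"Pearson r = {r:.2f}")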

By the time we present this poster, we will have re-examined the scope and sequence of topics in the entire trigonometry-precalculus-calculus I sequence (Math 8, 19, and 30 or 30P) by conducting a survey of faculty currently teaching precalculus and calculus courses, followed by a focus group discussion of the results.

Insights: What Did We Learn?
First, there is no correlation between Math 19 course grades and ELM scores. Second, there does appear to be a correlation between Math 19 course grades and CPE scores. We are currently considering whether to require a minimum score on the CPE to be eligible to take Math 19.

Although we have not yet collected the data on scope and sequence of topics, we know that quite a few faculty members have expressed the opinion that the current Math 19 syllabus is overly broad and insufficiently deep. We hope that the faculty survey coupled with focus group feedback will assist us in modifying the syllabi of these courses.



Year Two: Assessing Notebook Computer Use
in Two Freshman Core Mathematics Programs
Michael Huber and Alex Heidenberg
United States Military Academy

Since January 2002, the United States Military Academy has participated in the MAA's NSF project SAUM, studying the assessment of new technology, specifically the notebook computer, in the freshman curriculum. The Class of 2007 is the second consecutive class to bring laptops with a computer algebra system to the mathematics classroom every day. Students now have the capability to explore and discover mathematical and scientific concepts at a deeper level. The Department of Mathematical Sciences fosters a curriculum environment intended to develop creative and confident problem solvers. After one year, we have made a few changes in the way we both teach and assess students, based on lessons learned, and this poster will outline our efforts to determine the success of this new curriculum after a second year. Given the modeling approach of our core mathematics programs, our assessment of students' individual work (quizzes, midterm examinations, fundamental skills examinations, final examinations) and group work (graded out-of-class projects, suggested problems, etc.) has been altered to include assessment of students' ability to use technology in the problem-solving process. We have also administered a student attitudes survey, and we hope to include those results on the poster. Additionally, we will display the detailed rubric used in our assessment plan.



Using a Practice Test in Assessment of a Teacher Preparation Program
University of Texas at Brownsville
Jerzy Mogilski (presenter), Jorge E. Navarro, and Zhong L. Xu
jkm@utb.edu

Anyone seeking educator certification in Texas must pass examinations of professional knowledge and subject matter knowledge approved by the State Board for Educator Certification (SBEC). Moreover, SBEC adopts the accreditation standards for programs that prepare educators. The accreditation standards are based on certification exam pass rates and must be met annually by each educator preparation program. To be rated "Accredited," a program must achieve at least a 70% first-year pass rate or an 80% cumulative pass rate; otherwise, SBEC rates the program "Accredited under review" or "Not accredited."
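
Stated as code, the pass-rate rule reads as follows. This is a minimal Python sketch of the thresholds quoted above; the function name is invented, and since the abstract does not say how SBEC chooses between the two lower ratings, they are grouped together here.

    # Minimal sketch of the SBEC rating thresholds described above.
    # Pass rates are fractions in [0, 1]; the function name is our own.
    def sbec_rating(first_year_pass_rate: float, cumulative_pass_rate: float) -> str:
        if first_year_pass_rate >= 0.70 or cumulative_pass_rate >= 0.80:
            return "Accredited"
        # Grouped because the abstract does not spell out the distinction.
        return "Accredited under review / Not accredited"

    print(sbec_rating(0.72, 0.65))  # Accredited
    print(sbec_rating(0.60, 0.75))  # Accredited under review / Not accredited
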
The University of Texas at Brownsville is a young university, established in 1992, with an enrollment of more than 10,000 students. The Department of Mathematics offers a B.S. in Mathematics, and mathematics majors can choose degree programs with non-teaching or teaching options. Our study, which started in the fall of 1997, was motivated by a major concern of the administration of the University and of the Department of Mathematics about the performance of our students on the certification exam for secondary mathematics teachers. This poster presentation will show how a test closely correlated with the SBEC certification exam was effectively used as an assessment tool for the teacher preparation program, and how the assessment findings were used to help our students pass the certification exam.



Assessment of the Mathematics Major: UW Oshkosh Starts from Scratch
University of Wisconsin Oshkosh
John Koker (koker@uwosh.edu)
Jennifer Szydlik (szydlik@uwosh.edu)
Steve Szydlik (szydliks@uwosh.edu)

The University of Wisconsin Oshkosh is a public, four-year, regional comprehensive institution with a focus on teacher education. The mathematics department graduates approximately 50 majors per year across several different emphases. Regardless of emphasis, however, all mathematics majors are required to take a common core of six courses before specializing, including courses in calculus, linear algebra, statistics, and abstract reasoning. Our team from UW Oshkosh participated in a SAUM assessment workshop with the goal of creating an assessment plan for the major.

Our team developed both short- and long-term objectives for our department's assessment plan and is in the process of carrying them out. In the short term, we organized a two-morning workshop for our department to focus on revising goals for our majors and on writing specific content and process objectives for each of the six courses in our core. On the first day of this May workshop, the entire membership of the department engaged in a discussion of broad goals such as communication, problem solving, technology, modeling, validation, and connections. Toward each of these goals we addressed two questions:

  1. What does the goal mean?
  2. How will we recognize achievement of the goal in our core courses when we see it?

In addition, we prioritized the goals and discussed whether we had other goals for our students that were not on an initial list.

On the second day of the workshop, department members wrote specific content and process objectives for each of our core courses. Faculty teams were assigned to each course, and the course objectives they developed were presented to the department. Process objectives were also discussed, using the six general program goals.

Following the department workshop, our team developed consensus summary documents for both the broad goals for our majors and for the individual course objectives. Our poster will focus on both the process of consensus building and the resulting documents that give us a framework on which to base assessment of the major. We will also discuss our department assessment plans, including the challenges we have faced in assessing student satisfaction via an alumni survey.



Assessment in Freshman and Sophomore Courses
West Virginia University
James Miller
miller@math.wvu.edu

The Department of Mathematics at West Virginia University is exploring methods to improve the performance of our students in our freshman and sophomore courses, which enroll about 3,000 students each semester. For the initial assessment, the pre-calculus course was selected. This course serves students continuing on to our calculus course, as well as programs in other departments within the university that require a course in College Algebra and/or Trigonometry, so the knowledge that we gain from this study can be extended to other lower-level courses.

To start the study, a survey was conducted to ensure that the topics for the pre-calculus course covered the core concepts and skills needed to succeed in calculus. The topics are also consistent with the needs and requirements of other departments within the university. The method of assessment for this year is to develop and use gateway examinations.

The poster presentation will:

  • Cover the mission of the course.
  • Illustrate the topics covered in the gateway exams.
  • Summarize the results for the Fall 2003 semester.
