SAUM Additional Online Case Studies & Appendices


Prospectus for SAUM Case Study

Paradise Valley Community College

Larry Burgess and Rick Vaughn

 

Abstract

            As appropriate when the article is completed.

 

Background

            In this section we will detail some demographics of our campus and student population. In particular, we will discuss one of the unique challenges we face: “student swirl.” As one of ten colleges in the Maricopa Community College District, we do not have a well-defined student population. Students come and go, and they often complete their degree sequences by taking classes at three or four different colleges. We will also discuss our history of assessment and our desire to implement a meaningful program, not just something to satisfy external constituents.

 

Description

            After a brief description of the development process, we will share our current assessment program:

 

Our assessment plan is evolving into a four-faceted approach.

 

1.      We have created a database that goes back five years.  Every section of every class we have offered is entered.  The data include instructor name and status (adjunct or full-time), delivery method (Academic Systems, Flex Express, etc.), number of students enrolled, number of students dropped or withdrawn, and number of students with grades of A, B, C, and D or F.  Each semester from now on we will enter the same data, so that we can produce reports of a purely statistical nature regarding completion rates and similar measures.
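As an illustration only, the sketch below shows how completion-rate reports of the kind we have in mind might be produced from such records.  The field names and numbers are hypothetical, not our actual database.

# Hypothetical section records mirroring the fields described above;
# course names, statuses, and counts are illustrative only.
sections = [
    {"course": "MAT121", "instructor_status": "full-time",
     "delivery": "lecture", "enrolled": 32, "withdrawn": 6,
     "A": 8, "B": 7, "C": 6, "D_or_F": 5},
    {"course": "MAT121", "instructor_status": "adjunct",
     "delivery": "Flex Express", "enrolled": 28, "withdrawn": 9,
     "A": 5, "B": 6, "C": 4, "D_or_F": 4},
]

def completion_rate(sec):
    """Share of enrolled students finishing with a grade of A, B, or C."""
    return (sec["A"] + sec["B"] + sec["C"]) / sec["enrolled"]

for sec in sections:
    print(f"{sec['course']} ({sec['instructor_status']}, {sec['delivery']}): "
          f"{completion_rate(sec):.0%} completed with A/B/C, "
          f"{sec['withdrawn'] / sec['enrolled']:.0%} withdrew")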

 

2.      We are assessing course competencies on an individual-course basis using focus groups of the faculty members teaching the specified course.  To date, these efforts have consisted of

·        choosing a few course competencies,

·        designing multiple-choice questions covering them,

·        embedding the questions in semester final exams,

·        compiling the percentage of correct responses (a brief tallying sketch follows this list),

·        identifying which, if any, of the competencies we need to try to improve on,

·        giving instructors free rein to choose their own methods of improving success on the chosen competencies,

·        re-testing the competencies the following semester and comparing results,

·        meeting to discuss the methods that produced the best results, and

·        repeating the process for another set of competencies.
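To make the compiling and re-testing steps concrete, here is a minimal sketch of the semester-to-semester comparison we have in mind.  The competency names and counts are hypothetical; each focus group does this tallying however it sees fit.

# Illustrative only: (number correct, number answering) for the embedded
# multiple-choice items, grouped by the competency they test.
results = {
    "fall":   {"solve linear equations": (61, 90), "interpret slope": (48, 90)},
    "spring": {"solve linear equations": (70, 95), "interpret slope": (62, 95)},
}

def percent_correct(correct, answered):
    """Percentage of students answering the items on a competency correctly."""
    return 100.0 * correct / answered

# Compare the baseline semester with the re-test semester.
for competency in results["fall"]:
    before = percent_correct(*results["fall"][competency])
    after = percent_correct(*results["spring"][competency])
    print(f"{competency}: {before:.1f}% -> {after:.1f}% ({after - before:+.1f} points)")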

 

3.      We are also assessing mathematical learning outcomes that are more generic in nature.  That is, they transcend individual course boundaries.  For example, “Describe a trend indicated in a chart or graph and make predictions based on that trend.”  As you can see, this learning outcome is as applicable to an Introductory Algebra student as to a Calculus student.  Our method of assessing these learning outcomes has been to provide instructors with the scoring rubric that contains the outcomes and ask them to assign a project, test, or other instrument of their choosing to check their students’ progress.  Some focus groups have chosen to write just one such instrument for all the sections of that class.  Samples of student work are then randomly chosen from each section, copied, and returned to the instructor.  The randomly chosen samples are “sanitized” (names, class, etc. removed), and the entire collection is scored by a sub-committee of the Student Academic Achievement Assessment Committee as part of the campus-wide assessment plan.  The next step for this aspect of assessment is to refine the learning outcomes.  We have been using the outcomes contained on a scoring rubric obtained from another source, and we want to make sure the learning outcomes are what we at PVCC want to see for our students.  Rick and I hope to begin drafting/revising these outcomes during the Richmond meetings.
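As a sketch of the sampling and “sanitizing” steps only, the random selection might look something like the following.  The section labels, sample count, and pieces of work are hypothetical.

import random

# Hypothetical pool of student work, keyed by section; contents are
# illustrative only.
submissions = {
    "MAT150-01": ["project 1", "project 2", "project 3", "project 4"],
    "MAT220-02": ["project 1", "project 2", "project 3"],
}

SAMPLES_PER_SECTION = 2  # pieces of work to score from each section

# Draw samples from each section.  Stripping student names, section,
# and instructor from the copies is the "sanitizing" step, so only the
# anonymous work itself goes to the scoring sub-committee.
scoring_pool = []
for section, work in submissions.items():
    chosen = random.sample(work, min(SAMPLES_PER_SECTION, len(work)))
    scoring_pool.extend(chosen)

print(f"{len(scoring_pool)} anonymous samples ready for rubric scoring")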

 

4.      The fourth segment of our plan will be a student questionnaire designed to gauge students’ satisfaction with class times, instructional formats, and the like.  This questionnaire is still in the planning stage.

 

Insights

 

Producing this plan has been, and remains, one of our major departmental goals.  We still have considerable fine-tuning to do before we have a mature, effective assessment plan.  We will share additional details about our successes and failures to date and our plans for the future.  For example, we have been very fortunate to have strong faculty buy-in.  Our focus groups have led to some immediate improvements, although it is difficult to document exactly what we did differently.  The database is yielding interesting information, but we have yet to act on it.  The rubric has mostly been an exercise.  We are still looking for a good way to use the data we are producing.

 

Acknowledgements

 

As appropriate.