

Year 2 Annual Report
Supporting Assessment in Undergraduate Mathematics (SAUM)
NSF/EHR/DUE/ASA Grant # DUE-0127694
January 1, 2002 to December 31, 2004
November 2003

Mathematical Association of America (MAA) with a sub-award to the University of Arkansas
Co-PI and Project Director: Bernard Madison
PI: Michael Pearson
Senior Personnel: Bonnie Gold, William Haver, William Marion, Thomas Rishel, and Lynn Steen
External Evaluator: Peter Ewell


I.    Introduction
II.   Stimulating Discussion and Thought
III.  Expanding and Updating Case Studies
IV.   Constructing the Web Site
V.    Synthesizing Case Studies and Assessment Tools
VI.   Development of Workshops
VII.  Dissemination
VIII. Personnel Changes
IX.   Budget Changes

X.    Appendix 1 – Section Forum Evaluations
XI.   Appendix 2 – Workshop Evaluations

I. Introduction

The purpose of SAUM is to support faculty members and departments in efforts to assess student learning in coherent blocks of courses in the undergraduate mathematical sciences, or in innovations that may consist of a single course.  Initially, the targeted blocks of courses were the undergraduate major, courses for future teachers, placement programs or developmental courses, and general education courses.  During the first two years of the project, another block has emerged: the mathematics courses for mathematics-intensive majors.  In addition, assessment of learning in a single course is of particular interest if the course involves innovations or is being assessed for comparison across institutions.

The work of the project as proposed was in the following areas:

•  Stimulating Discussion and Thought
•  Expanding and Updating Case Studies
•  Constructing the Web Site
•  Synthesizing Case Studies and Assessment Tools
•  Development of Workshops
•  Dissemination of Total Project

This report will address progress in each of these six areas and will note expansions of the original plan.  Preliminary evaluation information is included in Appendices 1 and 2.    

II. Stimulating Discussion and Thought

•  Mailings to 3000 Departments and Chief Academic Officers

During the first few months of SAUM, copies of MAA Notes #49, Assessment Programs in Undergraduate Mathematics, were mailed to 3000 departments of mathematics in two-year and four-year colleges and universities along with a letter describing SAUM.  At the same time, letters describing SAUM were sent to chief academic officers of 3000 two-year or four-year colleges and universities encouraging them to support assessment activities in their departments of mathematics.


•  Section Forums

Section forums were added to the project after the proposal was funded in order to build interest in assessment.  Devised as a way to encourage mathematics faculty to develop assessment programs, forums were offered to each of the 29 MAA Sections (each covering a geographical region of the US).  We have conducted forums in sixteen sections and have another scheduled; see the list below.  The format of these 90-minute sessions varied, but generally consisted of an introductory talk about assessment and SAUM, a middle portion presenting a specific assessment tool, and a final segment reporting on assessment activities in the particular MAA section.  So far, over 300 faculty have attended section forums.  Preliminary evaluation information on the section forums is attached as Appendix 1.

Section                     Date                 SAUM Leader(s)                 Section Contact

Florida                     March 1-2, 2002      Tom Rishel                     Marcelle Bessman
Louisiana-Mississippi       March 1-2, 2002      Bonnie Gold                    Frank Serio
Kentucky                    April 5-6, 2002      Bonnie Gold                    William Fenton
Oklahoma-Arkansas           April 5-6, 2002      Bernie Madison                 Carolyn Eoff
Iowa                        April 5-6, 2002      Sandy Keith                    Mark Johnson
Michigan                    May 10-11, 2002      Bernie Madison                 John Mooningham
Pacific Northwest           June 20-22, 2002     Bonnie Gold                    Ken Ross
Rocky Mountain              April 12-13, 2002    Bill Marion
Southern California         Oct. 12, 2002        Bonnie Gold                    Mario Martelli
Intermountain               Oct. 25-26, 2002     Bill Marion
Northeastern                Nov. 22-23, 2002     Bonnie Gold                    Ockle Johnson
Indiana                     Mar. 28-29, 2003     Bill Marion
Illinois                    Mar. 28-29, 2003     Tom Rishel                     Carol Schmidt
Allegheny Mountain          April 4-5, 2003      Tom Rishel, Michael Pearson
Metro New York              May 3, 2003          Bonnie Gold, Bill Marion
North Central               Oct. 24-25, 2003     William Martin                 Walter Sizer
Southeastern (scheduled)    March 26, 2004       Bernie Madison                 Martha Abell

•  Panels

·        Panel on SAUM and assessment at MathFest 2002 in Burlington, VT.  Panelists were Bernard Madison, Bonnie Gold, and Brian Hopkins.

·        Panel on Assessment at Joint Mathematics Meetings in Baltimore, MD on January 16, 2003. SAUM-connected panelists were Bernard Madison (SAUM PD), William Marion (SAUM Senior Personnel), William Martin (Participant in Workshop #1 and leader in #2 and #3), and Barbara Moskal (participant in workshop #3).  This panel was organized by the MAA’s Project NExT.

•  Paper or Poster Sessions

·        Contributed Paper Session on Assessment of Student Learning in Undergraduate Mathematics at MathFest 2003, August 1-2, 2003, in Boulder, CO.  Sponsored by SAUM and organized by Bill Marion, William Haver, and Bernard Madison.  Of the twelve contributed papers, ten were from participants in SAUM workshops.

·        Poster Session at Joint Mathematics Meetings in Phoenix, AZ, January 9, 2004.  Sponsored by SAUM and organized by William Haver and Bernard Madison.  As of this writing, eight posters have been accepted for presentation, all from participants in SAUM workshops. 

·        Invited Paper Session at Joint Mathematics Meetings in Phoenix, AZ, January 9, 2004.  Sponsored by SAUM and organized by William Haver and Bernard Madison.  Twelve papers are scheduled, all by teams of participants in SAUM workshops.

•  Mini-course on Assessment.  Two of the SAUM senior personnel, William Marion and Bonnie Gold, will lead a four-hour mini-course on Developing Your Department’s Assessment Plan at the Joint Mathematics Meetings in Phoenix, AZ, on January 8 and 10, 2004.

•  Assessment Reception.  Responding to requests from participants, we have scheduled a reception for all participants in the first three SAUM workshops on Wednesday, January 7, 2004, 6-7 pm.  The purpose of this reception is to promote networking and to make announcements about assessment activities at the Joint Mathematics Meetings and other events.

•  Interdisciplinary Forums.  Bernard Madison has made two presentations to interdisciplinary forums on assessment as a part of the work of SAUM.  One was to the Project Kaleidoscope Roundtable for the Future, Assessment in the Service of Student Learning, held at Duke University on March 1-3, 2002.  The session title was “Exploring the role and impact of assessment practices at the institutional level.”  Bernard Madison was a member of the Steering Committee for this Roundtable.  The second presentation was to the conference of the Association of American Colleges and Universities on General Education and the Assessment of Student Learning, February 21-23, 2002.  The session title was “Quantitative Reasoning and Numeracy: The Third R is No Longer Arithmetic.”

III. Expanding and Updating Case Studies

We expect to produce approximately 50 new case studies on assessment programs in undergraduate mathematics.  Nine have already been edited and posted on the SAUM web site.  Each of the forty teams in the first three SAUM workshops is expected to produce a case study, and other case studies have been solicited.  In addition, teams from the fourth (planned) workshop will write case studies.

Case study writing and editing will be a major SAUM activity in 2004, as planned. 

IV. Constructing the Web Site

The SAUM web site is at http://www.maa.org/SAUM.  The site serves two major purposes: (1) project communication; and (2) resources on assessment.  The site currently receives over 50,000 hits per month.  The content of the site includes the following:

•  Information for getting started with assessment.  This includes the MAA guidelines on assessment and some articles on the nature and history of assessment.

•  Information about SAUM, including a file of PowerPoint slides for making presentations about SAUM.

•  The full text of MAA Notes #49, Assessment Programs in Undergraduate Mathematics, a volume of 72 case studies of assessment programs in colleges and universities.

•  Frequently asked questions about assessment.  Thirty-two frequently asked questions are given along with brief answers and references for further reading.

•  Bibliography on assessment.  This has four sections with 216 entries, many of which are annotated: 1. Assessment web sites; 2. Assessment of Mathematics: Policy and Philosophy; 3. Assessment of Mathematics: Case Studies; and 4. Postsecondary Assessment: Policy and Best Practices.

•  Information about SAUM workshops.  This includes announcements, application forms, and schedules for upcoming workshops as well as information about ongoing workshops.

•  New case studies.  All workshop teams are required to write a case study describing their assessment program.  In addition, we have solicited other case studies.  After the case studies are edited a second time, they are posted.  Some will be selected for inclusion in a new volume of case studies.

•  Assessment links.  This is an extensive list of links – many more than in the bibliography – to web sites on assessment.

•  Sessions at national meetings.  This gives information about paper sessions or other sessions on assessment scheduled at one of the two annual national mathematics meetings.

•  Section forums.  This gives the schedule of section forums along with the forms used for evaluating the forums.

•  A revolving set of headlines used to call attention to new materials or upcoming events.

V. Synthesizing Case Studies and Assessment Tools

The purpose of this planned activity is to distill, from the case studies in MAA Notes #49 and the new case studies, the features of assessment programs that focus on a particular area of undergraduate mathematics.  Although selections are not final at this point, we are likely to synthesize the case studies on assessing the major, general education, and developmental mathematics.  We also plan syntheses of other aspects of assessment, such as assessment tools (portfolios, for example).

Writing syntheses will be a major activity in 2004, as planned. 

VI. Development of Workshops

Before the SAUM grant was announced, we began planning the first workshop, which was to be funded by the MAA PREP program (funded by DUE-CCLI-ND) but was subsequently sponsored and funded jointly with SAUM in order to increase the number of participants from 20 to 33.  Since then we have convened workshops #2 and #3, and a fourth is scheduled to begin in March 2004.  The workshops turned out to be more expensive than we had anticipated in the proposed budget, but with cooperative funding in workshop #1 and planned cooperative funding for workshop #4, we will accommodate significantly more participants for more workshop days than we had proposed.  We proposed 240 workshop participant-days: four workshops of 20 participants each, meeting for 3 face-to-face days.  The first three workshops averaged 5 face-to-face days and totaled 332 participant-days.  The original budget allowed $110 per participant-day for room and board, which is too low for accommodations other than those at universities where dormitory rooms are used and rental rates for workshop facilities are low.  Nonetheless, we have managed within the budgeted funds through joint efforts with the MAA PREP program.  The data on the workshops are:

Workshop #1 – 16 teams

Face-to-face sessions (6.5 days total):       

January 10-11, 2002 in San Diego, CA (33 participants from teams)

May 22-25, 2002, Virginia Commonwealth University, Richmond, VA (29 participants) 

January 18-20, 2003 – Burkshire Marriott Hotel and Conference Center, Towson, MD (42 participants, joint with workshop #2)

Leaders: Bernard Madison, William Haver, Bonnie Gold, Sandra Keith, and William Marion.

Guest Presenters: Peter Ewell and Lynn Steen

Workshop #2 – 10 teams

Face-to-face sessions (4 days total): 

July 29-31, 2002 – Sheraton Hotel, Burlington, VT  (16 participants from teams)

January 18-20, 2003 – Burkshire Marriott Hotel and Conference Center, Towson, MD (42 participants, joint with workshop #1)

Leaders: Bernard Madison, William Haver, William Martin, Kathy Safford-Ramus, and Rick Vaughn.  The last three leaders were participants in workshop #1.

Workshop #3 – 14 teams

Face-to-face sessions (4 days total):

March 14-16, 2003 - Paradise Valley Community College, Phoenix, AZ (32 participants)

January 5-6, 2004 – Wyndham Hotel, Phoenix, AZ

Leaders: William Martin, Rick Vaughn, and Laurie Hopkins.  All were participants in earlier workshops. 
Guest presenter: Peter Ewell

Workshop #4 (Planned)

Workshop #4 will focus on assessment of learning in the mathematics major and will include a session on portfolio development and evaluation and a session on research results on learning.

A face-to-face session will be held in the Atlanta, GA, area in March 2004, with a second session in the Atlanta area in January 2005.

Online Workshop

An outline of an online workshop is being developed; implementation is planned for 2004, after we have assessed the experience of the face-to-face workshops.

Evaluation of the workshops is proceeding.  The best evaluation session so far was a debriefing at the Towson session, which combined the teams from workshops #1 and #2.  A summary of that session by the external evaluator, Peter Ewell, is attached at the end of this report as Appendix 2.

VII. Dissemination

We have disseminated the contents of MAA Notes #49 widely: hard copies were mailed to 3000 mathematics departments, and the entire volume is posted on the SAUM web site.  We will disseminate the new volume of case studies and syntheses in like manner.  As described above, educational activities on assessment have been extensive, spanning the US and involving hundreds of mathematical sciences faculty in workshops, forums, paper sessions, and panels on assessment.  The SAUM web site currently receives over 50,000 hits per month.

VIII. SAUM Personnel Changes

Thomas Rishel was the original SAUM PI, partly because of his position as Director of Programs and Services at the MAA.  In June 2002, Tom left that MAA position and Michael Pearson assumed it; consequently, Michael Pearson became the SAUM PI.  Thomas Rishel has continued as one of SAUM’s senior personnel from his position at Cornell University.

One of the original SAUM senior personnel, Sandra Keith, dropped out of the project in May 2002.  We have added the following people as workshop leaders, forum presenters, and case study editors: William Martin (North Dakota State University), Kathy Safford-Ramus (St. Peter’s College), Rick Vaughn (Paradise Valley Community College), Laurie Hopkins (Columbia College), and Richard Jardine (Keene State College).

IX. Budget Changes

The budgeted funds for consulting stipends to Sandra Keith (approximately $8,500) will be used to pay the new senior personnel (see above) for assuming the responsibilities planned for her.  We will likely request that approximately $20,000 in unexpended stipends for senior personnel be shifted to support the planned workshop #4.

X. Appendix 1 – Section Forum Evaluations

SAUM Section Meeting Evaluation Results to Date

By Peter Ewell

Eleven SAUM section forums were held at MAA Section meetings in 2002, with a total documented attendance of 203 (the number completing the Pre-Assessment).  At each forum, participants were asked to complete a brief Pre-Assessment survey, with items concentrating primarily on the current status of their institution and department with respect to assessment, and a Post-Assessment survey reacting to the forum itself.

Pre-Assessment Results.  The Pre-Assessment was intended as much to gather baseline data about the condition of assessment among departments and institutions as to gather evaluative information about the forums themselves.  While the respondents were clearly not a random sample of mathematics departments, the results do suggest the approximate level of activity present (and they likely overestimate it, because of the selection bias associated with attending a session on assessment in the first place).

The following results for the 203 responses obtained are notable:

·        Institutional coverage is reasonable, but favors four-year institutions (only about 13% of respondents are from community colleges while these comprise 28% of the national total of institutions).  Conversely, research universities are slightly over-represented (about 10% of respondents while these comprise about 5% of institutions).  Other categories appear reasonable.  These results may be more in keeping with the national distribution of mathematics faculty. 

·        Respondents included a good representation of department chairs (20%) and others potentially involved in assessment at a level greater than line faculty, including members of departmental committees (13%), institutional committees (7%), faculty governance (8%), or accreditation committees (6%).  Eighty percent were full-time faculty, and respondents tended to be veteran faculty with many years of teaching experience.

·        About half of the mathematics departments represented indicate that they are engaged in assessment to some significant extent, having established an assessment committee (48%) or already assessing at least one aspect of the curriculum (40%, with the mathematics major the most popular object of assessment).  Not surprisingly, formal engagement in assessment appears to vary by size and type of institution, with smaller and less complex institutions reporting less activity.

·        Most appear to be doing this in response to a larger institutional effort.  About half cite institutional requirements (46%) or accreditation requirements (53%) as primary motives.  Some 70% of institutions are reported to have formal assessment committees, with a further 9% in the works.  Again, this varies by type of institution.

·        The mathematics major is the most frequent target of assessment, with assessment of service courses and remedial courses occurring at about half the frequency of the major.  Other desired objects of assessment noted by respondents that were not on the survey included courses intended to prepare future teachers and on-line courses.

These results seem intuitively reasonable, though there is no baseline data to check them against.  They do correspond roughly with national data on institutional engagement in assessment generally, in which about three-quarters of institutions report having established formal structures, but only about a third have established regular data-gathering procedures—mostly in response to accreditation.

Post-Assessment Results.  The primary goals of the Section meetings were to set a context for assessment at the level of the individual mathematics department and to inform participants about the SAUM project.  These objectives appear to have been met very well.  Among the 167 responses to post-assessment surveys at the eleven section meetings, the following results were notable:

·        Participants most valued the concrete aspects of the presentation.  More than 80% agreed (strongly or somewhat) that the sessions provided them with a useful network of resources and contacts and gave them an opportunity to think concretely about what they should do at their own institutions.  Verbal comments particularly highlighted the advantage of interchange and of learning about new resources.  But almost as many valued the context provided (national context and assessment language).  Convincing participants “that assessment is important” was less successful as a session outcome (though still cited by a majority), and may not be a good posture for the future.  [Indeed, verbal comments tended to emphasize the positive aspects of SAUM providing resources, but not “advocating” for assessment per se.]

·        The most valued aspects of the Section forums, as indicated by participants’ verbal responses, were a) the opportunity for discussion and give-and-take about various assessment-related issues, b) knowledge of the resources SAUM was developing, and c) a succinct overview of the topic (though many said they already knew most of this).  A strong theme in these comments was the value of talking about assessment at the departmental and discipline level instead of the usual broad campus-level discussions that many participants had experienced in the past.

·        The most needed topics for future workshops included a) concrete examples of techniques and b) examples of how various organizational and political obstacles were met and overcome (e.g., how to get faculty, students, and deans on board; how to make the process efficient).  The demand was strong for complete case studies that are both concrete and can serve as exemplars proving that assessment at the department level can actually be done.

This pattern of results indicates that the basic goals of the Section meetings within the project are being met.  They also strongly validate the need for the kinds of products that SAUM is in the process of developing—concrete case studies of departmental effort and web-based resources on particular approaches and techniques that are readily available and presented in the language of the discipline.

XI. Appendix 2 – Workshop Evaluations

Summary of Comments at Concluding Session

SAUM Workshops 1 and 2 (Towson)

By Peter Ewell

Two multi-meeting SAUM Workshops concluded at the Burkshire Marriott Hotel and Conference Center in Towson, MD on January 18-20, 2003.  These included a) Workshop 1, which first met in San Diego, CA on January 10-11, 2002 and met a second time in Richmond, VA on May 22-25, 2002, and b) Workshop 2, which first met in Burlington, VT on July 29-31, 2002 and had no second meeting.  The last session of the combined Towson Workshop featured an open discussion of the Workshop experience and provided considerable useful information about its effectiveness and how it might be improved.  Salient points made by participants included the following:

What Worked.  Participants who spoke were overwhelmingly satisfied with the experience, citing that it had enabled them to make real progress in building an assessment project at the departmental level on their own campuses and to find colleagues elsewhere who could support them.  Specific aspects of the Workshop that received favorable comment included the following:

  • Multiple workshops meant that participants could learn things, go back to their campuses and apply them, then return to demonstrate and share.  Virtually everyone who provided comments on the multiple-workshop design maintained that a format featuring several encounters over time provided a far better learning experience than a “one-shot” workshop.
  • The team basis for participation meant automatic colleagueship and mutual support.  Having a team present also meant that it was possible to attend multiple sessions and pool knowledge gained.  Simply being together away from the pressures of everyday campus work to plan next steps was also deemed helpful.
  • The need to present campus progress at each workshop provided peer pressure to keep the process moving.  Teams knew that they were going to have to present something publicly, so they worked hard to have progress to report.
  • Working with other campuses helped build a feeling of being part of a larger “movement” that had momentum.  This was especially important for faculty who felt they had been “thrown into a leadership position on assessment” with little real preparation for this role.  Knowing that others were in the same position and sharing approaches to what to do about it was important.
  • The Workshop helped legitimize the work of developing assessment at participating campuses when the team got to work back home.  The sense of being part of a larger, recognized, and NSF-funded project was important in convincing others that this effort was important.
  • The “mentorship” aspect of the project was especially well-received.  Participants who spoke uniformly felt that the continuing formal and informal guidance provided by the steering committee member assigned to work with their team was an especially beneficial aspect of the workshop experience.
  • Participants appreciated the opportunity to hear from people who could talk about the national scene—the presence and presentations of people like Lynn Steen and Peter Ewell were seen as validating the project from the outside and providing a view of assessment from a wider perspective.
  • The social experience of being together for several days with shared meals and informal conversation helped the process of sharing ideas.  A lot of “outside of class” conversation was felt to be important in both generating new ideas and building solidarity.

What Could Be Improved.  Although overwhelmingly satisfied, participants did advance a number of specific suggestions about what could have been done better.  Among these were the following:

  • Participants in the Burlington (2-session) workshop format would have liked to experience a longer 3-session format like the San Diego group.  Some felt that they were just getting started and needed more time and peer support to develop their projects.  Attitudes toward assessment had also changed by later meetings.  As one participant put it, “earlier workshops were about responding to outside pressure…Towson was more about doing this for ourselves.”  This shift required time to accomplish—outside pressures were deemed important in getting things going, but sustaining them needs the kind of internal motivation that was only developed over time through collective action.
  • Some participants felt the sessions were a bit too crowded at Towson.  There was simply not enough time available to get into discussions in any depth, though it was recognized that there was a lot to cover as well.
  • Some participants at Towson missed a formal opportunity to meet with others from similar campuses—small liberal arts colleges, research universities, community colleges, etc.  This was a feature of the Richmond meeting and was seen as valuable.
  • Participants wanted more formal opportunities to meet with outside resource people.  Although Peter and Lynn were willing to meet with teams informally (and did so with several), the arrangements for this were not announced in advance or scheduled as they were with steering committee “mentors.”
  • Although most who spoke about the topic believed that the three-meeting format was superior to the two-meeting format, some found it beneficial to hear from projects that were at different “stages” of development.  Towson was particularly useful for some members of the Burlington (2-session) group because they could see the progress made by (and could seek advice from) members of the San Diego (3-session) group.  Some kind of interchange between teams at different stages was therefore thought to be a useful design feature for future workshops.

Next Steps for the Project.  In addition to specific suggestions about how the workshops might be improved, participants noted a number of actions that they thought might improve the effectiveness of the SAUM project as a whole.  These included the following:

  • Many participants emphasized the need to “spread the word” in multiple formats.  They felt that the biggest shifts they had experienced were about their attitudes toward assessment, and that this was difficult to communicate to others back on campus who had not experienced such a shift.  Emphasizing attitude and faculty ownership in all forms of communication was thought to be important.
  • The case studies and other resources planned as products of the project were positively supported.  As one speaker put it, “we need another Math Notes #49.”
  • SAUM (MAA) might consider developing teams of people experienced with assessment to visit campuses on request to evaluate departmental assessment efforts and consult with those responsible for them.  On-site peer visitors could not only provide advice but could also help lend legitimacy to campus-based efforts.
  • MAA might work in partnership with organizations like ASA and those related to Quality Assurance to disseminate findings and build additional support.
  • Although several venues for presenting campus work are already available, some participants mentioned the need to expand available opportunities to present results around particular areas of interest in assessment—for example, oral communications in mathematics.  These could also form the basis for a formally constituted Special Interest Group (SIG) in MAA that could sponsor its own gatherings and continue the work.
  • Some participants noted the need for a “primer for the skeptical” that would raise and address the most basic questions about assessment in an engaging manner.  This could be developed from participant experiences and be presented in the format of Frequently Asked Questions (FAQ) or in some other accessible manner.  [Project leaders noted that an FAQ to be posted on the website was already in the works.]
  • With the initial round of Section meetings almost over, some participants felt that SAUM could do more with the Sections on a proactive basis—for example, workshop participants could work actively to disseminate their experiences at their own Section meetings next year through contributed papers or similar sessions.
  • Some participants felt that formal efforts should be undertaken to keep former workshop participants in touch with one another through email exchanges, the project website, etc.  A mechanism to link the project website with individual departmental websites that might contain exhibits, show progress, etc. might also be useful.
  • The need for external benchmark information against which to compare local assessment results was mentioned by some participants.  For example, the project might consider raising the matter of reviving support for existing MAA common examinations like the Calculus Readiness Test to provide campuses with standard benchmark information, for use on a voluntary basis.