Prospectus: Assessment of Quantitative Reasoning in Applied Psychology at Portland State University

Robert R. Sinclair, Ph.D., Dalton Miller-Jones, Ph.D., and Jennifer Sommers, MA
Department of Psychology, Portland State University, Portland, OR 97207
(503) 725-3965; Sinclair@pdx.edu

Psychology has a long history of interest in assessment in a wide variety of contexts. Similarly, psychologists have played a
leading role in much of the scholarly research on learning and behavioral
change. However, only within the last decade have psychologists begun to pay
serious attention to assessment of learning in undergraduate psychology
programs. Perhaps the most tangible
demonstration of this interest is the recent release of Undergraduate Psychology Major Learning Goals and Outcomes: A Report,
a task force study endorsed by the American Psychological Association (Murray, 2002). The study identified learning outcomes for
10 educational goals in psychology [available on-line at
www.apa.org/ed/pcue/taskforcereport.pdf].
Quantitative literacy (QL) plays a prominent role in several goals
described in the report. For example,
the Research Methods in Psychology
goal explicitly focuses on data analysis and interpretation. Similarly, the Values in Psychology goal emphasizes the utility of the scientific
method and the value of using empirical evidence to make decisions. Many psychology programs
heavily rely on math departments to provide their statistical training. This most commonly occurs at the
undergraduate level, but graduate-level programs also either encourage or
require students to take quantitative methods courses taught by math
faculty. In either case, QL is a
critical component of the psychology major, since advanced courses assume
students have a grasp of basic statistical concepts and an understanding of how
to apply them to psychology. Thus, mathematics departments often play a critical role in psychology training. Consequently, strong QL assessment efforts
provide both psychology and math programs with useful data about whether their
statistics training accomplishes each program’s educational objectives. In
1998, the Portland State University psychology program responded to a request
by the Dean of the College of Arts and Sciences that all departments identify
learning goals/objectives their majors should have at graduation. Part of the
motivation for this request was the knowledge that the next round of higher
education accreditation review, at the time 5 years in the future, would
require a focus on authentic indicators of student learning, not just the
traditional set of input data (e.g., the number and kinds of classes taught,
student enrollments). In response to this challenge, the psychology department
crafted an assessment vision involving tracking student learning from our
initial introductory courses, through our research methods and experimental
psychology courses, to our advanced seminars in industrial/organizational,
applied developmental, and applied social psychology. This design enables us to “practice what we preach” by basing our
programmatic decisions on powerful empirical data.

Method

Our initiative began with a series of workshops in which psychology faculty generated approximately 50 valued learning outcomes. These outcomes were organized into 9 broad
learning goals that closely resembled the goals suggested by the APA task force
report described above. Faculty also
indicated which learning outcomes and goals pertained to each of their
courses. We used summary ratings (by faculty)
of these outcomes and goals to establish assessment priorities. We then organized the learning goals into
four categories: Theories and Issues, Student Engagement, Application of
Psychology, and Research Methodology and Statistics. Consistent with our
description above, the faculty ratings suggested that Research Methodology and
Statistics should be the highest priority topic in the assessment
initiative. Our decision to focus on QL
issues also mirrors one of the key ability areas identified by Portland State
University’s faculty senate for our graduating seniors: Quantitative
Reasoning and Representation – the ability to deepen understanding of the value of and need for this type of reasoning, to understand the graphical presentation of data, and to transform information into quantitative and graphical representations.

As we conceive of it, research methodology includes four topics: Research design (e.g., use of experimental, observational, and questionnaire strategies), Scientific method, Reliability and validity (in psychological measurement), and Statistics. QL knowledge forms the foundation of several
of these topics. For example, the statistics area focuses on three concepts:
central tendency, variation, and association.
Upon graduation, we expect students to be able to conduct and present the findings of basic statistical tests in each of these areas, as well as to interpret and critique presentations of these tests in the published empirical literature.
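To make these expectations concrete, here is a minimal sketch of the three statistical concepts named above. The data and variable names are invented for illustration and are not part of our assessment materials; the sketch uses only the Python standard library (statistics.correlation requires Python 3.10 or later).

```python
# Minimal illustration of the three statistics concepts: central tendency,
# variation, and association. All data are hypothetical.
from statistics import mean, stdev, correlation  # correlation: Python 3.10+

# Hypothetical paired observations for five students:
# weekly study hours and exam score (percent correct).
hours = [2, 4, 5, 7, 9]
scores = [55, 62, 70, 74, 85]

print(f"Central tendency (mean score): {mean(scores):.1f}")           # 69.2
print(f"Variation (sample standard deviation): {stdev(scores):.1f}")  # 11.5
print(f"Association (Pearson r): {correlation(hours, scores):.2f}")   # 0.99
```

Our first programmatic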
assessment efforts involved the development of a 20-item multiple-choice exam
covering topics related to research methodology as described above. The test
sampled items from all four topical areas with a substantial portion of the
test covering QL topics. The test was administered during two academic terms to
over 800 students taking a wide array of undergraduate psychology classes (from
freshman to senior level). The classes
were sampled strategically to provide indications of students’ progress as they
entered the major (i.e., at the beginning of the courses in our introductory
sequence), toward the middle of the major, and as they entered senior level
advanced seminars. This strategy
enabled us to compare student performance across the curriculum and to
ascertain whether our training improved students' QL knowledge and skills.
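As a hedged illustration of this comparison strategy (invented scores, not our actual analysis code), the following sketch computes mean percent correct on the exam, grouped by curriculum level:

```python
# Cross-sectional comparison sketch: mean percent correct on the exam,
# grouped by curriculum level. All numbers are invented; the actual
# sample included over 800 students.
from statistics import mean

results = [  # (curriculum level, percent correct) for hypothetical students
    ("introductory", 48), ("introductory", 52), ("introductory", 45),
    ("middle", 55), ("middle", 58), ("middle", 61),
    ("senior", 60), ("senior", 64), ("senior", 59),
]

for level in ("introductory", "middle", "senior"):
    level_scores = [pct for lvl, pct in results if lvl == level]
    print(f"{level:>12}: mean = {mean(level_scores):.1f}% (n = {len(level_scores)})")
```

Findings

Our preliminary statistical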
analyses showed generally low levels of performance across all three levels
(most students answered fewer than 60% of the questions correctly), albeit
with a 14% overall improvement in test scores across the program. The final case will expand on these
analyses, including comparisons of psychology majors with minors and
non-majors, comparisons of students who have and have not taken a 2-course math
statistics sequence or our psychology research methods course, as well as
comparisons of student performance across specific subcomponents of the
test. We will also discuss the programmatic implications of these data and describe our ongoing efforts to track whether our new teaching initiatives are having the desired effects on these scores.
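To illustrate one such planned comparison, the sketch below contrasts exam scores for hypothetical students who have and have not completed the two-course math statistics sequence, using an independent-samples t test from SciPy (an assumed dependency; the scores are invented, and the actual analyses may use different tests):

```python
# Hypothetical two-group comparison: exam scores (items correct out of 20)
# for students with vs. without the two-course math statistics sequence.
from scipy import stats

with_sequence = [14, 13, 16, 12, 15, 14, 13]
without_sequence = [10, 11, 9, 12, 10, 11, 8]

t_stat, p_value = stats.ttest_ind(with_sequence, without_sequence)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Insights

Our preliminary analyses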
indicated two distinct conclusions about quantitative reasoning among
psychology majors. First, we noted
generally poor performance on the exams, suggesting the need for increased
attention to these topics in classes.
Second, we noted consistent patterns of improvement across levels of the
major, suggesting that our current curriculum benefits students. Our efforts to address these findings include several new teaching initiatives.

Conclusions

Each year, we make small but
tangible improvements to the depth and breadth of our assessment
initiative. Along the way, we have had
many opportunities to learn from our mistakes, and even a few opportunities to
benefit from our successes. Perhaps the
most important thing we have come to appreciate is the importance of, and the challenges involved in, aligning course content and course assignments with assessment goals. For example, our decision to consciously focus on QL issues required us to devote additional course time to that topic and to cut the time devoted to other topics. These decisions can be complex, emotionally charged, and even adversarial if not handled properly. Our proposed case will connect our empirical
data analyses back to some of the strategic implications of assessment and
present additional details about our specific conceptual model, our strategic
plans, and recommendations to other programs considering quantitative literacy
assessments.

References

Murray, B. (2002). What psych majors need to know. Monitor on Psychology, July/August 2002.