David M. Bressoud April, 2009
Student Perceptions of On-line Homework
Evidence of Effectiveness
WeBWorK is one of many web-based homework systems designed for mathematics. As the largest such open source system, now used at over 150 colleges and universities, it is the one that has been studied most extensively. In addition to multiple choice responses, this system can grade free response numerical answers, free response answers involving mathematical expressions, and, in fact, any type of answer for which it is possible to write programmed instructions to determine correctness. The software is capable of recognizing equivalent answers. Each student sees individualized problems, gets immediate feedback on whether or not the problem is correct, and is encouraged to continue reworking the problem until she or he gets the correct answer. The instructor gets detailed statistical information on work on the assignment, both at the personal and class level.
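Recognizing that two differently written expressions are equivalent is the heart of such a grader. A common technique (used here as a minimal, hypothetical sketch in Python — WeBWorK itself is written in Perl and its actual checker is more sophisticated) is to evaluate both the correct answer and the student's answer at random test points and compare the values:

```python
import math
import random

def answers_equivalent(correct, student, var="x", samples=20, tol=1e-6):
    """Judge two single-variable expressions equivalent by comparing their
    values at randomly chosen test points, as an automatic grader might.
    Expressions use Python syntax with functions from the math module."""
    env = {"__builtins__": {}}
    env.update({k: getattr(math, k) for k in dir(math) if not k.startswith("_")})
    rng = random.Random(42)  # fixed seed so grading is reproducible
    for _ in range(samples):
        names = dict(env)
        names[var] = rng.uniform(0.1, 5.0)  # avoid 0 to dodge singularities
        try:
            a = eval(correct, names)
            b = eval(student, names)
        except (ValueError, ZeroDivisionError, OverflowError):
            continue  # skip points where either expression is undefined
        if abs(a - b) > tol * max(1.0, abs(a)):
            return False
    return True

# Equivalent forms of the same answer are both accepted:
print(answers_equivalent("sin(x)**2 + cos(x)**2", "1"))    # True
print(answers_equivalent("(x + 1)**2", "x**2 + 2*x + 1"))  # True
print(answers_equivalent("x**2", "2*x"))                   # False
```

Numerical comparison at random points sidesteps the hard problem of symbolic simplification while still accepting any algebraically equivalent form the student happens to type.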
Unlike some of the other systems, WeBWorK provides no hints or templates for working types of problems, but the WeBWorK page includes a button that enables the student to send an email message for help to the instructor, giving the instructor complete control over how much assistance to offer. This does result in increased email contact between instructors and students, an effect that most instructors appreciate.
After many years of development at the University of Rochester by Michael Gage and Arnold Pizer, the MAA is now in the early planning stages of taking on responsibility for hosting and supporting WeBWorK. In the meantime, MAA hosts the home page, webwork.maa.org, which describes WeBWorK and connects to all of the resources, including software downloads, instructions and advice on how to set it up and use it, user discussion groups (both basic and advanced), and links to extensive libraries of problems. For those interested in trying it out in a few classes but not yet ready to set it up on their own servers, Gage and Pizer have hosted a limited number of WeBWorK courses for other schools on a University of Rochester server. The MAA intends to provide this service in the future.
The advantages of WeBWorK include the simplicity of setting it up, the extensive libraries of problems written for a wide variety of courses, the existence of a large network of experienced users who are eager to share their knowledge and insights, and the ease with which instructors can use its Perl-based language to create their own questions. In addition, the fact that this is all freeware has encouraged many individuals to work on its further development.
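A WeBWorK question is a template with randomized parameters, so every student sees a different instance of the same problem. As a language-neutral sketch (Python with hypothetical names — actual WeBWorK problems are written in its Perl-based PG language), the idea looks like this:

```python
import random

def make_derivative_problem(seed):
    """Sketch of an individualized, auto-gradable problem template.
    Each student's seed yields different coefficients, but every
    instance exercises the same skill and carries its own answer key."""
    rng = random.Random(seed)       # per-student seed -> per-student problem
    a = rng.randint(2, 9)
    n = rng.randint(2, 5)
    statement = f"Differentiate f(x) = {a}x^{n}."
    answer = f"{a * n}x^{n - 1}"    # power rule: d/dx a*x^n = a*n*x^(n-1)
    return statement, answer

stmt, ans = make_derivative_problem(seed=1001)
print(stmt)
print("correct answer:", ans)
```

Because the seed determines the instance, the system can regenerate a student's exact problem at any time, and neighboring students cannot simply copy each other's numeric answers.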
Jim Glimm, former President of the AMS, was so taken by the effectiveness of WeBWorK at SUNY-Stony Brook that he has spurred an NSF-funded survey, being run this month under the direction of Alan Tucker, of all departments now using computer software to grade homework. Glimm has seen improvement in the performance of calculus students at Stony Brook: "The key mechanism for this improvement seems to be that the students find their homework to be far more rewarding and do more of it, and, not surprisingly, do learn more."
Other institutions that have seen improvements in student performance that they attribute to student use of WeBWorK include Columbia University, Morgan State University, Alfred University, The College of New Jersey, University of Portland, and Eastern Michigan University. More common than actual claims of improved student grades are positive impressions of student engagement with homework and reports that both students and faculty like it and find it easy to use. Testimonials to this effect have come in from the University of Michigan, Washington University, University of Utah, University of North Carolina-Charlotte, University of California-Irvine, University of California-Santa Barbara, Alabama State University, University of Maine, University of Kentucky, Arizona State University, and University of Wisconsin-Madison.
All of these are impressions that individuals have submitted of what they believe is happening to the students in their classes. Even though WeBWorK and other comparable on-line homework systems have been around for about ten years, there has been relatively little careful study of student attitudes and the effectiveness of using such systems. In the remainder of this article, I will summarize what I have found in the literature. Virtually all of it relates specifically to WeBWorK. Because WeBWorK was one of the first such systems, because it is very widely used, and because it is freely available, it is the system that has attracted the most attention from researchers in this area.
Student Perceptions of On-line Homework
One of the first groups of researchers to study WeBWorK was Roth, Ivanchenko, and Record. They surveyed the attitudes of 2387 students over the period 2002–04 at all levels of calculus as well as Discrete Mathematics, Differential Equations, and Linear Algebra. On a Likert scale of 1 to 5 (1 = never, 2 = almost never, 3 = sometimes, 4 = almost all the time, 5 = all the time), students were asked about
- Preference ("I prefer WeBWorK over paper and pencil homework"),
- Persistence ("Get an entire assignment correct." "WeBWorK forces me to keep up with the class material."),
- Effectiveness ("WeBWorK effectively prepares me for course examinations." "The immediate responses I get from WeBWorK help me learn the course material."), and
- Frustration ("Get frustrated with and give up on a particular problem due to mathematical difficulty." "Get frustrated with the syntactic requirements of answers you submit to WeBWorK.").
Average responses were 3.5 (between "sometimes" and "almost all the time") on Preference, 3.8 on Persistence, 3.2 in the first year rising to 3.3 in the second and third years on Effectiveness, and 2.4 falling to 2.2 by the third year on Frustration.
Several changes were incorporated into WeBWorK after the first year of this study, including a Preview feature that lets students see the typeset version of an expression before submitting it, which greatly reduced student frustration, and a warning that appears when a student attempts to submit an answer that has already been submitted and identified as incorrect. Before these warnings were incorporated, a third of all incorrect answers were previous incorrect answers that students had re-entered. This high fraction of resubmitted wrong answers reflects a weakness of online grading that was also found by Gotel, Scharff, and Wildenberg: students are less likely to trust the software than they would their instructor when told that an answer is incorrect.
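The resubmission warning is simple bookkeeping: remember every answer already marked wrong and intercept exact repeats before grading. A minimal sketch (hypothetical names, not actual WeBWorK code):

```python
def grade_with_warning(submission, correct, history):
    """Grade one submission, warning the student if this exact answer
    was already submitted and marked incorrect (tracked in `history`)."""
    if submission in history:
        return "warning: this answer was already submitted and marked incorrect"
    if submission == correct:
        return "correct"
    history.add(submission)  # remember the incorrect attempt for next time
    return "incorrect"

seen = set()
print(grade_with_warning("2x", "2*x", seen))   # incorrect
print(grade_with_warning("2x", "2*x", seen))   # warning: this answer was already submitted and marked incorrect
print(grade_with_warning("2*x", "2*x", seen))  # correct
```

Catching a third of incorrect submissions this cheaply suggests why even small interface changes to an online system can have outsized effects on how students use it.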
Also in 2002, Hauk and Segalla studied student reactions to WeBWorK for College Algebra at a "large publicly funded university in the western United States." They found that students were comfortable using computers to submit homework and were just on the positive side of neutral about WeBWorK's ease of use.
Evidence of Effectiveness
I have only been able to find three controlled studies of the effectiveness of an online homework system in improving student grades or mastery of the mathematics. All of these studies looked at the effect of WeBWorK. The bottom line is that
- If nothing else changes, then WeBWorK produces at most a very modest improvement in student performance. It clearly has no negative effect on student performance.
- There is some indication that WeBWorK does more to improve the performance of women than of men. What is clear is that neither gender is disadvantaged by the introduction of WeBWorK. There is no evidence that any ethnic group is either advantaged or disadvantaged by the introduction of WeBWorK.
- The greatest measurable benefits of online homework are that assignments that previously went ungraded can now be graded, and that instructors who had been hand-grading homework are freed to spend that time on other activities that can improve teaching and learning.
A brief summary of the studies is given below.
Weibel and Hirsch, Rutgers University, 2001 [9,12]. The study was conducted in Fall 2001 in Calculus I for non-science majors, using 1175 students for whom SAT scores were available. The control group, which did not use WeBWorK, consisted of 368 students. The remaining students had the same instruction format but with the addition of WeBWorK assignments. The researchers found a slight (about 1/4 of a grade) but statistically significant improvement by the WeBWorK students, even after controlling for scores on the placement exam. They found no gender effect. They did find that completion of at least 80% of the WeBWorK problems resulted in a substantial improvement in average grade (at least half a letter grade), and that first-year students were much more likely to complete the WeBWorK assignments than were students past their first year.
Hauk, Powers, Safer, and Segalla, a "Big Public University," 2002. This study was conducted in Fall 2002 with the 644 students enrolled in 19 sections of College Algebra. The control group consisted of seven sections with 236 students. Three instructors taught both a control section and a section using WeBWorK. Student performance was measured using a standardized multiple choice test given at the start and end of the term, as well as several problems that all instructors were required to use in the final exams. No statistically significant differences (at the 0.05 level) were found between the control and the WeBWorK sections, with or without controlling for SAT scores. There also were no statistically significant differences when the data were broken down by socio-economic status or ethnicity. They did find a small but statistically significant difference for women, who did better in the WeBWorK sections.
Dedic, Rosenfield, and Ivanov, Concordia University, Montreal, 2006. This study was conducted in Fall 2006 with 354 students studying Calculus I for Social Science majors, distributed over nine classes, three of which were assigned to each of three conditions: C1 (control, 118 students) consisted of class lectures and paper and pencil homework, C2 (only WeBWorK, 114 students) was identical to C1 except that homework was to be submitted using WeBWorK, and C3 (WeBWorK plus, 122 students) consisted of the same lectures as the other groups but also included one hour per week spent with the instructor in a computer lab working on the WeBWorK problems. The study found no significant differences between the students in groups C1 and C2. It did find significant differences between the students in these two groups and those in C3. This was a highly nuanced study that considered many factors, including prior achievement, prior knowledge of algebra and functions, motivation, sense of self-efficacy, and gender. The students in group C3 demonstrated greater achievement, greater perseverance (completing more of the assignments), and a greater improvement in sense of self-efficacy. This last was especially notable for women.
There is one other study that is worth mentioning, an ex post facto analysis by Dufresne et al. of introductory physics classes at the University of Massachusetts-Amherst during 1997–2000. They found that students using online homework scored a third of a standard deviation higher than students in classes that collected paper and pencil homework assignments. In addition to the fact that faculty self-selected who would use the online system and no attempt was made to provide uniform instruction, the two groups differed in that only a few selected problems from each paper and pencil assignment were graded, while all of the online homework problems were graded.
MathFest in Portland, Oregon this coming summer will feature a panel discussion on "Assessing the effectiveness of online homework" by Helena Dedic, Steven Rosenfield, Vicki Roth, and Michel Scott. There will also be a contributed paper session on "Online Homework: Innovation and Assessment" at the Joint Math Meetings in San Francisco in January 2010.
The first and clearest conclusion is that WeBWorK does no harm to student learning, while at the same time freeing instructor time to focus on other aspects of the course. This is a significant benefit. The second conclusion is that just adopting WeBWorK or a comparable online homework system may do some good, but one should not expect a dramatic improvement in student performance if nothing changes except the adoption of online homework.
Like the clickers that I discussed last month ("Should students be allowed to vote?"), the technology is significant not because it improves student performance, but because it facilitates other changes that can improve student performance. One would hope that faculty who are freed from grading the kinds of procedural problems that are most easily handled by online systems will use the saved time to work with students on higher order questions. The most common complaint from students who must submit homework using an online system is that they miss receiving personalized feedback. At most institutions that are considering the adoption of an online homework system, such personalized feedback has been impossible. Online homework systems may now actually facilitate it.
Cassady, J. C., J. Budenz-Anders, G. Pavlechko, W. Mock. 2001. The effects of internet-based formative and summative assessment on test anxiety, perceptions of threat, and achievement. Presented at the Annual Meeting of the American Educational Research Association (Seattle, WA, April 10–14, 2001). 12 pages. ED453815 at http://eric.ed.gov
This study shows the benefits of using online assessment tools because of the decrease in test anxiety that results when students are able to schedule tests at a time that is most convenient for them.
 Dedic, H., S. Rosenfield, I. Ivanov. 2008. Online assessments and interactive classroom sessions: a potent prescription for ailing success rates in Social Science Calculus. 210 pages. http://sun4.vaniercollege.qc.ca/PA-2005-008
This is the most recent and thorough controlled study of the effect of online homework.
 Denny, J. and C. Yackel. 2005. Implementing and teaching with WeBWorK at Mercer University. Proceedings of the 2005 ASCUE Conference. pp. 85–93. http://fits.depauw.edu/ascue/Proceedings/2005/p85.pdf
This article describes how WeBWorK has been implemented at a particular university and includes a discussion of the process of bringing the faculty up to speed on using it.
 Dufresne, R., J. Mestre, D. M. Hart, and K. A. Rath. 2002. The effect of web-based homework on test performance in large enrollment introductory physics courses. Journal of Computers in Mathematics and Science Teaching. 21(3):229–251.
This ex post facto study showed a significant improvement of students in sections using web-based homework over those using paper and pencil homework. The likely explanation is that using the web-based homework enabled all homework problems to be graded whereas only a few problems were graded under the paper and pencil system.
Gage, M., A. K. Pizer, V. Roth. 2003. WeBWorK: generating, delivering, and checking math homework via the internet. In Proceedings of the Second International Conference on the Teaching of Mathematics. New York: Wiley. http://www.math.uoc.gr/~ictm2/Proceedings/pap189.pdf
This is a helpful overview of WeBWorK: what it is, what it does, and how it can be used. It also includes a summary of student responses.
Gotel, O., C. Scharff, A. Wildenberg. 2007. Extending and contributing to an open source web-based system for the assessment of programming problems. ACM Principles and Practices of Programming in Java. pp. 3–12. http://portal.acm.org/citation.cfm?id=1294325.1294327
This paper shows how WeBWorK can be used to support a programming class.
 Hauk, S., R. A. Powers, A. Safer, A. Segalla. 2004. Impact of the web-based homework program WeBWorK on student performance in moderate enrollment college algebra courses. Preprint. http://hopper.unco.edu/faculty/personal/hauk/segalla/WBWquan.pdf
This was a study at a "Big Public University" that saw no statistical difference in grades between those who were in WeBWorK classes and those who were not. Broken down by ethnicity, they also saw no difference. But there was a statistically significant improvement for women when WeBWorK was used.
 Hauk, S. and A. Segalla. 2005. Student perceptions of the web-based homework program WeBWorK in moderate enrollment college algebra classes. Journal of Computers in Mathematics and Science Teaching. 24(3): 229–253.
This is a continuation of the study described in the preceding entry, focusing on student attitudes toward WeBWorK and how they used it.
 Hirsch, L. and C. Weibel. 2003. Statistical evidence that web-based homework helps. MAA Focus. 23: 14.
This is a summary of the results in Weibel and Hirsch (2002).
 LaRose, P. G. and R. Megginson. 2003. Implementation and assessment of on-line gateway testing at the University of Michigan. PRIMUS. 13(4):289–307. http://instruct.math.lsa.umich.edu/gw/primus.pdf
This paper describes how an on-line homework system has been used for gateway tests, significantly lowering workload while being at least as effective as paper and pencil tests in verifying that students have achieved basic mastery of computational skills in calculus.
 Roth, V., V. Ivanchenko, N. Record. 2008. Evaluating student response to WeBWorK, a web-based homework delivery and grading system. Computers & Education. 50: 1462–1482. Available through ScienceDirect: http://www.sciencedirect.com
This detailed study looks at student response to WeBWorK (very positive) and investigates how students actually use it across a wide variety of courses: calculus, multivariable calculus, discrete mathematics, differential equations, and linear algebra.
Weibel, C. and L. Hirsch. 2002. WeBWorK effectiveness in Rutgers Calculus. Preprint. 18 pages. http://www.math.rutgers.edu/~weibel/webwork.html
This reports on the controlled study at Rutgers University. The authors saw no statistical difference in grades between those who were in WeBWorK classes and those who were not. But there was a statistically significant half grade improvement (C+ to B) among those who were in WeBWorK classes and who did at least 80% of the online problems.