DRILL 3.1

Author: Vadim Ponomarenko, Assistant Professor of Mathematics, Trinity University

Distance learning can offer a solution to a long-standing challenge of undergraduate education: how to assign an appropriate amount of work to each student, and how to assess this work efficiently. In this paper I describe the Depository of Repetitive Internet-based probLems and Lessons (DRILL, [Link no longer works, Ed. 2013]), an online system for providing education and assessment for precalculus, which can substitute for routine problems given in homework and as exam questions. Its special features include:

  • adaptive testing,
  • on-the-fly question generation,
  • instant assessment,
  • context-sensitive help,
  • question balancing, and
  • two-dimensional test design.

Acknowledgements

I would like to thank an anonymous referee for the helpful suggestions used in the preparation of this article. I also thank the DRILL Project Team, and particularly Alex Kolliopoulos, for all their hard work.

The DRILL Project gratefully acknowledges the support of the Andrew W. Mellon Foundation, the Associated Colleges of the South, Trinity University, and the Herndon Fund for their generous financial support. It also acknowledges the University of Wisconsin, Trinity University, and the Department of Mathematics of Trinity University for their support in hosting the DRILL web site.

Published December, 2003
© 2003, Vadim Ponomarenko

DRILL 3.1 - Motivation

Calculus and precalculus are the most important subjects taught in mathematics departments at colleges and universities -- a large proportion of all college students take calculus or some sort of precalculus at some point before graduation. This cannot be said of any other subject in the mathematics curriculum. For example, in Fall 2003 the Trinity University mathematics department offers 11 sections of calculus and 3 sections of precalculus; all other courses together comprise 10 sections. Furthermore, calculus is a prerequisite to many other courses, so an improvement in calculus and precalculus education would benefit almost all students taking mathematics during their undergraduate careers.

Calculus and precalculus education typically attempts to train students in a wide variety of skills. These frequently include the following major goals:

  1. Competence: Achieving fluency in mechanical computation and calculation of formal mathematical exercises.
  2. Modeling: Understanding and practicing the methods by which real-world problems can be modeled with formal mathematics.
  3. Proof: Learning the patterns of thought that allow the construction and comprehension of a mathematical argument.

Each of these has value in its own right, as well as value to support understanding of material learned in subsequent courses, mathematics or otherwise.

Of these three goals, the most basic and most easily assessed is the first, Competence. To reach this goal, all students must practice. Because of the variation in skill and preparation among students, some will require substantially more practice than others on any given topic. The purpose of DRILL is to help the instructor bring students to this single goal. Mechanizing the practice separates it somewhat from the instructor, which allows the instructor to focus more time and effort on the other goals.

DRILL 3.1 - Three Paradigms

Traditionally, instructors have had to choose from only two paradigms for having students practice problems: in-class or at-home. Each suffers from certain inadequacies.

Problems done in class are subject to a very strong time restriction -- they consume a limited resource. Unless the course is structured to provide abundant in-class time, either some students will receive inadequate practice or some other goals must be sacrificed. This problem extends to assessment -- by using limited test/quiz time to assess, other class goals must be sacrificed.

Problems done at home suffer from two important defects. First, there is normally a significant delay between when a student performs the work and when that student receives formative assessment. This can range from days to weeks and often completely erases the benefit, as the focus of the course has shifted by then. Second, it is impossible to monitor the level of collaboration or the amount of effort each student puts in. Susie might have found the assignment difficult and spent ten grueling hours, Sally found it easy and spent forty minutes, and Cindy copied her answers in ten minutes. How can an instructor properly teach all three students when their final work looks the same?

These two paradigms share another problem: lack of customization. Susie could have used more practice, since she didn't really understand the material, while Sally found the last few problems boring and lost interest in the subject. This issue is particularly acute in prerequisite material -- the variability among students is enormous, as their mathematical histories before a given course can be quite different.

We need a third paradigm that does not suffer from these drawbacks. Problems solved online are this needed alternative. Students work problems on their own time, which is an essentially unlimited resource. With careful design, an online system can repair all of the other problems as well. It can give the instructor assessment information, relieving the need to spend class time. It can give immediate feedback, while the topic is still fresh and relevant. It can be secure, reducing or eliminating collaboration. It can monitor the time students spend. And it can deliver assignments of varying lengths -- Susie will get more problems, and Sally will get fewer.

In fact, this third paradigm can completely replace the two traditional paradigms for reaching the Competency goal, permitting the instructor to focus more energy on the other goals. Online problems can be as intensive as homework and as secure as exams.

DRILL 3.1 - Philosophy

A good online system should repair the problems found in the other two paradigms without introducing too many new ones. Some criteria I feel are desirable:

  • Adaptive Testing
    • The duration of practice should be individually tailored to each student. Students who need more problems should get as many as they want.
  • Context-Sensitive Help
    • Students should get formative assessment appropriate to their needs.
  • Correctness
    • No new major difficulties or drawbacks should be introduced. The system should be stable, unambiguous, and mathematically correct.
  • Instant Assessment
    • Problems should be graded instantly -- students should get immediate feedback.
  • On-The-Fly Question Generation
    • Problems should be individually tailored to each student, so no two students get the same problems. This will discourage collaboration for its own sake, while permitting appropriate collaboration. Also, students may repeat the practice without encountering the same questions.
  • Security
    • The system should not be plagued with security flaws that would allow students to cheat or hackers to disrupt it.
  • Simplicity
    • The system should be extremely easy to use for both students and instructors.
    • It should require learning an absolute minimum of non-subject material.
    • It should be friendly to people with varying degrees of computer literacy.
  • Summative Assessment
    • Statistics should be kept about the difficulties students have with the problems. This will eliminate the need for separate assessment later.

Furthermore, I believe the system should be free to use, for two philosophical reasons and one practical one:

  • First, the spirit of the World Wide Web is that of free and easy access to information. For example, consider the growing popularity of Linux and projects such as Wikipedia.
  • Second, the spirit of fair educational use is also that of free access.
  • Finally, budgets of students, educators, and educational institutions are often quite slim and do not allow for unnecessary expenditures. If a fee is charged, a significant portion of the potential users will stay away. The increasing popularity of for-pay systems only reflects a frustration with the lack of equally good free alternatives.

DRILL 3.1 - Other Systems

The need for a third paradigm has been well established, and consequently there has been tremendous growth in the development of online mathematics education delivery systems. For example, there are a number of for-profit systems.

Some of these have similar objectives, some are more modest -- not offering on-the-fly question generation, for instance -- while others are more ambitious, e.g., attempting to satisfy pedagogical goals other than Competence.

We are also aware of several free systems of this general nature.

Editor's note, 12/16/04: Links to three of these are broken and have been removed.

While these systems have many excellent features, they are all difficult for the instructor to use. To use any of these systems, one would have to be familiar with several of: HTML, DHTML, cgi scripts, LaTeX, Perl, UNIX, and an assortment of custom languages and language extensions. This obstacle may prevent some instructors from adopting an otherwise outstanding and worthwhile package.

DRILL 3.1 - Basic Features

You can find DRILL at http://lagrange.math.trinity.edu/drill. The front page has some introductory information and a link to allow guests to preview the system.

introductory screen

It continues with an area for instructors to log in or for new instructors to create an account.

instructors' login

The final part of the opening page is the area where students can log in or create a new account.

students' login

DRILL 3.1 - System Requirements

DRILL is run server-side through PHP scripts (see Bennett, 2002). Consequently, users should be able to use any browser that is HTML 2.0-compliant, which includes most browsers currently in use. DRILL has been tested with Internet Explorer 6.0, Netscape 7, and Opera 7.1, although it should work with considerably older versions. The only user requirement is that cookies must be enabled -- they are used as part of the security process.

DRILL 3.1 - Student Accounts

Students have a variety of options once they have logged into their account. Their main page looks like this.

students' main page

By clicking on their name, they can access their account management screen. Here they can enter an e-mail address and change their account password.

account management screen

Students need to add all classes in which they are enrolled that use DRILL. This is done in the following screen.

student adds a class

Once a student has added a class, the instructor's exams for that class are visible. The student can see performance statistics, take an exam for credit, or practice an exam.

students' exams

Details of what happens during the exam are discussed in the special-feature sections below.

DRILL 3.1 - Instructor Accounts

Instructors have a variety of easy-to-use options at their disposal. The first screen they encounter is this:

instructor homepage

By clicking on their name, instructors can access their account management screen. Here they can enter an e-mail address and change their account password.

account management screen

Instructors can create, manage, or view their classes. Here is the create-class screen:

class creation

In the class management screen, an instructor can assign exams to a class.

class management

When viewing a class, the instructor gets a wealth of statistics. Each student is listed, with student ID and a link to send e-mail. Each student's performance on each exam is listed.

view class, page 1

Also, an instructor gets a list of all assigned exams and can see their details.

view class, page 2

The instructor's exam-management screen allows several options.

exam management

It is simple to assign an exam to a class.

assign exam

Creating an exam is also easy. The two-dimensional aspect of this process is discussed in the Two-Dimensional Test Design section below, but the routine portions are shown here. The instructor can get help by clicking on any of the underlined topics, which are explained individually below.

create an exam, part 1

Each exam has a unique name. The instructor can select the length, the number of errors a student may make and still pass, a time limit, and an attempt limit. The instructor can also decide whether a student can take the exam for "practice" -- that is, without counting against the above limits -- and whether to give context-sensitive help on errors. Finally, the instructor can decide whether the questions should be evenly distributed among all available question models or chosen at random (with the potential of asking some types more often than others).
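The choice between an even distribution and a purely random one can be sketched as follows. This is a hypothetical Python illustration (DRILL itself is written in PHP, and the names here are mine, not DRILL's):

```python
import random

def pick_questions(models, length, evenly=True, rng=None):
    """Choose `length` question models for one exam sitting.

    With evenly=True, cycle through a shuffled pool so no model repeats
    before every model has been used once; otherwise sample uniformly at
    random, which may ask some types more often than others.
    """
    rng = rng or random.Random()
    if evenly:
        pool = list(models)
        rng.shuffle(pool)
        # repeat the shuffled pool as many times as needed
        return [pool[i % len(pool)] for i in range(length)]
    return [rng.choice(models) for _ in range(length)]
```

With three models and a six-question exam, the even option asks each model exactly twice; the random option offers no such guarantee.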

DRILL 3.1 - Security

Though invisible to the users, a unique session ID is generated with each log-in. This ID is kept in cookies on the user's computer and sent to the server with each mouse click. The server keeps track of each session. It knows the associated user and what he or she is working on.

This has many benefits. A student cannot click "back" on the browser during an exam, because the server knows where the user is and will not serve up the previous question. Similarly, it is impossible to bookmark any page other than the opening page. However, if a student experiences power failure during an exam, the server will return the student to the exact question he or she was on after re-logging in (though time will still pass). If an experienced user tries to forge someone else's session ID by copying cookies or packet sniffing, the server will recognize the situation and foil the attempt. Once a user logs out, the server expires the session ID and the user's account is completely secure.
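The session mechanism described above can be sketched in a few lines. This is a minimal Python illustration with hypothetical names -- DRILL itself implements it with PHP server-side sessions:

```python
import secrets

class SessionStore:
    """Minimal sketch of server-side session tracking."""

    def __init__(self):
        self._sessions = {}  # session ID -> {"user": ..., "state": ...}

    def log_in(self, user):
        # An unguessable ID is generated per log-in and sent as a cookie.
        sid = secrets.token_hex(16)
        self._sessions[sid] = {"user": user, "state": "main_page"}
        return sid

    def request(self, sid, action):
        # Every click must carry a valid, live session ID; a forged or
        # expired ID is rejected before any page is served.
        session = self._sessions.get(sid)
        if session is None:
            raise PermissionError("unknown or expired session")
        # The server, not the browser, records where the user is, which
        # is why "back" and bookmarks cannot reach an old question.
        session["state"] = action
        return session["user"], session["state"]

    def log_out(self, sid):
        self._sessions.pop(sid, None)  # expire the ID so it cannot be replayed
```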

Apart from determined attack from experienced hackers, the only vulnerability of the system is collaboration: having one student take the test for another one. Instructors concerned about this vulnerability can schedule monitored sessions in a computer laboratory. This removes much of the benefit of having the exams on-line, but is useful periodically to keep the students completely honest.

DRILL 3.1 - Special Feature: Adaptive Testing

Different students need different amounts of practice. This is achieved in DRILL by allowing the instructor several options. One way is to set the pass level high (such as 100%) and the number of allowed attempts greater than one. This way, a student can pass the exam on the first try, or can take it many times if more practice is needed. If a student fails an exam, the following is displayed:

exam ends with failure

The student who passes an exam sees:

exam ends with success

Another instructor option is practice mode, whose details are displayed below. Practice mode gives feedback but doesn't count against attempts.

practice options
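The bookkeeping behind these options is simple. A sketch, with hypothetical names (DRILL itself is written in PHP):

```python
def grade_attempt(errors, allowed_errors, attempts_used, attempt_limit,
                  practice=False):
    """Decide the outcome of one exam sitting.

    A high pass level (allowed_errors = 0) combined with an attempt limit
    greater than one lets a student pass on the first try or retake the
    exam for more practice. Practice sittings give the same feedback but
    never count against the attempt limit.
    Returns (passed, attempts_used, can_retry).
    """
    passed = errors <= allowed_errors
    if not practice:
        attempts_used += 1
    can_retry = attempts_used < attempt_limit
    return passed, attempts_used, can_retry
```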

DRILL 3.1 - Special Feature: On-The-Fly Question Generation

It would be pointless to allow students to retake an exam if it were identical each time. Students could simply memorize the answers -- not exactly a desirable sort of learning. The simplest measure is to vary the order of the questions -- DRILL does this -- but further, its test bank contains question models, not specific questions. Each time a question is presented, the parameters are varied.

The following question presents a false model of addition. The next time this question appears, the false model will remain, but the functions will likely be quite different.

exam question

Since the DRILL server generates questions on the fly, it is a trivial matter for it to also grade the answers. This immediate feedback is very helpful to students. In part to allow this convenient feature, the only question types currently supported are true/false and multiple choice. In the future, we plan to expand to include limited fill-in-the-blank questions as well.
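A question model of this kind can be sketched briefly. The model below is a hypothetical Python illustration (not an actual DRILL question) of the false rule (a + b)^n = a^n + b^n; it produces fresh parameters, and the correct grade, on every call:

```python
import random

def linearity_question(rng=None):
    """One instance of a question model testing the false rule
    (a + b)^n = a^n + b^n, with fresh parameters each time."""
    rng = rng or random.Random()
    a, b = rng.randint(1, 9), rng.randint(1, 9)
    n = rng.choice([2, 3])
    text = f"True or false: ({a} + {b})^{n} = {a}^{n} + {b}^{n}"
    # The correct answer is computed alongside the text, so the server
    # can grade a response the instant it arrives. For positive a, b
    # and n >= 2 the statement is always false.
    answer = (a + b) ** n == a ** n + b ** n
    return text, answer
```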

DRILL 3.1 - Special Feature: Context-Sensitive Help

Having instant assessment is of limited utility if the student does not understand what went wrong. For this reason DRILL questions are written concurrently with detailed help, which is provided in response to any incorrect answer. This help is customized to the specific question at hand, including the parameters involved.

incorrect answer
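Because the help is generated from the same parameters as the question, it can walk the student through the exact numbers at hand. A hypothetical sketch, continuing the (a + b)^n illustration above (the wording is mine, not DRILL's):

```python
def help_for_wrong_answer(a, b, n):
    """Render help text customized to the question's parameters."""
    lhs = (a + b) ** n
    rhs = a ** n + b ** n
    # Substitute the student's actual numbers into the explanation.
    return (f"Careful: ({a} + {b})^{n} = {a + b}^{n} = {lhs}, "
            f"but {a}^{n} + {b}^{n} = {rhs}, so the two are not equal.")
```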

DRILL 3.1 - Special Feature: Question Balancing

To reduce the likelihood that a student can get a question right from its appearance alone, we have invested significant effort in balancing the questions. Each true/false question model is written in two halves, one true and one false, both with very similar appearance. Furthermore, the phrasing and arrangement of the questions are varied. This discourages students from cuing on appearance alone, and it effectively increases the size of the question bank.
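A two-half model can be sketched like this (hypothetical Python; the distributive-law wording is illustrative, not an actual DRILL question):

```python
import random

def balanced_tf_question(rng=None):
    """A question model written in two halves: a true variant and a
    false variant with very similar surface appearance."""
    rng = rng or random.Random()
    a, b = rng.randint(2, 9), rng.randint(2, 9)
    if rng.random() < 0.5:
        # True half: a correctly distributed product.
        return f"True or false: {a}(x + {b}) = {a}x + {a * b}", True
    # False half: same surface form, but the constant is not multiplied.
    return f"True or false: {a}(x + {b}) = {a}x + {b}", False
```

Since both halves look alike, a student must actually check the arithmetic rather than recognize the shape of the statement.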

DRILL 3.1 - Special Feature: Two-Dimensional Test Design

Two-dimensional design is a feature unique to DRILL that sprang from a desire to give instructors a way to generate custom exams without needing to wade through countless questions by hand. A test is generated by selecting certain skills to be tested, as well as certain objects to which these skills shall be applied. The intersection defines the questions on the exam.

For example, an instructor might want to test the skill of solving one-variable equations applied to the object of polynomials, or the skill of factoring applied to both polynomials and exponential functions. Once the instructor has selected the skills and objects, the server searches through the test bank and reports how many questions fit the desired criteria. The instructor does not need to select them individually -- this is done automatically.
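The selection step amounts to intersecting two sets. A minimal sketch, assuming each question model in the bank is tagged with one skill and one object (the bank entries and names below are hypothetical):

```python
def select_questions(bank, skills, objects):
    """Return the question models at the intersection of the chosen
    skills and objects.

    `bank` maps model name -> (skill, object); `skills` and `objects`
    are the sets of checkboxes the instructor ticked.
    """
    return [name for name, (skill, obj) in bank.items()
            if skill in skills and obj in objects]
```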

Pictured below are sample choices for the skills and objects. The first checkbox in each list is a "none of the below" choice. Some questions may involve a skill but none of the special objects, or vice versa.

skills

objects

DRILL 3.1 - Results and Limitations

Previous versions of DRILL offered only one exam, covering the topic of common algebra errors made by calculus students. It has been offered to students in first-semester calculus four times, dating back to 1998. Students were required to take the untimed exam at least once, and a small incentive was provided for successful completion. Students were free to take it wherever, whenever, and as many times as they wanted. Some students took advantage of this freedom to take the exam hundreds of times. Generally all exam-related activity took place within the first two weeks of the course.

A total of 145 students have taken that exam at least once, of which 68 completed it successfully. The others did not manage to receive a passing score, which was an unforgiving 100%. We distinguish here two groups of students: those who went on to earn less than a B in the course (moderate to poor calculus performance), and those who went on to earn less than a C (poor calculus performance). Naturally, we want to decrease the size of each of these groups, which would correspond to more students doing well. The following table shows aggregate data -- details are available upon request.

             % of students who    % of students who
             passed the exam      did not pass the exam
Earned < B          43                    68
Earned < C          19                    44

We can offer two explanations for the dramatic semester-long differences between groups that did or did not complete a single exam. Likely, both are true to some degree.

  • First, DRILL acted as a diagnostic tool. Though we did not do this during the four semesters of this experiment to avoid skewing the data, one could use the information provided by DRILL to identify weaker students and target them with additional help and resources.
  • Second, DRILL acted as an educational tool. Those students who completed the exam thoroughly reviewed their algebra skills, improved their weak points, and were fully prepared for the semester to come. Those who did not complete the exam found their algebra skills hampering them and fell behind. Though probably rather weak, this effect would be more pronounced in those students who spent ten or more hours taking and re-taking the exam.

DRILL still has a limited set of question models -- certain combinations of skills and objects are not fully implemented. By making additional question models, it could be useful in other courses as well.

DRILL 3.1 - References

Bennett, A. (2002). Using PHP For Interactive Web Pages. Journal of Online Mathematics and its Applications, 2.

Canfield, W. (2001). ALEKS: A Web-based Intelligent Tutoring System. Mathematics and Computer Education, 35(2).

Hirsch, L., and C. Weibel (2003). Statistical Evidence that Web-based Homework Helps. Focus, 23(2).

Sanchis, G. (2001). Using Web Forms for Online Assessment. Mathematics and Computer Education, 35(2).

Xiao, G. (2001). WIMS: An Interactive Mathematics Server. Journal of Online Mathematics and its Applications, 1(1).