Additional Online Case Studies & Appendices
Using Practice Test in Assessment of Teacher Preparation Program
Jerzy Mogilski, Jorge E. Navarro, Zhong L. Xu 
University of Texas at Brownsville and Texas Southmost College
In this article, we show that a properly constructed comprehensive exam can serve as a useful tool for assessing student learning. We discuss the assessment of our teacher preparation program, which is currently based on a detailed item analysis of a multiple-choice exam. The exam was taken by seniors (in exceptional cases, by juniors) and graduates preparing for the state exam for teacher certification.
Background and goals
In the state of Texas, the State Board for Educator Certification (SBEC) adopts the accreditation standards for programs that prepare educators and administers the Examination for the Certification of Educators in Texas (ExCET). Program accreditation is based on ExCET pass rates. SBEC uses two types of rates: the first-year pass rate, which is the pass rate for test takers from that school taking the test for the first time, and the cumulative pass rate, which is based on performance over a two-year period. To be rated “Accredited,” a program must achieve a 70% first-year pass rate or an 80% cumulative pass rate. Otherwise, SBEC rates programs as “Accredited under Review” or “Not Accredited.”
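The accreditation rule can be illustrated with a short calculation. The sketch below is our own illustration; the function name and the sample numbers are hypothetical, not SBEC data.

```python
def accreditation_status(first_year_rate, cumulative_rate):
    """Apply the SBEC rule described above: 'Accredited' requires a
    70% first-year pass rate OR an 80% cumulative (two-year) pass rate."""
    if first_year_rate >= 0.70 or cumulative_rate >= 0.80:
        return "Accredited"
    return "Accredited under Review or Not Accredited"

# Hypothetical example: 30 of 50 first-time takers pass (60%),
# but 85 of 100 takers pass over the two-year period (85%).
first_year = 30 / 50      # 0.60 -> below the 70% threshold
cumulative = 85 / 100     # 0.85 -> meets the 80% threshold
print(accreditation_status(first_year, cumulative))  # Accredited
```

Note that either threshold alone is sufficient: a program with a weak first-year rate can still be rated “Accredited” on the strength of its two-year record.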
The University of Texas at Brownsville is a young university, established in 1992, with a current enrollment of 11,000 students. The Department of Mathematics consists of 18 regular faculty members and 3 lecturers. The department offers the B.S. degree in Mathematics with three tracks of study: a non-teaching degree with a minor, teacher certification for grades 8-12, and teacher certification for grades 4-8. The majority of mathematics majors choose a teaching career after graduation. A substantial number of them take the traditional route through the teacher certification program. The Alternative Certification Program and the Deficiency Program provide two non-traditional ways to become a certified teacher with the non-teaching B.S. degree in Mathematics. For this reason, most mathematics majors take the Examination for the Certification of Educators in Texas (ExCET).
Our study, which started in the fall of 1997, was motivated by a major concern of the University administration and the Department of Mathematics about the poor performance of our students on the ExCET for secondary mathematics teachers. According to SBEC, the passing rate of UTB students on the Mathematics test was below 50%. This five-hour exam consists of 90 to 100 multiple-choice questions covering 41 competencies, which are grouped into 5 domains: (1) Mathematical Foundations, (2) Algebra, (3) Geometry, (4) Trigonometry, Analytic Geometry, Elementary Analysis, and Calculus, and (5) Probability, Statistics, and Discrete Mathematics. To understand the reasons for our students' poor ExCET performance, we analyzed the correlation between the curriculum for the mathematics major and the ExCET competencies. Using this correlation, we conducted a systematic assessment of student learning in mathematics for the teacher preparation program.
Details of the Assessment Program
Although the ExCET is effective at evaluating students' knowledge of mathematics, we could not use it in our assessment because SBEC provides only very limited information about each student's performance. Fortunately, we found another test to serve as our measuring tool: the ExCET Practice Test, produced for SBEC by committees of Texas educators and National Evaluation Systems, Inc. This test was strongly correlated with the ExCET and was designed to help staff at educator preparation programs provide feedback on candidate performance in relation to the ExCET test framework. According to the State Board for Educator Certification, the ability to answer 70% of the questions on the ExCET Practice Test correctly indicates sufficient preparation for the ExCET. Since the fall of 1997, the ExCET Practice Test has been adopted as the Mathematics Benchmark Test, and it has been administered to more than 255 students. We summarize here the results of our five-year study. The Mathematics Benchmark Test consists of 150 questions distributed fairly evenly among the 41 ExCET competencies and spread among the 5 mathematical domains listed above. In addition, the authors of the report have divided the 41 competencies into 20 competency groups to obtain a clear correlation between the competencies in the test and the mathematical topics taught in our mathematics major program. The questions are in multiple-choice format, and it typically takes a student six hours to complete the test. The test indicates which questions test which competency and, for this reason, serves as an excellent diagnostic tool.
There were two big advantages to using the Benchmark Test. First, it was free to the student. Second, we graded it in our department, so we could perform the item analysis according to our needs. The latter let us pinpoint the areas in which a particular student was weak. In the short term, this information led to tutoring sessions and special study materials we provided to the student; in the long term, it led to revising the curriculum and improving teaching. There was also a third, rather significant, advantage: the test gave students an idea of what they would find on the real exam, which made it much easier to study for. Since they could not become (or continue as) teachers until they passed that exam, they were highly motivated to do well on it. We used the tutoring sessions to discuss the test with the students and learn more about their perceptions of the problems. In fact, these discussions made our assessment process multidimensional, and they were a very valuable source of information about the typical difficulties our students had taking the test.
Our analysis of the test results began with each individual question. We then generalized the findings to competencies and to groups of competencies. We completed the analysis by correlating the results with the instructional goals and the courses offered by the department.
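The aggregation from individual questions up to competencies, competency groups, and courses can be sketched as follows. This is a minimal illustration with made-up question labels, mappings, and scores, not our actual data.

```python
from collections import defaultdict

# Hypothetical item-analysis data: the fraction of students answering
# each question correctly, plus each question's competency.
question_scores = {"Q1": 0.20, "Q2": 0.57, "Q3": 0.45, "Q4": 0.70}
question_to_competency = {"Q1": "C12", "Q2": "C12", "Q3": "C30", "Q4": "C30"}
# Hypothetical mappings of competencies to competency groups and courses.
competency_to_group = {"C12": "Calculus concepts", "C30": "Probability"}
competency_to_course = {"C12": "Calculus I", "C30": "Mathematical Statistics"}

def average_by(mapping, scores):
    """Average per-question scores over the categories given by `mapping`."""
    buckets = defaultdict(list)
    for q, score in scores.items():
        buckets[mapping[q]].append(score)
    return {cat: sum(v) / len(v) for cat, v in buckets.items()}

by_competency = average_by(question_to_competency, question_scores)
by_group = average_by(
    {q: competency_to_group[c] for q, c in question_to_competency.items()},
    question_scores)
by_course = average_by(
    {q: competency_to_course[c] for q, c in question_to_competency.items()},
    question_scores)
print(by_competency)  # {'C12': 0.385, 'C30': 0.575}
```

The same averaging step is reused at each level; only the mapping from questions to categories changes, which is what makes the per-question item analysis the natural starting point.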
Benchmark Results Overall. The data for this study included the test scores of 255 students, with an item analysis for each of the 150 questions. The average percentage of correct answers was 59%, and it remained constant over the five-year period.
Benchmark Results by Competency. The 150 questions were grouped into 41 competencies. Of the 41 competencies included in the Mathematics Benchmark Test, students scored below 40% on three, between 40% and 49% on another three, between 50% and 59% on fourteen, between 60% and 69% on fifteen, and between 70% and 79% on the remaining six. Unfortunately, students scored 60% or above on only 51% of the competencies. Please see the table in Appendix A.
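The arithmetic behind this distribution can be checked directly; the snippet below is our own verification using the band counts reported in the text.

```python
# Counts of competencies by score band, as reported above.
bands = {"<40%": 3, "40-49%": 3, "50-59%": 14, "60-69%": 15, "70-79%": 6}

total = sum(bands.values())
print(total)  # 41 competencies in all

# Competencies on which students scored 60% or above.
at_or_above_60 = bands["60-69%"] + bands["70-79%"]  # 15 + 6 = 21
print(round(100 * at_or_above_60 / total))  # 51 -> "only 51% of the competencies"
```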
In order to make our data more transparent, we divided the 41 competencies of the Mathematics Benchmark Test into 20 competency groups. The table in Appendix B shows the percentage of correct answers by the competency groups.
Assessment of Mathematics Courses. We developed a correlation table between the 41 Benchmark competencies and the mathematics courses. This correlation enabled us to use the Benchmark statistical data to evaluate 19 mathematics courses offered by our department. The only courses in which students scored below 50% on the material they cover are Calculus I and Calculus III. These results are consistent with the five-year average and the findings of previous assessments. The courses in which students scored above 50% but below 60% are Calculus II, Mathematical Statistics, Trigonometry, and Discrete Structures. The only course in which students scored above 70% on its contents is Linear Algebra. Over the five-year period, most of the courses improved, except for Calculus I, II, and III. Please refer to the table in Appendix C for the complete evaluation of the courses.
Calculus Questions in the Benchmark. Our students performed rather poorly on the calculus part of the ExCET, and calculus has consistently been the weakest area in the Benchmark Test during the last five years. The test has 10 questions about basic calculus concepts and topics. The percentage of correct answers on this group of questions averaged 41%, ranging from 20% on two questions to 57%. The two questions that students answered correctly only 20% of the time cover material basic to any calculus course. The first asks which derivative is used to find inflection points. The second asks for reconstructing distance from a given velocity function. The five-year averages show that this problem is persistent.
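For context, the concepts behind these two low-scoring questions each amount to one standard line of calculus (this is a textbook summary, not the actual test items): inflection points are located with the second derivative, and distance is recovered from velocity by integration.

```latex
f''(x) \text{ changes sign at an inflection point of } f, \qquad
s(t) = s(t_0) + \int_{t_0}^{t} v(\tau)\, d\tau .
```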
We had a unique opportunity to discuss the questions from the test individually with each student during the “after test” tutorial sessions. These discussions were a very valuable source of information because the students openly described typical difficulties they had taking the test.
From these discussions and the item analysis, we identified the most noticeable difficulties our students had taking the test.
Use of Findings
Based on our studies, we found effective ways to significantly improve the performance of students taking the ExCET through workshops, tutoring sessions, study materials, and practice tests. We also began teaching new courses specially designed for the teacher preparation program, such as “Problem solving and mathematical modeling” and the two-semester course “Survey of mathematical principles and concepts.” Based on our assessment, the department has recently made several changes to the mathematics curriculum, including increasing the number of credit hours for the calculus courses and making Discrete Structures a required course for mathematics majors. We communicated the assessment findings to the faculty of the Department of Mathematics, and we began working to establish a new “classroom culture” for learning mathematics, one focused on understanding concepts and developing good problem-solving skills.
All these efforts brought very positive results. We raised the ExCET passing rate above 80% within one year, and it has remained there since. In recent years, our students have been very successful in passing the ExCET. The table below shows the ExCET passing rates for the last five years.
Although the figures in the above table differ considerably from those in the tables for the practice test, there was an improvement in the practice tests over this period as well. This improvement was not as dramatic as in the ExCET. This suggests that the results of curriculum changes take more time to appear, and that the workshop and practice-test activities are quite effective.
Next Steps and Recommendations
It is too early to see the effect of the curriculum changes that we have made. Our department is working on developing a comprehensive plan of assessment of our major. When the assessment is implemented, we hope that it will provide some information about the effectiveness of the new courses.
On the other hand, we can see whether our graduates are successful teachers. It is easy for us to maintain contact with graduates because many of them become mathematics teachers in the local school districts. We see them regularly at collaborative meetings and professional development activities offered at UTB. Since the school districts encourage teachers to pursue graduate study, many of our graduates take graduate courses in Mathematics as part of the requirements for the M.Ed. in Mathematics Education offered at UTB. We are presently in the process of establishing an assessment program based on the professional development activities for teachers and the graduate courses.
Appendix A: Benchmark Results Grouped by Competencies
Appendix B: Benchmark Results by Competency Groups
In order to make our data more transparent, we divided the 41 competencies of the Mathematics Benchmark Test into 20 competency groups. The table below shows the percentage of correct answers by the competency groups.
Appendix C: Benchmark Results Grouped by Mathematics Courses
This study was partially supported by the UTB Research Enhancement Grant and the NASA Minority University Mathematics, Science & Technology Awards for Teacher Education Program.