• by Albert Cortez, Ph.D. • IDRA Newsletter • May 2003

The Texas Education Agency (TEA) recently released results from the March 2003 administration of the state’s new mandated assessment – the Texas Assessment of Knowledge and Skills (TAKS). The new test is purported to incorporate higher levels of difficulty than the state’s prior exam – the Texas Assessment of Academic Skills (TAAS) – which the TAKS replaced.

Much of the attention to the results of the new exam has been grounded in the recognition that student promotion to the next grade level is tied to successful performance on it. Though “promotion gate” exams such as TAKS have been used in other states for several years, 2002-03 was the first school year in which students were required to pass the test in order to be promoted to the next grade.

In Texas, the automatic in-grade retention provisions are to be phased in, with passage of the test required of students enrolled in third grade in the current year. Starting in 2004-05, students who fail the third grade or the fifth grade TAKS will be automatically retained. By the 2006-07 school year, students in the third, fifth, and eighth grades will be subject to the automatic retention requirements.

While the law stipulates that final decisions regarding promotion are to be made by a school “grade placement committee,” it requires a unanimous vote of the committee members (the campus principal, the teacher, and a parent) to prevent retention. This virtually ensures that the great majority of students failing the TAKS will be retained.

Local and state officials have closely monitored this year’s third graders and their performance on similar tests since last spring. Prior results suggested that a large number of students would fail to meet the state’s passing standard on the third grade exam.

Given this dire prediction, school administrators successfully lobbied for the impact of TAKS performance results on the state’s school accountability system to be set aside for one year. Thus, regardless of student outcomes, TAKS results would not be used to rate school performance in 2002-03. Though school administrators were granted a one-year reprieve from the consequences of TAKS performance, students were not given comparable relief.

Concerns about potentially dramatic increases in the number of students who would fail to meet state passing standards led the state to seek recommendations from an expert panel on where to set its passing standard. For the third grade, the panel recommended that students correctly answer 24 of 36 questions (67 percent) to achieve a “passing” score on the TAKS. Final decisions on the passing score, however, were left to the elected State Board of Education.

After reviewing the panel’s recommendations and assessing the implications of setting the passing score at various levels, the State Board of Education voted to phase in the panel’s recommended level. For the 2003 TAKS administration, the board opted to require students to answer only 20 of 36 questions correctly (56 percent), a cut score two standard errors of measurement (2SEM) below the panel recommendation.
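For reference, the percentages cited in this article are simple conversions of each cut score to a share of the 36 test items. The short sketch below shows the conversion (the 22-of-36 level belongs to the 2004-05 phase-in step discussed later in this article):

```python
# Convert each phase-in cut score (questions answered correctly out of
# 36 items) into the percentage cited in the article.
TOTAL_ITEMS = 36
for cut_score in (24, 22, 20):  # panel level, 1SEM below, 2SEM below
    print(f"{cut_score} of {TOTAL_ITEMS} correct = {cut_score / TOTAL_ITEMS:.0%}")
# -> 67%, 61%, 56%
```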

Third Grade TAKS Results

The table below presents a summary of third grade students’ results developed by TEA. A recent statistic noted by TEA and reported in the media was that 89 percent of all third graders had met the minimum standard established by the state board on the spring 2003 third grade TAKS. This was higher than the 85 percent who had scored at comparable levels in the TAKS field test conducted in the spring of 2002.

The Intercultural Development Research Association suspects that the growth in the percent passing may be attributable both to an expected increase in student learning produced by additional months of instruction and to the intensive TAKS preparation efforts implemented by school districts over the interim period.

All groups scored at levels higher than predicted by the spring 2002 TAKS field test, with low-income pupils showing a six-point difference between the field test percentage and the percent scoring at the 2SEM standard in the spring 2003 administration (78 percent vs. 84 percent).

It is worth examining what the passing percentages would have been had the State Board of Education adopted the passing levels recommended by its expert panel. Had the higher standard been used, only 81 percent of students statewide would have passed the TAKS, and the passing rates for the various sub-groups would have dropped significantly.

While the passing rate for White students would have remained above 90 percent (91 percent at the panel standard vs. 96 percent at 2SEM below it), the passing rates for students who are African American, Hispanic or low-income would have dropped by more than 10 points, falling to 71 percent, 74 percent and 72 percent, respectively.

The gap in performance suggests that school efforts will have to be scaled up in the future, particularly if minority and economically disadvantaged students are to pass at the higher standards at rates comparable to those achieved under the lower passing standard adopted in 2003.

More importantly, if the higher passing standard had been in place, almost 50,000 third graders (19 percent of the 262,600 test takers) would have been at risk of being retained in grade in 2003. Of the 40,334 African American students tested, about 11,700 (29 percent) would have faced possible retention. Among the 103,289 Hispanic students tested, 26,800 (26 percent) could have been retained in grade.

The potential overall 19 percent retention rate compares to a 5 percent retention level reported for all third graders in 2001, according to TEA retention reports. This represents a 280 percent increase in potential retentions that could have resulted from applying the automatic retention requirements at the panel-recommended passing standard this year.
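As a rough check, the sketch below re-derives these figures from the number tested and the percent passing at the panel-recommended level shown in the table that follows; the counts are illustrative re-computations, not official TEA tallies:

```python
# Re-derive the at-risk estimates from the table's figures:
# (number tested, percent passing at the panel-recommended level).
groups = {
    "All third graders": (262_595, 0.81),
    "African American": (40_334, 0.71),
    "Hispanic": (103_289, 0.74),
}

for name, (tested, passing) in groups.items():
    at_risk = round(tested * (1 - passing))
    print(f"{name}: roughly {at_risk:,} of {tested:,} at risk of retention")

# Potential 19 percent retention rate vs. the 5 percent reported for 2001.
print(f"{(0.19 - 0.05) / 0.05:.0%} increase in potential retentions")  # 280%
```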

Lowering the Bar

While lowering the bar was useful in deflecting the negative effects of state-mandated retentions based on a single test score, local schools will have to produce notable improvements in passing rates, particularly among their low-income and minority pupils, to escape the eventual negative consequences of the controversial state policy.

In 2004-05, the passing standard will be raised to require students to answer 22 of 36 questions correctly (61 percent), a passing score equal to one standard error of measurement (1SEM) below the panel recommendation. By 2005-06, the state will require that students meet the panel’s originally recommended 24 correct answers out of 36 items (67 percent) to achieve a passing score.

As this same group of students encounters high-stakes assessments in the fifth grade and later in the eighth grade, it will be important to compare their dropout and graduation rates to those of comparable students educated before these assessments were required.

Just as crucial will be monitoring the extent to which schools use TAKS performance as the primary basis for retention decisions, which may be reflected in the proportion of students that grade placement committees decide to promote despite a failing test score. If large proportions of students are indeed retained in grade, advocates will need to step up efforts to modify what many believe is a well-intended but dysfunctional state policy.

Harmful Effects of In-Grade Retention

Despite widely held assumptions that it is beneficial to students, hard data on subsequent student achievement following in-grade retention indicate that merely holding students back does little to improve future learning. In fact, for the majority of pupils, being retained produces harmful effects that have been linked to future under-achievement and, for many, an increased probability of dropping out of school (McCollum et al., 1999; Cárdenas, 1990).

Until educators and the public recognize that more effective alternatives to retention exist and are viable, students will continue to bear the brunt of dysfunctional state policies. Enamored with the notion that the simple act of testing students improves student learning, many people continue to ignore the fact that it is teaching – not merely testing – that produces the outcomes reflected on assessment measures. How many students will be sacrificed at the altar of standardized testing in Texas and other states will depend on how long parents and communities continue to tolerate it or instead demand more effective alternatives.

Third Grade TAKS Results, 2003

| Student Category | Number of Students Tested | Percent at 2SEM* | Percent at 1SEM | Percent at Panel-Recommended Level | Percent at 2SEM, Spring 2002 TAKS Field Test | Percent Met TAAS Minimum, Spring 2002 |
|---|---|---|---|---|---|---|
| All Students | 262,595 | 89% | 86% | 81% | 85% | 87% |
| African American | 40,334 | 82% | 77% | 71% | 75% | 80% |
| Hispanic | 103,289 | 85% | 80% | 74% | 81% | 83% |
| White | 109,375 | 96% | 94% | 91% | 93% | 94% |
| Economically Disadvantaged | 135,942 | 84% | 78% | 72% | 78% | 81% |
| Limited English Proficient – English TAKS | 38,517 | 77% | 70% | 63% | n/a | 77% |
| Limited English Proficient – Spanish TAKS | 23,075 | 82% | 75% | 67% | n/a | n/a |
| Special Education | 12,783 | 84% | 80% | 74% | n/a | 82% |

* The “Percent at 2SEM” column shows the percentage passing TAKS under the current State Board of Education passing standard of 2SEM below the panel-recommended level. Source: Texas Education Agency, 2003.

Resources

Cárdenas, J.A. “Massachusetts Focuses on Grade Retention,” IDRA Newsletter (San Antonio, Texas: Intercultural Development Research Association, November 1990).

McCollum, P., A. Cortez, O.H. Maroney, and F. Montes. Failing Our Children – Finding Alternatives to In-Grade Retention (San Antonio, Texas: Intercultural Development Research Association, 1999).

Texas Education Agency. How to Use the 2003 Early Indicator Report (Austin, Texas: Texas Education Agency, 2003).

Texas Education Agency. Texas Assessment of Academic Skills: 2003 Early Indicator Summary Report (Austin, Texas: Texas Education Agency, May 2002).

Texas Education Agency. Texas Assessment of Knowledge and Skills (TAKS) Standard Setting. Summary of Projected Impact of Phase in Options (Austin, Texas: Texas Education Agency, 2003).

Texas Education Agency. Texas Assessment of Knowledge and Skills. Phase in Summary Report (Austin, Texas: Texas Education Agency, March 2003).

Texas Education Agency. Texas Assessment of Knowledge and Skills. Spring 2003 Performance Standards. English Version Tests (Austin, Texas: Texas Education Agency, 2003).

Texas Education Agency. Texas Assessment of Knowledge and Skills. Summary Report – Test Performance. All Students (Austin, Texas: Texas Education Agency, March 2003).


Albert Cortez, Ph.D., is director of the IDRA Institute for Policy and Leadership. Comments and questions may be directed to him via e-mail at feedback@idra.org.


[©2003, IDRA. This article originally appeared in the May 2003 IDRA Newsletter by the Intercultural Development Research Association. Permission to reproduce this article is granted provided the article is reprinted in its entirety and proper credit is given to IDRA and the author.]
