• by Albert Cortez, Ph.D. • IDRA Newsletter • November-December 1999
In August 1999, the Office of the Texas State Auditor issued a report on the Texas Safe Schools Act. Adopted in 1997, the act created juvenile justice and disciplinary alternative education programs to remove what were termed “disruptive and/or violent” students from Texas public school classrooms. The push for these alternative education programs originated with educators who were dissatisfied with existing provisions for removing pupils considered disruptive from local school classrooms.
The 1997 legislation created disciplinary alternative education programs (DAEPs) as educational programs that would ensure that, during periods of disciplinary action or expulsion from school, students would continue to be educated instead of being “put out on the street.” To address more serious offenders, the act created juvenile justice alternative education programs (JJAEPs) that would deal with incarcerated pupils who had been referred to the state’s juvenile justice system.
Two years after the act’s adoption, the state auditor reports in A Report on Safe Schools Programs that the alternative education programs operating in Texas suffer from numerous ailments, ranging from failure to implement state requirements to the absence of any useful evaluation data (other than sporadic anecdotal accounts) indicating whether the programs are working as intended.
In its report, the state auditor’s office lists four general conclusions:
- School officials do not consistently remove violent students to alternative education programs as the act requires;
- The academic progress of many students in alternative education programs is not measured;
- Special education, minority and at-risk pupils are disproportionately represented in alternative education programs; and
- Some school districts that expelled pupils to JJAEPs continued to report them as eligible for Foundation School Program funding, in violation of state law.
The findings by the state auditor follow the publication of the Intercultural Development Research Association’s (IDRA) own analysis of DAEPs reported in a policy brief, Disciplinary Alternative Education Programs: What is Known; What is Needed, published and disseminated in February 1999 (Cortez and Robledo Montecel, 1999). IDRA studied data on alternative education programs that were designed to address those pupils not involved in serious violations of the state criminal code – but whom local teachers and administrators considered sufficiently disruptive to merit removal from the regular school program.
The state auditor’s findings closely parallel the conclusions reached in IDRA’s analysis of DAEPs, though the auditor’s data yield important new insights into aspects of the JJAEPs being implemented in the more densely populated counties in Texas. These new findings serve to validate IDRA’s earlier research and reinforce the stated need for significant modifications in current state policies.
Proponents of the Texas Safe Schools Act tout it as having contributed to making local schools safer for all pupils. But the auditor’s data reveal that, in as many as 850 cases, students who committed offenses that should have resulted in referral to a JJAEP were instead referred to a DAEP. State officials attributed these mis-referrals to local schools’ lack of understanding of the law or reluctance to make the more serious referral. The mis-referrals created situations in which more serious offenders were intermingled with pupils who had been referred to DAEPs for less serious offenses.
The state auditor’s report verifies IDRA’s earlier findings in two other critical areas: (1) the over-representation of minority and low-income pupils in DAEPs and (2) the serious lack of equitable accountability for DAEPs.
The state auditor noted that statewide data on student referrals to alternative education programs indicate that students considered at risk, minority students and students from low-income households were more likely to be referred to alternative education programs than were their White counterparts. While White pupils represent 45 percent of the state enrollment, they account for only 34 percent of alternative education program referrals. In comparison, African American pupils constitute only 14 percent of statewide enrollment yet represent 21 percent of pupils referred to alternative education programs. Similarly, Hispanic pupils constitute 38 percent of the student enrollment but 44 percent of alternative education program referrals. Overall, minority students constitute 66 percent of alternative education program placements, though they make up only 55 percent of statewide enrollment. Similar patterns emerged in IDRA’s analysis of statewide data.
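As a rough illustration (not a measure used by the auditor or by IDRA), the degree of over-representation can be expressed as a representation ratio: a group’s share of referrals divided by its share of enrollment. The short Python sketch below applies that ratio to the statewide percentages cited above.

```python
# Illustration only: representation ratios computed from the statewide
# percentages cited in this article (a group's share of alternative
# education program referrals divided by its share of total enrollment).
# A ratio above 1.0 indicates over-representation; below 1.0,
# under-representation.

groups = {
    # group: (percent of statewide enrollment, percent of referrals)
    "White": (45, 34),
    "African American": (14, 21),
    "Hispanic": (38, 44),
    "All minority students": (55, 66),
}

for group, (enrollment_pct, referral_pct) in groups.items():
    ratio = referral_pct / enrollment_pct
    print(f"{group}: {ratio:.2f} times their enrollment share")
```

Under this simple measure, African American pupils are referred at roughly 1.5 times their share of enrollment, Hispanic pupils at about 1.2 times, and White pupils at about three-quarters of theirs.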
A county-level look at referral rates by race and ethnic group is even more revealing. In Bexar County, 76 percent of referrals are minority pupils; in Dallas County, 65 percent; in Harris County, 83 percent; and in Travis County, 77 percent. While Texas’ urban counties do have large minority populations, the fact remains that a disproportionate number of minority pupils are being referred to these alternative education programs. Given that little or nothing is known about the effectiveness of these programs, minority advocates should be concerned about their proliferation.
Confounding the data in IDRA’s research was the growing number of alternative education program pupils whose race or ethnic background districts reported as “unknown,” a development we suspect serves to mask even more serious over-referral of minority students.
According to the state auditor, students at risk of dropping out of school and students requiring special education services are also disproportionately placed in alternative education programs.
In the summary data provided in the report, the auditor’s study reveals that referrals tend to increase as students get older. Referrals by grade level begin to increase sharply in grades six, seven and eight. They peak at grade nine, which accounts for one-fourth of all alternative education program referrals, then decline in grades 10 through 12.
We do not consider it coincidental that ninth grade is when student retentions in grade peak and the greatest number of students drop out. A closer look at the relationship between alternative education program referrals, retention in grade and dropping out is merited given these findings.
Concerns with Data Quality and Availability
Perhaps the most critical conclusion reported by the state auditor is the finding, “Data is insufficient to show if alternative education works” (1999). According to the auditor’s report:
The state has paid $28 million in fiscal year 1998 and again in 1999 for alternative education programs that have not generated sufficient data to support that they have a positive effect on student achievement.
Proponents of removing pupils from the regular classroom and campus defended their recommendations with the proposition that isolating these pupils would help schools provide a quality educational program while the alternative programs addressed the social and academic factors that led to the pupils’ removal from regular classrooms. The auditor’s report reveals that these programs, after several years of operation, have failed to demonstrate a positive educational benefit for the pupils referred to them.
This finding parallels IDRA’s finding that DAEPs failed to collect and report critical data that would enable state policy-makers and the general public to hold these programs accountable. The auditor’s report cites specific areas where the state’s alternative education programs are found wanting.
Insufficient Indicators for Measuring Progress and Gauging Long-term Effects
The state auditor found that the Texas Education Agency (TEA) does not have data to show whether most students in DAEPs are learning. The commission that oversees JJAEP operations has only limited data showing that JJAEP students are improving in reading and math (not all students who should be tested are tested). Neither TEA nor the commission knows how students perform once they return to their home campuses, and no one is gathering such data.
Alternative education programs would benefit from collecting consistent academic and behavior data during the students’ tenure in the program, including data on graduation rates, courses passed, credits earned, GEDs completed, TAAS performance, attendance rates and dropout rates.
Incomplete and Inaccurate Data
The state auditor states that the quality of existing DAEP data is cause for concern if those data are to guide further deliberations on the future of such programs. A major finding of the report was that one of the state’s largest districts failed to submit any alternative education program data, seriously limiting the usefulness of state-level statistics on the program. Another finding was that TEA data on JJAEP referrals were inaccurate and did not agree with the commission’s data on the program. The auditor also found that when districts had to specify information in reports, they tended to overuse the “other” code, suggesting that many were unclear about program requirements.
The overarching recommendation resulting from the auditor’s analysis was that extensive additional training be provided to the school officials responsible for implementing the Texas Safe Schools Act requirements. The auditor also strongly recommended that more data be collected from these programs to assess their characteristics, impact and effectiveness. IDRA likewise recommended the collection of additional data on these programs.
Funding
The state auditor examined school district funding issues as they relate to alternative education programs. One concern that emerged was the finding that some schools reported students referred to JJAEPs as being enrolled in their home district – thus receiving funding for students not actually enrolled.
The auditor explains, “State law expressly states that a student served by a JJAEP on the basis of an expulsion for certain felonies is ineligible for Foundation School Program funding.” Monies for these pupils are allocated through separate state funding provisions involving juvenile justice operations.
According to the report, school districts’ lack of understanding of these provisions created widespread problems, including students generating monies for both the local district and the JJAEPs in which they were enrolled. In the 22 counties operating mandated JJAEPs, “18 reported [alternative education programs] students’ attendance incorrectly.”
According to the auditor’s report, local districts may have to reimburse the state up to $1.4 million because of incorrect reporting in 1997-98. The amount may be larger if the problems persisted in 1998-99. The auditor recommends that TEA implement procedures to detect improper reporting of JJAEP students’ attendance.
Unequal Accountability
An area in which the state auditor and IDRA differ relates to state accountability for alternative education programs. While IDRA has long expressed concerns about the creation of a separate and unequal accountability system for alternative education programs, the auditor’s report was silent on this point. In our own research, IDRA noted and reported that the state’s alternative education programs were subject to separate and less stringent accountability requirements. IDRA’s findings led to changes in state policies adopted in the 1999 session of the Texas legislature.
In a similar vein, preliminary findings shared with state policy-makers also contributed to some reform of state alternative education policies and requirements. The extent to which these changes will actually inform future policies and, more importantly, lead to improved results for students referred to what too often resemble “satellite” operations remains to be seen. Those interested in ensuring that all pupils are afforded equal educational opportunities should continue to monitor alternative education programs closely at both the local and state levels.
Resources
Cortez, A., and M. Robledo Montecel. Disciplinary Alternative Education Programs in Texas – What is Known; What is Needed (San Antonio, Texas: Intercultural Development Research Association, 1999).
Office of the State Auditor. A Report on Safe Schools Programs, Report No. 99-049 (August 1999).
Albert Cortez, Ph.D., is the director of the IDRA Institute of Policy and Leadership. Comments and questions may be directed to him via e-mail at feedback@idra.org.
Copies of the state auditor’s report may be requested from the Office of the State Auditor. IDRA’s policy brief, “Disciplinary Alternative Education Programs: What is Known; What is Needed,” can be obtained from IDRA ($7 each) or online at the IDRA web site, www.idra.org (no charge).
[©1999, IDRA. This article originally appeared in the November-December 1999 IDRA Newsletter by the Intercultural Development Research Association. Permission to reproduce this article is granted provided the article is reprinted in its entirety and proper credit is given to IDRA and the author.]