• by Felix Montes, Ph.D. • IDRA Newsletter • November-December 1999

The number of limited-English-proficient (LEP) students is increasing nationwide. According to a survey of state education agencies conducted by the National Association for Bilingual Education (NABE) for the Office for Bilingual Education and Minority Languages Affairs (OBEMLA), the total number of LEP students enrolled in public and non-public schools reached 3,184,696 students in the 1994-95 school year, a 4.8 percent increase from the previous year (Macías and Kelly, 1996).

According to the same survey, Texas has the second largest enrollment of LEP students in the country. There were 514,139 LEP students enrolled in Texas public schools in 1996-97, which is 13 percent of all the students in the state. The Texas Education Agency (TEA) reports that 11.9 percent of the state’s students participated in bilingual or English as a second language (ESL) programs (1999). This amounted to 463,134 students for the school year 1997-98.

An analysis of the Texas Assessment of Academic Skills (TAAS), a high-stakes state-mandated test, shows that the gap between the percentage of LEP students who passed the test and non-LEP students who passed increased with the grade level (TEA, 1998). At the secondary level, the gap ranges from 42 percentage points in the sixth grade to 48 percentage points in the 10th grade. Likewise, the percentage of LEP students passing the 1997 TAAS decreased markedly as these students progressed through the grade levels. In the third grade, 60 percent of LEP students passed the test compared to only 22 percent in the 10th grade. These findings suggest a systematic weeding out of LEP students from the educational system, with the well-known consequences in social problems and economic losses.

Part of the problem is a severe shortage of bilingual and ESL certified teachers, from kindergarten to the 12th grade. Seventy-four districts in Texas alone reported needing more than 2,000 additional bilingual/ESL teachers (TEA, 1998). State policy requires districts to assign bilingual/ESL teachers to the lower grades when there is a shortage. As a result, higher-grade instruction for LEP students is inadequate in many school districts. This is what the IDRA Content Area Program Enhancement (CAPE) project attempted to help ameliorate.

Content Area Program Enhancement Project

The CAPE project is an initiative of the Intercultural Development Research Association (IDRA) in collaboration with a predominantly Hispanic (96 percent) school district in south Texas. In this school district, one of every four students enrolled is LEP, and virtually all (93.4 percent) of its more than 14,000 students are economically disadvantaged.

CAPE is a teacher training program developed by IDRA and is based on the Cognitive Academic Language Learning Approach (CALLA). CAPE was funded by the U.S. Department of Education to serve LEP students at the intermediate level. CALLA focuses on teaching learning strategies in cooperative settings to accelerate the acquisition of language skills and academic content. Based on the work of Jim Cummins (1980; 1981) and Virginia P. Collier (1987; 1989), CALLA is supported by a strong research base in the areas of cognition and metacognition (Anderson, 1976; Gagné, 1985) and second language research (O’Malley, Chamot and Kupper, 1989). According to this research, CALLA can help teachers meet the academic needs of three types of students:

  • Those who have English communicative ability, but who are unable to use English as a tool for acquiring academic content;
  • Those who have acquired academic concepts in their native language, but who need help in transferring them to English; and
  • Those who are English dominant bilinguals and have not acquired academic language skills in their home language.

The CALLA Handbook by Chamot and O’Malley (1994) provides a comprehensive presentation of the CALLA approach in an accessible format. When this new pedagogy is applied, the classroom looks very different from the traditional model both physically and behaviorally, as instruction becomes more student-centric. The box below outlines the anticipated changes in instructional practices once the CAPE approach is adopted.

CAPE Plus Research

CAPE Plus is a bilingual education, field-initiated research program also funded by the U.S. Department of Education. It assessed the degree to which the classrooms in the CAPE project provided more appropriate instruction for LEP students than non-CAPE classrooms as measured by student performance. Thus, CAPE Plus investigated whether or not participating in CAPE project classrooms improved the achievement of LEP students.

The design compared the performance of LEP students in sixth through eighth grades in the project classrooms with similar students in non-project classrooms at three middle school campuses. The design required several sources of data. Student scores on the TAAS were collected on a pre-test and post-test basis for both the program and non-program groups. This allowed the assessment of any significant difference attributable to the project. In addition, several qualitative data sources were used to allow for a better understanding of the project’s impact on the students, teachers and schools. A set of in-depth interviews with the team leaders and other teachers was conducted to clarify issues of implementation and comfort and to help interpret some of the findings from the other analyses. School principals and instructional facilitators were also interviewed in-depth to understand the impact of the project on the school as a whole.

Technical Assistance

IDRA provided technical assistance that consisted of numerous interactive training sessions including in-service training for all teachers in a central location, classroom demonstrations and summer institutes. During the in-service training, the teachers were exposed to the various CALLA strategies, interdisciplinary unit planning, using technology in the classroom and other relevant topics. In subsequent sessions, IDRA staff demonstrated how to use these strategies in actual classrooms. This cycle of training and demonstration occurred throughout the project duration.

Each summer, participants attended a week-long institute. The institute provided an opportunity to review in a relatively short time the various CALLA strategies sponsored by the project, to reflect about the progress made so far and to plan for the next set of activities.

Teachers were organized in teams and worked collaboratively to apply the CAPE strategies, review their effectiveness and provide mutual support. IDRA provided more than 100 person-days of technical assistance. About 80 percent of this time was spent working directly with the teachers. The rest was used to plan and coordinate the project with the district and schools.

Results: Administrators’ Perspective

Administrator support in smoothing the implementation of the project was crucial. Principals and instructional facilitators who provided the teachers with institutional support and a safe place to experiment with the new techniques were more satisfied with the project results. These administrators knew what the project was attempting to accomplish.

One principal stated:

I feel that teachers have to have as many tools as they can to reach out to students. They all have different learning styles. When we had our first meeting, I could see that the proposed strategies would be useful to teach the students.

Administrators faced difficult issues during program implementation. Perhaps the most problematic issue was what they called “the pull out problem.” They recognized the need for teachers to participate in program activities but were concerned about pulling them out of the classroom. One immediately apparent problem was the lack of substitute teachers to cover for the participating teachers. The district maintains a pool of substitute teachers; when many teachers from the various schools were absent, the pool was depleted, and some schools could not get enough substitutes.

Some solutions to this problem were found and implemented. They included scheduling training on Saturdays and after class as well as organizing summer institutes. But the best alternative was part of the program itself: in-class demonstrations. Principals and instructional facilitators were very impressed by these demonstrations because fewer substitutes were needed and because of the intrinsically practical nature of these demonstrations.

One principal commented:

The actual demonstrations are invaluable tools that seldom get used. It is very time intensive and costly in subtle ways, but it is very important to see these strategies work or not work with your own students. And sometimes even the presenters might have a problem with certain students. And that is where they can monitor and adjust, and the teachers can learn. That makes for a good relationship between the teachers and presenters. The demos are really special.

Administrators indicated that they had witnessed how the teachers were enthusiastic about the program and the results they were observing in their own students. They saw it as valuable and were appreciative of the potential benefits the program had for the students.

Results: Teachers’ Perspective

Most teachers felt that the program was beneficial to the students. Below are some of the benefits the teachers indicated the program generated.

  • It provided tools that helped in reaching a much broader range of students.
  • It brought fresh ideas that research has shown work with students.
  • It allowed the research to move forward in the actual field so we know what works.
  • It provided lessons for students who speak a different language and provided them with a way of feeling more secure in learning new material.
  • It was useful for social studies, reading, language arts and even in science and mathematics.
  • It taught strategies that involved language acquisition and content vocabulary that was easier for students to relate to.

An important aspect of the program was an expected impact on the students’ TAAS scores. Teachers were asked if they thought that the program had improved the chances of passing this state-mandated test among the students involved. Below is a summary of what they said.

  • It did improve their chances of passing the TAAS. When IDRA staff came, they demonstrated how to implement learning strategies with the students that would get them to participate. Through these strategies, teachers encouraged students to start thinking. Teachers used the strategies extensively in their teaching process.
  • The language barrier seemed less formidable. The project gave LEP students the opportunity to learn English. But the TAAS requires you to know English, to know skills and to know how to decipher information. They are still on level one, learning English. Eventually, with the project, they will pass the TAAS.
  • For the writing and reading it will help. For the math it would help with reading the problem and with problem solving skills and knowing what to do to solve the problem.
  • The CAPE project has allowed the students to have a better foundation with language so they do not fear the reading as much. It has given them a level of comfort. Teachers approached the material from different angles in different ways, including paired reading, oral reading, teacher modeled reading and so forth. They no longer have that fear. Their reading scores were very good on the pre-TAAS (mid- to high-70s). We are hoping to hit the 80s in TAAS in reading. Everything began to click this year.

Since these teachers were representing other participating teachers in their respective schools, it was important to know what they knew about the other teachers’ preferences regarding the strategies sponsored by the project. In this spirit, IDRA asked them which CAPE strategies the other teachers on their team found worked best for them and their students. Definition diagonals, shared reading and other learning strategies were effective for different teachers depending on their students and subject matter.

Results: Students’ Perspective

The most relevant aspect of this research was finding out whether CAPE would make any difference in student performance as measured by a standardized instrument such as the TAAS. To answer this question, the project compared the performance of all students, including LEP students, in the project classrooms with similar students in non-project classrooms at three middle school campuses. Two dependent variables were used: the Reading Texas Learning Index (Reading TLI) and the Mathematics Texas Learning Index (Mathematics TLI). These two indexes, derived from the TAAS, are the best indicators of whether the students are on target to pass the TAAS at the end of their high school year (see boxes below).

A General Linear Model (GLM) with repeated measures was used to investigate the effects of CAPE instruction on the student Reading TLI scores, the dependent variable. The GLM analyzes groups of related dependent variables that represent different measurements of the same attribute. This model allows for comparisons of within-subjects and between-subjects factors. The within-subjects factors in our current application are the pretest and posttest measures of the Reading TLI scores. The between-subjects factors included the various conditions that subdivided the whole sample into sub-groups. The chief between-subjects factor was whether the students were in a CAPE or non-CAPE classroom. Other between-subjects factors analyzed included LEP and at risk conditions. The boxes below show the resulting means and tests of significance resulting from the statistical analysis.
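To make the logic of this analysis concrete, the two-group pre/post design described above can be sketched in Python. This is an illustration only, not the project’s actual analysis code: the score distributions below are synthetic, and the gains are invented for demonstration. The paired t-tests correspond to the within-subjects (1998 vs. 1999) comparisons reported in the boxes below, and for a two-group pre/post design, an F-test on the gain scores is equivalent to the group-by-time interaction in a repeated-measures GLM.

```python
import numpy as np
from scipy import stats

# Synthetic scores for illustration only; the real study used TAAS TLI data.
rng = np.random.default_rng(0)
pre_cape = rng.normal(50, 10, 177)            # hypothetical 1998 scores, CAPE group
post_cape = pre_cape + rng.normal(9, 6, 177)  # 1999 scores with a large average gain
pre_ctrl = rng.normal(52, 10, 100)            # hypothetical non-CAPE group
post_ctrl = pre_ctrl + rng.normal(3, 6, 100)  # 1999 scores with a smaller average gain

# Within-subjects factor: paired t-test of pretest vs. posttest in each group,
# analogous to the t-tests in the article's tables.
t_cape, p_cape = stats.ttest_rel(post_cape, pre_cape)
t_ctrl, p_ctrl = stats.ttest_rel(post_ctrl, pre_ctrl)

# Between-subjects factor (CAPE vs. non-CAPE): an F-test on the gain scores
# stands in for the group-by-time interaction of the repeated-measures GLM.
f_stat, p_interaction = stats.f_oneway(post_cape - pre_cape,
                                       post_ctrl - pre_ctrl)
```

A full repeated-measures GLM with additional between-subjects factors (LEP status, at-risk status) would typically be fit with dedicated ANOVA or mixed-model routines, but the gain-score shortcut above captures the core comparison for the two-group case.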

There is a clear trend indicating that the project contributed to the improvement of CAPE students’ chances of passing the TAAS. For example, CAPE LEP student scores showed a larger improvement than did non-CAPE LEP student scores, as shown in the box. In 1998, the gap between the two groups was 7.35 points; by 1999, it had been reduced to only 1.15 points.
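As a simple arithmetic check (not part of the original analysis), the gap figures follow directly from the Reading TLI means for LEP students reported in the boxes below:

```python
# Reading TLI means for LEP students, taken from the article's table
cape = {"1998": 44.32, "1999": 56.06}
non_cape = {"1998": 51.67, "1999": 57.21}

gap_1998 = non_cape["1998"] - cape["1998"]           # 7.35-point gap in 1998
gap_1999 = non_cape["1999"] - cape["1999"]           # 1.15-point gap in 1999
cape_gain = cape["1999"] - cape["1998"]              # 11.74-point CAPE gain
non_cape_gain = non_cape["1999"] - non_cape["1998"]  # 5.54-point non-CAPE gain
```

The CAPE group's gain is more than double the non-CAPE group's, which is what nearly closes the gap.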

There was one explicit indicator that corroborates the sense that the project was most effective in reaching those students who most needed this kind of instruction, that is, those who are both LEP and at risk of dropping out. While both groups registered significant gains, the CAPE group average gain was significantly larger than that of the non-CAPE group, thus virtually eliminating the gap between them (see graphs). This supports the perception of teachers and administrators that CAPE was beneficial for all students, especially for the LEP students.

A similar GLM model was used to investigate the effects of CAPE instruction on the student Mathematics TLI scores. Analysis of the Mathematics TLI strongly suggests that CAPE instruction had an important and positive contribution to the students’ chances of passing the TAAS. All CAPE groups under analysis obtained statistically significant gains from 1998 to 1999. For the non-CAPE groups, the opposite was the case. Virtually none had statistically significant gains.

The CAPE research program was designed to investigate whether the CALLA-based CAPE approach to teaching had a beneficial effect on LEP students’ performance as measured by the high-stakes standardized test TAAS. The research found that in fact it did. Using the Reading and Mathematics TLI, the research found significant differences between CAPE and non-CAPE classrooms for LEP students who are also at risk of dropping out of school. These indexes are the best indicators of whether the students are on target to pass the TAAS.

Changes in Instructional and Assessment Practices

  • Before: No assessment of prior knowledge when beginning instruction.
    After: Assessment and activation of prior knowledge when beginning instruction.
  • Before: Indirect or no instruction on cognitive and metacognitive learning strategies.
    After: Direct instruction on cognitive and metacognitive learning strategies.
  • Before: Teachers evaluate student progress.
    After: Teachers and students evaluate student progress.
  • Before: Expectations for student learning are hidden.
    After: Expectations for student learning are made clear.
  • Before: Evaluation of content area knowledge is confounded by language proficiency.
    After: Evaluation of content area knowledge is kept separate from language proficiency levels.
  • Before: Oral and written language issues relevant to content area knowledge and skills are ignored.
    After: Oral and written language issues relevant to content area knowledge and skills are identified and incorporated into instruction.

1998 and 1999 Reading TLI
CAPE and Non-CAPE Classrooms, All Students and Subgroups – Descriptive Statistics and Tests of Significance

All Students
  CAPE (N=985):      1998 Mean 64.72 (SD 30.28); 1999 Mean 69.87 (SD 26.57); t-test 6.36*
  Non-CAPE (N=618):  1998 Mean 64.71 (SD 29.70); 1999 Mean 68.44 (SD 28.34); t-test 3.67*
  F-test (CAPE vs. non-CAPE): 2.20

LEP
  CAPE (N=177):      1998 Mean 44.32 (SD 32.16); 1999 Mean 56.06 (SD 29.11); t-test 5.26*
  Non-CAPE (N=100):  1998 Mean 51.67 (SD 28.92); 1999 Mean 57.21 (SD 28.16); t-test 2.19*
  F-test (CAPE vs. non-CAPE): 0.68

At Risk
  CAPE (N=637):      1998 Mean 63.40 (SD 27.31); 1999 Mean 67.95 (SD 24.44); t-test 4.57*
  Non-CAPE (N=436):  1998 Mean 62.04 (SD 28.30); 1999 Mean 65.73 (SD 27.15); t-test 3.08*
  F-test (CAPE vs. non-CAPE): 1.82

LEP and At Risk
  CAPE (N=126):      1998 Mean 50.23 (SD 28.44); 1999 Mean 59.21 (SD 25.35); t-test 3.35*
  Non-CAPE (N=85):   1998 Mean 54.89 (SD 27.79); 1999 Mean 60.72 (SD 26.26); t-test 2.10*
  F-test (CAPE vs. non-CAPE): 5.27*

* Difference is statistically significant at least at p ≤ .05.


1998 and 1999 Mathematics TLI
CAPE and Non-CAPE Classrooms, All Students and Subgroups – Descriptive Statistics and Tests of Significance

All Students
  CAPE (N=985):      1998 Mean 64.72 (SD 28.75); 1999 Mean 68.04 (SD 25.99); t-test 4.73*
  Non-CAPE (N=618):  1998 Mean 64.55 (SD 28.66); 1999 Mean 66.66 (SD 27.50); t-test 2.09*
  F-test (CAPE vs. non-CAPE): 1.99

LEP
  CAPE (N=177):      1998 Mean 46.46 (SD 33.22); 1999 Mean 55.59 (SD 30.36); t-test 3.96*
  Non-CAPE (N=100):  1998 Mean 55.35 (SD 30.15); 1999 Mean 58.71 (SD 28.45); t-test 1.27
  F-test (CAPE vs. non-CAPE): 0.54

At Risk
  CAPE (N=637):      1998 Mean 63.84 (SD 25.98); 1999 Mean 66.65 (SD 23.92); t-test 2.87*
  Non-CAPE (N=436):  1998 Mean 62.75 (SD 27.58); 1999 Mean 64.48 (SD 26.67); t-test 1.42
  F-test (CAPE vs. non-CAPE): 3.23

LEP and At Risk
  CAPE (N=126):      1998 Mean 53.21 (SD 29.42); 1999 Mean 59.14 (SD 26.61); t-test 2.15*
  Non-CAPE (N=85):   1998 Mean 59.58 (SD 28.10); 1999 Mean 62.55 (SD 25.62); t-test 1.02
  F-test (CAPE vs. non-CAPE): 6.95*

* Difference is statistically significant at least at p ≤ .05.

 



Resources

Anderson, J.R. Language, Memory, and Thought (Hillsdale, N.J.: Erlbaum, 1976).

Chamot, A.U. and J.M. O’Malley. The CALLA Handbook (Reading, Mass.: Addison Wesley Publishing Company, 1994).

Collier, V.P. “Age and Rate of Acquisition of Second Language for Academic Purposes,” TESOL Quarterly (1987) 21, 617-41.

Collier, V.P. “How Long? A Synthesis of Research on Academic Achievement in Second Language,” TESOL Quarterly (1989) 23, 509-31.

Cummins, J. “The Construct of Proficiency in Bilingual Education,” In J.E. Alatis (Ed.), Georgetown University Round Table on Languages and Linguistics (Washington, D.C.: Georgetown University Press, 1980).

Cummins, J. “Age on Arrival and Immigrant Second Language Learning in Canada: A Reassessment,” Applied Linguistics (1981) 2, 132-49.

Gagné, E.D. The Cognitive Psychology of School Learning (Boston, Mass.: Little, Brown, 1985).

Macías, R.F., and C. Kelly. Summary Report of the Survey of States’ Limited-English-Proficient Students and Available Educational Programs and Services, 1994-1995, report of a study funded by OBEMLA (Washington, D.C.: The George Washington University, 1996) N.T295005001.

O’Malley, J.M., A.U. Chamot and L. Kupper. “Listening Comprehension Strategies in Second Language Acquisition,” Applied Linguistics (1989) 10(4).

Texas Education Agency. Academic Achievement of Elementary Students with Limited English Proficiency in Texas Public Schools, Report No. 10 (Austin, Texas: Texas Education Agency, Office of Policy Planning and Research, 1998).

Texas Education Agency. Snapshot ’98: 1997-98 School District Profiles (Austin, Texas: Texas Education Agency, Office of Policy Planning and Research, 1999).


Felix Montes, Ph.D., coordinates technology at IDRA. Comments and questions may be directed to him via e-mail at feedback@idra.org.


[©1999, IDRA. This article originally appeared in the November-December 1999 IDRA Newsletter by the Intercultural Development Research Association. Permission to reproduce this article is granted provided the article is reprinted in its entirety and proper credit is given to IDRA and the author.]
