
Two New Studies for Regional Education Laboratory (REL) Southwest Completed

Student Group Differences in Arkansas Indicators of Postsecondary Readiness and Success

It is well documented that students from historically excluded communities face more challenges in school. They are often less likely to obtain postsecondary education and, as a result, see less upward social mobility. Educational researchers and practitioners have developed policies aimed at disrupting this cycle. However, for these policies to work, school administrators must be able to identify students who are at risk of not reaching certain academic benchmarks or whose behavioral patterns are correlated with lower postsecondary success.

The Arkansas Department of Education (ADE), like education authorities in many other states, tracks K-12 students’ college readiness and enrollment and collects a wide array of student progress indicators meant to predict postsecondary success. A recent study by Regional Education Laboratory (REL) Southwest showed that a logistic regression model using a fairly small number of such indicators, measured as early as seventh or eighth grade, predicts with a high degree of accuracy whether students will enroll in college four or five years later (Hester et al., 2021). But does this predictive model – and the entire “early warning” system that could rely on it – work equally well for all student groups? In general, predictive models are designed to reduce average prediction error. So, when the dataset used for predictive modeling covers several substantially different populations, the models tend to make more accurate predictions for the largest subset and less accurate ones for the rest of the observations. In other words, if the sample your model relies on is mostly White, it will most accurately predict outcomes for White students. In addition, the predictive strength of some indicators may vary across student groups. In practice, this means that such a model may turn out to be less useful for forecasting outcomes for the very students who should benefit the most from it.

Researchers from Empirical Education and AIR teamed up to complete a study for REL Southwest that focuses on differences in predictive strength and model accuracy across student groups. It was a massive analytical undertaking based on nine years of tracking two statewide cohorts of sixth graders: close to 80,000 records and hundreds of variables, including student characteristics (gender, race/ethnicity, eligibility for the National School Lunch Program, English learner student status, disability status, age, and district locale), middle and high school academic and behavioral indicators, and their interactions. First, we found that several student groups—including Black and Hispanic students, students eligible for the National School Lunch Program, English learner students, and students with disabilities—were substantially less likely (by 10 percentage points or more) to be ready for or enroll in college than students without these characteristics. However, our main finding, and a reassuring one, is that the model’s predictive power and the predictive strength of most indicators are similar across student groups. In fact, the model often does a better job predicting postsecondary outcomes for those student groups most in need of support.

Let’s talk about what “better” means in a study like this. It is fair to say that statistical model quality is seldom of particular interest in educational research and is often relegated to a footnote showing the value of R² (the proportion of variation in the outcome explained by the independent variables). R² can tell us something about the amount of “noise” in the data, but it is hardly something that policymakers are normally concerned with. When the model’s ability to predict a binary outcome—whether or not the student went to college—is the primary concern, there is a clear need for an easily interpretable and actionable metric. We just need to know how often the model is likely to predict the future correctly based on current data.

Logistic regression, which is used for predicting binary outcomes, produces probabilities rather than outcomes themselves. When the predicted probability (such as the probability of college enrollment) is above fifty percent, we say that the model predicts success (“yes, this student will enroll”); otherwise, it predicts failure (“no, they will not enroll”). When the actual outcomes are known, we can evaluate the accuracy of the model: counting the cases in which the predicted outcome coincides with the actual one and dividing that count by the total number of cases yields the overall model accuracy. Model accuracy is a useful metric that is typically reported in predictive studies with binary outcomes. We found, for example, that the model’s accuracy in predicting college persistence (students completing at least two years of college) is 70% when only middle school indicators are used as predictors, and it rises to 75% when high school indicators are included. These statistics vary little across student groups, by no more than one or two percentage points. Although it is useful to know that outcomes two years after graduation from high school can be predicted with decent accuracy as early as eighth grade, the ultimate goal is to ensure that students at risk of failure are identified while schools can still provide them with the necessary support. Unfortunately, a metric like overall model accuracy is not particularly helpful for that purpose.
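As a minimal sketch (using invented probabilities and outcomes, not data from the study), here is how predicted probabilities from a logistic regression are turned into binary predictions and scored for overall accuracy:

```python
# Hypothetical illustration of overall model accuracy for a binary outcome.
# The probabilities and outcomes below are invented for the example.

def predict_labels(probabilities, threshold=0.5):
    """Predict success (1) when the probability exceeds the threshold, else failure (0)."""
    return [1 if p > threshold else 0 for p in probabilities]

def accuracy(predicted, actual):
    """Share of cases where the predicted outcome matches the actual one."""
    correct = sum(1 for p, a in zip(predicted, actual) if p == a)
    return correct / len(actual)

# Toy data: predicted enrollment probabilities and observed outcomes.
probs  = [0.82, 0.35, 0.61, 0.48, 0.91, 0.22, 0.55, 0.10]
actual = [1,    0,    1,    1,    1,    0,    0,    0]

preds = predict_labels(probs)   # [1, 0, 1, 0, 1, 0, 1, 0]
print(accuracy(preds, actual))  # 0.75 (6 of 8 predictions correct)
```

The 0.5 threshold is the conventional cut point; in a real early warning system the threshold could be tuned to trade off false positives against false negatives.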

Instead, a metric called “model specificity” in the parlance of predictive analytics lets us view the data from a different angle. It is calculated as the proportion of correctly predicted negative outcomes alone, ignoring the positive ones. Model specificity turns out to vary considerably across student groups in our study, but the nature of this variation validates the ADE’s system: for the student groups most in need of support, specificity is higher than for the rest of the data. For some student groups, the model can detect that a student is not on track to postsecondary success with near certainty. For example, failure to attain college persistence is correctly predicted from middle school data in 91 percent of cases for English learner students, compared to 65 percent for non-English learner students. Adding high school data into the mix narrows the gap—to 88 vs 76 percent—but specificity is still higher for English learner students, and this pattern holds across all other student groups.
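The same kind of sketch works for specificity. In the invented example below (the group labels and outcomes are made up, not the study’s data), specificity is the share of actual negative outcomes the model got right, computed separately within each student group:

```python
# Hypothetical illustration of model specificity (true-negative rate) by group.
# Predictions, outcomes, and group labels are invented for the example.

def specificity(predicted, actual):
    """Proportion of actual negative outcomes that were predicted as negative."""
    negatives = [(p, a) for p, a in zip(predicted, actual) if a == 0]
    if not negatives:
        return None  # undefined when the group has no negative outcomes
    correct = sum(1 for p, a in negatives if p == 0)
    return correct / len(negatives)

def specificity_by_group(predicted, actual, groups):
    """Compute specificity within each student group separately."""
    result = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        result[g] = specificity([predicted[i] for i in idx],
                                [actual[i] for i in idx])
    return result

predicted = [0, 0, 1, 0, 1, 1, 0, 0]
actual    = [0, 1, 1, 0, 0, 1, 0, 1]
groups    = ["EL", "EL", "EL", "non-EL", "non-EL", "non-EL", "non-EL", "non-EL"]
print(specificity_by_group(predicted, actual, groups))
```

Because specificity conditions only on students whose actual outcome was negative, it directly answers the early warning question: of the students who did not succeed, how many would the model have flagged in time?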

The predictive model used in the ADE study can certainly power an efficient early warning system. However, we need to keep in mind what those numbers mean. For some students from historically excluded communities, early life experiences create significant obstacles down the road, and some high schools are not doing enough to put these students on a new track toward college enrollment and graduation. It is also worth noting that while this study provides evidence that ADE has developed an effective system of indicators, the observations used in the study come from two cohorts of students who were sixth graders in 2008–09 and 2009–10. Many socioeconomic conditions have changed since then. Thus, the only way to ensure that the models remain accurate is to move from isolated studies to building “live” predictive tools that update the models as soon as each new annual batch of outcome data becomes available.

Read the complete report, “Student Group Differences in Arkansas’ Indicators of Postsecondary Readiness and Success,” here.

Early Progress and Outcomes of a Grow Your Own Grant Program for High School Students and Paraprofessionals in Texas

Teacher shortages and high turnover are problems that rural schools face across the nation. Empirical Education researchers have contributed to the search for solutions several times in recent years, including two studies completed for REL Southwest (Sullivan et al., 2017; Lazarev et al., 2017). While much of the policy research focuses on ways to recruit and retain credentialed teachers, some states are exploring novel methods to create new pathways into the profession that would help build local teacher cadres. One such promising initiative is the Grow Your Own (GYO) program funded by the Texas Education Agency (TEA). Since 2019, TEA has provided grants to schools and districts that intend to expand the local teacher labor force through one or both of the following pathways. The first pathway offers high school students an early start in teacher preparation through a sequence of education and training courses. The second pathway helps paraprofessionals already employed by schools transition into teaching positions by covering tuition for credentialing programs and offering a stipend for living expenses.

In a joint project with AIR, Empirical Education researchers explored the potential of the first pathway for high school students to address teacher shortages in rural communities and to increase the diversity of teachers. Since this is a new program, our study was based on the first two years of implementation. We found promising evidence that GYO can positively impact rural communities and increase teacher diversity. For example, GYO grants were allocated primarily to rural and small-town communities, and programs were implemented in smaller schools with higher percentages of Hispanic students and economically disadvantaged students. Participating schools also had higher enrollment in teacher preparation courses. In short, GYO seems to be reaching rural areas with smaller and more diverse schools, and is boosting enrollment in teacher preparation courses in these areas. However, we also found that fewer than 10% of students in participating districts completed at least one education and training course, and fewer than 1% completed the full sequence of courses. Additionally, White and female students are overrepresented in these courses. These and other preliminary results will help the state education agency fine-tune the program and work toward a successful final result: a greater number and increased diversity of effective teachers who are from the communities in which they teach. We look forward to continuing research on the impact of “Grow Your Own.”

Read the complete report, “Early Progress and Outcomes of a Grow Your Own Grant Program for High School Students and Paraprofessionals in Texas,” here.

All research Empirical Education has conducted for REL Southwest can be found on our REL-SW webpage.

References

Hester, C., Plank, S., Cotla, C., Bailey, P., & Gerdeman, D. (2021). Identifying Indicators That Predict Postsecondary Readiness and Success in Arkansas (REL 2021–091). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. https://eric.ed.gov/?id=ED613040

Lazarev, V., Toby, M., Zacamy, J., Lin L., & Newman, D. (2017). Indicators of Successful Teacher Recruitment and Retention in Oklahoma Rural Schools (REL 2018–275). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. https://ies.ed.gov/ncee/rel/Products/Publication/3872.

Sullivan, K., Barkowski, E., Lindsay, J., Lazarev, V., Nguyen, T., Newman, D., & Lin, L. (2017). Trends in Teacher Mobility in Texas and Associations with Teacher, Student, and School Characteristics (REL 2018–283). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. https://ies.ed.gov/ncee/rel/Products/Publication/3883

2023-01-10

IES Published Our REL Southwest Study on Trends in Teacher Mobility

The U.S. Department of Education’s Institute of Education Sciences published a report of a study we conducted for REL Southwest! We are thankful for the support and engagement we received from the Educator Effectiveness Research Alliance throughout the study.

The study was published in December 2017 and provides updated information regarding teacher mobility for Texas public schools during the 2011-12 through 2015-16 school years. Teacher mobility is defined as teachers changing schools or leaving the public school system.

In the report, descriptive information on mobility rates is presented at the regional and state levels for each school year. Mobility rates are disaggregated further into destination proportions to describe the proportion of teacher mobility due to within-district movement, between-district movement, and leaving Texas public schools. This study leverages data collected by the Texas Education Agency during the pilot of the Texas Teacher Evaluation and Support System (T-TESS) in 57 school districts in 2014-15. Analyses examine how components of the T-TESS observation rubric are related to school-level teacher mobility rates.

During the 2011-12 school year, 18.7% of Texas teachers moved schools within a district, moved between districts, or left the Texas public school system. By 2015-16, this mobility rate had increased to 22%. Moving between districts was the primary driver of the increase. Results indicate significant links between mobility and teacher, student, and school demographic characteristics. Teachers with special education certifications left Texas public schools at nearly twice the rate of teachers with other teaching certifications. School-level mobility rates showed significant positive correlations with the proportions of special education, economically disadvantaged, low-performing, and minority students, and were negatively correlated with the proportion of English learner students. Schools with higher overall observation ratings on the T-TESS rubric tended to have lower mobility rates.

Findings from this study will provide state and district policymakers in Texas with updated information about trends and correlates of mobility in the teaching workforce, and offer a systematic baseline for monitoring and planning for future changes. Informed by these findings, policymakers can formulate a more strategic and targeted approach for recruiting and retaining teachers. For instance, instead of using generic approaches to enhance the overall supply of teachers or improve recruitment, more targeted efforts to attract and retain teachers in specific subject areas (for example, special education), in certain stages of their career (for example, novice teachers), and in certain geographic areas are likely to be more productive. Moreover, this analysis may enrich the existing knowledge base about schools’ teacher retention and mobility in relation to the quality of their teaching force, or may inform policy discussions about the importance of a stable teaching force for teaching effectiveness.

2018-02-01

We Are Participating in the Upcoming REL Webinar on Teacher Mobility

Join Regional Educational Laboratories Midwest and Southwest for a free webinar on October 4 to learn how states can address teacher demand and mobility trends. As a partner in REL Southwest, we will be reporting on our work on teacher recruitment and retention in rural Oklahoma.

Teachers and administrators change schools for a variety of reasons. Mobility can be positive when an educator moves to a position that is a better fit, but it can also have serious implications for states. Mobility may harm schools that serve high-need populations, and it can create additional recruitment and hiring costs for districts.

This webinar focuses on research addressing the teacher pipeline and the mobility of teachers between schools and districts. Presenters will discuss two published REL Midwest research studies on teacher mobility trends and strategies for estimating teacher supply and demand. Following each presentation, leaders from the Oklahoma State Department of Education and the Minnesota Department of Education will respond to the presentations and share state initiatives to meet teacher staffing needs. Presenters also will briefly highlight two upcoming REL Southwest studies related to teacher supply and demand that are expected to be released later this year.

The studies that will be discussed are:
- An examination of the movement of educators within and across three Midwest Region states (REL Midwest, AIR)
- Strategies for estimating teacher supply and demand using student and teacher data (REL Midwest, AIR)
- Indicators of successful teacher recruitment and retention in Oklahoma rural schools (REL Southwest, Empirical Education)
- Teacher mobility in Texas: Trends and associations with student, teacher, and school characteristics (REL Southwest, AIR, Empirical Education)

This webinar is designed for state education staff, administrators in schools and districts with significant American Indian populations, American Indian community leaders, research alliance and community of practice members, and education researchers. If you cannot attend the live event, register at the link below to be notified when a recording of the webinar is available online.

Exploring Educator Movement Between Districts
October 4, 2017
10:00–11:30 a.m. PT

The Regional Educational Laboratories (RELs) build the capacity of educators to use data and research to improve student outcomes. Each REL responds to needs identified in its region and makes learning opportunities and other resources available to educators throughout the United States. The REL program is a part of the Institute of Education Sciences (IES) in the U.S. Department of Education. To receive regular updates on REL work, including events and reports, follow IES on Facebook and Twitter.

You can register for this event on the REL website.

2017-09-21

Work has Started on Analysis of Texas Educator Evaluation (T-TESS) Pilot

Empirical Education, through its contract with REL Southwest, has begun data collection for an analysis of the Texas Teacher Evaluation and Support System (T-TESS) pilot conducted by the Texas Education Agency, as announced on the IES site. Empirical’s Senior Research Scientist Val Lazarev is leading the analysis, which will focus on the elements and components of the system to better understand what T-TESS is measuring and to provide alternative approaches to forming summative or composite scores.

2015-06-05

Report completed on the effectiveness of MathForward

Empirical Education assisted Richardson Independent School District (RISD) in conducting an evaluation of the MathForward algebra readiness program published by Texas Instruments. RISD implemented MathForward in their 7th and 8th grade general mathematics and 9th grade Algebra I classes. The research employed an interrupted time series design comparing existing student achievement scores with MathForward to student achievement scores from the three years prior to the introduction of MathForward.
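As a rough sketch of the idea (with invented numbers, not the study’s data), a segmented regression is one common way to estimate an interrupted time series: fit a trend line to the pre-program years and allow both the level and the slope to change once the program is introduced. The coefficient on the post-adoption indicator then captures the jump in outcomes at the point of introduction:

```python
# Minimal segmented/interrupted time series sketch with invented data:
# outcome = b0 + b1*time + b2*after + b3*time_since_adoption
import numpy as np

years = np.arange(7, dtype=float)            # e.g. 3 pre-program years, 4 program years
after = (years >= 3).astype(float)           # 1 once the program is in place
time_since = np.clip(years - 3, 0, None)     # years elapsed since adoption

# Invented mean scores: roughly flat pre-trend, then a level jump after adoption.
scores = np.array([50.0, 51.0, 50.5, 60.0, 61.0, 60.5, 61.5])

X = np.column_stack([np.ones_like(years), years, after, time_since])
coefs, *_ = np.linalg.lstsq(X, scores, rcond=None)
b0, b1, b2, b3 = coefs

print(b2)  # estimated level change at program introduction
```

In this parameterization, b1 is the pre-existing trend, b2 is the immediate level change attributable to the interruption, and b3 is any change in slope afterward; a real analysis would of course also model student-level variation and uncertainty.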

The results of this four-year study suggest that 7th grade students scored, on average, 11 percentile points higher with MathForward than 7th grade students from the three previous years without MathForward. A similar result for the 8th grade suggests that participating students scored, on average, 9 percentile points higher. While the trend did not hold for 9th grade, further exploration suggests that 9th grade students whose teachers had three years of experience using MathForward scored higher than 9th grade students whose teachers did not use MathForward.

The report further illustrates how an interrupted time series design can be used to study a program as it is rolled out over several years. This research will be presented at the 2010 AERA conference in Denver, Colorado (Friday, April 30 — Tuesday, May 4).

2009-11-13