blog posts and news stories

Study of Alabama STEM Initiative Finds Positive Impacts

On February 21, 2012, the U.S. Department of Education released the final report of an experiment that Empirical Education has been working on for the last six years. The report, titled Evaluation of the Effectiveness of the Alabama Math, Science, and Technology Initiative (AMSTI), is now available on the Institute of Education Sciences website. The Alabama State Department of Education held a press conference to announce the findings, attended by State Superintendent of Education Bice, AMSTI staff, educators, students, and Denis Newman, CEO of Empirical Education and co-principal investigator of the study.

AMSTI was developed by the state of Alabama and introduced in 2002 with the goal of improving mathematics and science achievement in the state’s K-12 schools. Empirical Education was primarily responsible for conducting the study, including the design, data collection, analysis, and reporting, under its subcontract with the Regional Educational Laboratory Southeast (the study was initiated through a research grant to Empirical). Researchers from the Academy for Educational Development, Abt Associates, and ANALYTICA made important contributions to the design, analysis, and data collection.

The findings show that after one year, students in the 41 AMSTI schools experienced an impact on mathematics achievement equivalent to 28 days of additional student progress over students receiving conventional mathematics instruction. After one year, the study found no difference in science achievement. It also found that AMSTI had an impact on teachers’ active-learning classroom practices in math and science, which, according to the theory of action posited by AMSTI, should in turn affect achievement. Further exploratory analysis found effects on student achievement in both mathematics and science after two years. The study also explored reading achievement, where it found significant differences between the AMSTI and control groups after one year. Exploration of differential effects across student demographic categories found consistent results by gender, socio-economic status, and pretest achievement level for math and science. For reading, however, the breakdown by student ethnicity suggests a differential benefit.

Just about everybody at Empirical worked on this project at one point or another. Besides the three of us (Newman, Jaciw, and Zacamy) who are listed among the authors, we want to acknowledge past and current employees whose efforts made the project possible: Jessica Cabalo, Ruthie Chang, Zach Chin, Huan Cung, Dan Ho, Akiko Lipton, Boya Ma, Robin Means, Gloria Miller, Bob Smith, Laurel Sterling, Qingfeng Zhao, Xiaohui Zheng, and Margit Zsolnay.

With the solid cooperation of the state’s Department of Education and the AMSTI team, approximately 780 teachers and 30,000 upper-elementary and middle school students in 82 schools from five regions in Alabama participated in the study. Schools were randomized into one of two groups: 1) those that received AMSTI starting the first year, or 2) those that received “business as usual” the first year and began participation in AMSTI the second year. With only a one-year delay before the control group entered treatment, the two-year impact was estimated using statistical techniques developed by, and with the assistance of, our colleagues at Abt Associates. The Academy for Educational Development assisted with data collection and analysis of training and program implementation.
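As a rough illustration of the assignment step, school-level (cluster) randomization with a delayed-treatment control might look like the sketch below. This is a minimal, hypothetical example: the helper name and seed are ours, and the actual study procedure may have stratified by region or balanced on other factors.

```python
import random

def randomize_schools(school_ids, seed=0):
    """Randomly assign schools to an immediate-start treatment group or a
    one-year delayed-start control group. Assignment happens at the school
    (cluster) level, not the student level."""
    rng = random.Random(seed)
    shuffled = list(school_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {
        "immediate": sorted(shuffled[:half]),  # AMSTI in year 1
        "delayed": sorted(shuffled[half:]),    # business as usual, then AMSTI in year 2
    }

# 82 participating schools, as in the study
groups = randomize_schools(range(1, 83))
print(len(groups["immediate"]), len(groups["delayed"]))
```

Because the control schools enter treatment after one year, a simple treatment-vs-control contrast is only available for the first year; estimating a two-year impact requires the kind of additional statistical modeling the study credits to Abt Associates.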

Findings of the AMSTI study will also be presented at the Society for Research on Educational Effectiveness (SREE) Spring Conference taking place in Washington, D.C. from March 8-10, 2012. Join Denis Newman, Andrew Jaciw, and Boya Ma on Friday, March 9, 2012 from 3:00pm-4:30pm, when they will present findings of their study titled “Locating Differential Effectiveness of a STEM Initiative through Exploration of Moderators.” A symposium on the study, including the major study collaborators, will be presented at the annual conference of the American Educational Research Association (AERA) on April 15, 2012 from 2:15pm-3:45pm at the Marriott Pinnacle / Pinnacle III in Vancouver, Canada. This session will be chaired by Ludy van Broekhuizen (director of REL-SE) and will include presentations by Steve Ricks (director of AMSTI); Jean Scott (SERVE Center at UNCG); Denis Newman, Andrew Jaciw, Boya Ma, and Jenna Zacamy (Empirical Education); Steve Bell (Abt Associates); and Laura Gould (formerly of AED). Sean Reardon (Stanford) will serve as the discussant. A synopsis of the study will also be included in the Common Guidelines for Education Research and Development.

2012-02-21

Empirical Education Partners with Carnegie Learning on New Student Performance Guarantee

Schools looking to improve student achievement in algebra and geometry have signed up for a guarantee from Carnegie Learning® stating that students using the company’s Cognitive Tutor programs will pass their math courses. Empirical Education is tasked with monitoring student performance in participating schools. Starting this school year, Carnegie Learning guarantees that students who take three complete and consecutive years of Carnegie Learning’s math courses will pass their math class in the third year. The guarantee applies to middle and high school students taking the Carnegie Learning Bridge to Algebra, Algebra, Algebra II, and Geometry courses.

In the coming weeks and months, Empirical will collect roster data, course grades, and assessment scores from schools, as well as usage data from Carnegie Learning’s math teaching software. These data will be combined to generate biannual reports that give schools evidence they can use to improve implementation of the courses and raise student achievement.

Carnegie Learning’s guarantee is part of their School Improvement Grant support efforts. “Partnering with Empirical Education will allow us to get mid- and end-of-year research reports into the hands of our school partners,” says Steve Ritter, Co-Founder and Chief Scientist at Carnegie Learning. “It’s part of our continuous improvement cycle; we’re excited to see the progress districts committed to the turnaround and transformation process can make with these new, powerful tools.”

2010-11-30

Poway Completes Study from MeasureResults Pilot

The results are in for Poway Unified School District’s first research study using our MeasureResults online tool. PUSD was interested in measuring the impact of CompassLearning’s Odyssey Reading program in the middle grades. Using an “interrupted time series” design with a comparison group, they found that both 7th and 8th grade students averaged 1 to 2 points higher than expected on the NWEA MAP Literacy assessment. PUSD plans to continue their evaluation of CompassLearning Odyssey in different subject areas and grade levels. Join us in D.C. at this year’s CoSN conference on March 1, 2010 as Eric Lehew, Executive Director of Learning Support Services at PUSD, presents findings and reflections on the process of using MeasureResults to conduct research at the local district level.

A copy of the PUSD achievement report is available for download.

2010-02-12

Report completed on the effectiveness of MathForward

Empirical Education assisted Richardson Independent School District (RISD) in conducting an evaluation of the MathForward algebra readiness program published by Texas Instruments. RISD implemented MathForward in its 7th and 8th grade general mathematics and 9th grade Algebra I classes. The research employed an interrupted time series design, comparing student achievement scores under MathForward with scores from the three years prior to the program’s introduction.
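The logic of an interrupted time series comparison can be sketched as a simple segmented regression: fit the pre-intervention trend and estimate the level shift in the intervention year. The sketch below is illustrative only; the scores are made-up numbers, not RISD’s data, and the actual study would have used a more elaborate model (e.g., student-level covariates and clustering).

```python
import numpy as np

# Hypothetical mean percentile scores for four annual cohorts.
# The first three years precede MathForward; year 3 is the MathForward year.
years = np.array([0, 1, 2, 3])
treated = np.array([0, 0, 0, 1])        # interruption indicator
scores = np.array([50.0, 51.0, 49.5, 61.0])

# Segmented regression: score = b0 + b1*year + b2*treated.
# b2 estimates the jump above the pre-intervention trend line.
X = np.column_stack([np.ones_like(years), years, treated])
(b0, b1, b2), *_ = np.linalg.lstsq(X, scores, rcond=None)
print(f"estimated level shift at interruption: {b2:.1f} percentile points")
# → estimated level shift at interruption: 11.3 percentile points
```

With only one post-intervention year per grade, the design leans on the stability of the pre-intervention trend, which is why rolling the program out over several years (as discussed below in the report) strengthens the inference.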

The results of this four-year study suggest that 7th grade students scored, on average, 11 percentile points higher with MathForward than 7th grade students from the three previous years without it. A similar result for 8th grade suggests that participating students scored, on average, 9 percentile points higher. While the trend did not hold for 9th grade, further exploration suggests that 9th grade students whose teachers had three years of experience using MathForward scored higher than 9th grade students whose teachers did not use MathForward.

The report further illustrates how an interrupted time series design can be used to study a program as it is rolled out over several years. This research will be presented at the 2010 AERA conference in Denver, Colorado (Friday, April 30 to Tuesday, May 4).

2009-11-13