Blog Posts and News Stories

Presenting Research-Based Tools and Resources for Family & Community Engagement at the NIEA Convention

On October 8, 2016, Jenna Zacamy, Erica Plut, and Haidee Williams (AIR) presented a workshop at the National Indian Education Association convention in Reno, Nevada. The workshop, Research-Based Tools and Resources for Family & Community Engagement, provided approximately 50 attendees with research literature on the topic, examples gathered from that literature of promising practices for engaging families and communities in various contexts, and a method for examining local programs and planning improvements. Conversations were rich as attendees shared their current strategies and discussed new ideas. Implementing research-based practices for family and community engagement specific to Native Americans can help improve engagement and achievement outcomes in these unique communities.


Presentation at the 2016 Learning Forward Annual Conference

Learning Forward announced that our proposal was accepted for the 2016 annual conference, which is being held this year in Vancouver, Canada. Teacher Evaluation Specialist K.C. MacQueen will join Fort Wayne Community Schools’ (FWCS) Todd Cummings and Learning Forward’s Kay Psencik in presenting “Principals Collaborating to Deepen Understanding of High-Quality Instruction.” They will highlight how FWCS is engaged in a process to ensure equitable evaluation of teacher effectiveness using Observation Engine™. If you or someone you know is attending the annual conference in December 2016, here are the details of the presentation.

  • Day/time: Tuesday, December 6, 2016, from 10:00 AM to noon
  • Session: I 15

Report of the Evaluation of iRAISE Released

Empirical Education Inc. has completed its evaluation (read the report here) of an online professional development program for Reading Apprenticeship. WestEd’s Strategic Literacy Initiative (SLI) was awarded a development grant under the Investing in Innovation (i3) program in 2012. iRAISE (internet-based Reading Apprenticeship Improving Science Education) is an online professional development program for high school science teachers. iRAISE trained more than 100 teachers in Michigan and Pennsylvania over the three years of the grant. Empirical’s randomized controlled trial measured the impact of the program on students, with special attention to differences in their incoming reading achievement levels.

The goal of iRAISE was to improve student achievement by training teachers in the use of Reading Apprenticeship, an instructional framework that describes the classroom in four interacting dimensions of learning: social, personal, cognitive, and knowledge-building. The inquiry-based professional development (PD) model included a week-long Foundations training in the summer; monthly synchronous group sessions and smaller personal learning communities; and asynchronous discussion groups designed to change teachers’ understanding of their role in adolescent literacy development and to build capacity for literacy instruction in the academic disciplines. iRAISE adapted an earlier face-to-face version of Reading Apprenticeship professional development, which was studied under an earlier i3 grant, Reading Apprenticeship Improving Secondary Education (RAISE), into a completely online course, creating a flexible, accessible platform.

To evaluate iRAISE, Empirical Education conducted an experiment in which 82 teachers across 27 schools were randomly assigned either to receive the iRAISE professional development during the 2014-15 school year or to continue with business as usual and receive the program one year later. Data collection included monthly teacher surveys that measured their use of several classroom instructional practices and a spring administration of an online literacy assessment, developed by Educational Testing Service, to measure student achievement in literacy. We found significant positive impacts of iRAISE on several of the classroom practice outcomes, including teachers providing explicit instruction on comprehension strategies, their use of metacognitive inquiry strategies, and their levels of confidence in literacy instruction. These results were consistent with the prior RAISE study and are an important replication of those findings, as they substantiate the success of SLI’s development of a more accessible online version of their teacher PD. After one year of implementation, we did not find an overall effect of the program on student literacy achievement. However, we did find that levels of incoming reading achievement moderated the impact of iRAISE on general reading literacy, such that lower-scoring students benefited more. The success of iRAISE in adapting immersive, high-quality professional development to an online platform is promising for the field.
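A moderation finding like the one described is typically estimated with a treatment-by-pretest interaction in a regression model. The sketch below is purely illustrative: the data are simulated and every coefficient is an assumption, not a figure from the iRAISE study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
treat = rng.integers(0, 2, n)      # 1 = assigned to PD, 0 = business as usual
pretest = rng.normal(0, 1, n)      # standardized incoming reading score

# Simulated truth: a negative interaction means lower-scoring
# students benefit more from the treatment (assumed values).
post = 0.2 * treat + 0.5 * pretest - 0.15 * treat * pretest + rng.normal(0, 0.5, n)

# OLS with an interaction term: post ~ 1 + treat + pretest + treat:pretest
X = np.column_stack([np.ones(n), treat, pretest, treat * pretest])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)
print(beta)  # beta[3] is the moderation (interaction) coefficient
```

A negative estimate for the interaction coefficient is what "lower-scoring students benefit more" looks like in this kind of model.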

You can access the report and research summary from the study using the links below.
iRAISE research report
iRAISE research summary


Empirical’s Impact as a Service Providing Insight to EdTech Companies

Education innovators and entrepreneurs have been receiving a boost of support from private equity investors. Currently, ASU GSV is holding its 2016 Summit to support new businesses whose goal is to make a difference in education. Reach Newschools Capital (Reach) is one such organization, providing early-stage funding, as well as business acumen, to entrepreneurs who are trying to solve the most challenging issues, often among the most challenged populations, in K-12 education. Through Empirical Education, Reach is providing research services that examine the demographics of the constituents these education innovators hope to serve. Using company data from 20 of Reach’s portfolio companies, Empirical provides reports and easy-to-read graphs comparing customer demographic information to national average estimates.

The reports have been well received, providing the kind of information companies need to stay on mission: economically, through their goods and services, and in their social impact.

“The Edtech industry is trying to change the perception that the latest and greatest technologies are only reaching the wealthiest students with the most resources. These reports are disproving this claim, showing that there are a large number of low-income, minority students utilizing these products,” said Aly Sharp, Product Manager for Empirical Education.


Math in Focus Paper Accepted to JREE

Chief Scientist Andrew Jaciw’s paper, “Assessing Impacts of Math in Focus, a ‘Singapore Math’ Program,” was accepted by the Journal of Research on Educational Effectiveness. The paper reports the results of an RCT conducted in Clark County (Las Vegas, NV) by a team that included Whitney Hegseth, Li Lin, Megan Toby, Denis Newman, Boya Ma, and Jenna Zacamy. From the abstract (available online here):

Twenty-two grade-level teams across twelve schools were randomized to the program or business as usual. Measures included indicators of fidelity to treatment, and student mathematics learning. Impacts on mathematics achievement ranged between .11 and .15 standard deviation units, with no differential impact based on level of pretest [or] minority status.
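The abstract describes cluster random assignment: grade-level teams, not individual students, were randomized. A minimal sketch of that kind of assignment is below; the school and team labels, the split across schools, and the seed are all invented for illustration.

```python
import random

random.seed(42)

# 22 hypothetical grade-level teams across 12 schools
# (two teams in ten schools, one team in the other two; labels invented)
teams = [f"school{s:02d}-team{t}"
         for s in range(1, 13)
         for t in ([1, 2] if s <= 10 else [1])]

# Randomize the clusters, then split them into the two conditions
shuffled = teams[:]
random.shuffle(shuffled)
program = sorted(shuffled[:11])   # assigned to the program
control = sorted(shuffled[11:])   # business as usual
print(len(program), len(control))
```

Randomizing at the team level means every student on a team shares their team's condition, which is why impact analyses for designs like this must account for clustering.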


Five-Year Evaluation of Reading Apprenticeship i3 Implementation Reported at SREE

Empirical Education has released two research reports on the scale-up and impact of Reading Apprenticeship, as implemented under one of the first cohorts of Investing in Innovation (i3) grants. The Reading Apprenticeship Improving Secondary Education (RAISE) project reached approximately 2,800 teachers in five states with a program providing teacher professional development in content literacy in three disciplines: science, history, and English language arts. The RAISE grant supported Empirical Education and our partner, IMPAQ International, in evaluating the innovation through both a randomized controlled trial encompassing 42 schools and a systematic study of the scale-up of 239 schools. The RCT found a significant impact on student achievement in science classes, consistent with prior studies. Mean impact across subjects, while positive, did not reach the .05 level of significance. The scale-up study found evidence that the strategy of building cross-disciplinary teacher teams within the school is associated with growth and sustainability of the program. Both components of the evaluation were presented at the annual conference of the Society for Research on Educational Effectiveness, March 6-8, 2016, in Washington, DC. Cheri Fancsali (formerly of IMPAQ, now at the Research Alliance for NYC Schools) presented results of the RCT. Denis Newman (Empirical) presented a comparison of RAISE as instantiated in the RCT and scale-up contexts.

You can access the reports and research summaries from the studies using the links below.
RAISE RCT research report
RAISE RCT research summary
RAISE Scale-up research report
RAISE Scale-up research summary


Evaluation Concludes Aspire’s PD Tools Show Promise to Impact Classroom Practice

Empirical Education Inc. has completed an independent evaluation (read the report here) of a set of tools and professional development opportunities developed and implemented by Aspire Public Schools under an Investing in Innovation (i3) grant. Aspire was awarded the development grant in the 2011 funding cycle and put the system, Transforming Teacher Talent (t3), into operation in 2013 in their 35 California schools. The goal of t3 was to improve teacher practice as measured by the Aspire Instructional Rubric (AIR) and thereby improve student outcomes on the California Standards Test (CST), the state assessment. Some of the t3 components connected the AIR scores from classroom observations to individualized professional development materials building on tools from BloomBoard, Inc.

To evaluate t3, Empirical principal investigator Andrew Jaciw and his team designed the strongest feasible evaluation. Because Aspire could not run two versions of the technology infrastructure supporting t3, splitting the schools into treatment and comparison groups in a randomized experiment or other comparison group design was not feasible. Working with the National Evaluation of i3 (NEi3) team, Empirical developed a correlational design comparing two years of teacher AIR scores and student CST scores; that is, from the 2012-13 school year to the first year of implementation, 2013-14. Because the state was transitioning to new Common Core tests, the evaluation was unable to collect student outcomes systematically. The AIR scores, however, provided evidence of substantial overall improvement, with an effect size of 0.581 standard deviations (p < .001). The evidence meets the standards for “evidence-based” as defined in the recently enacted Every Student Succeeds Act (ESSA), which requires, at the least, that the test of the intervention “demonstrates a statistically significant effect on improving…relevant outcomes based on…promising evidence from at least 1 well designed and well-implemented correlational study with statistical controls for selection bias.” A demonstration of promise can assist in obtaining federal and other funding.
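An effect size like the one reported is a standardized mean difference: the change in average scores divided by a pooled standard deviation. The sketch below shows the arithmetic with invented rubric scores, not Aspire’s AIR data.

```python
import math

# Invented year-over-year rubric scores for a handful of teachers
year1 = [2.1, 2.4, 2.0, 2.6, 2.3, 2.2, 2.5, 2.1]
year2 = [2.6, 2.9, 2.4, 3.0, 2.8, 2.7, 3.0, 2.5]

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    # Sample standard deviation (n - 1 in the denominator)
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

# Pool the two years' standard deviations, then standardize the gain
pooled = math.sqrt((sd(year1) ** 2 + sd(year2) ** 2) / 2)
effect_size = (mean(year2) - mean(year1)) / pooled
print(round(effect_size, 3))
```

Expressing the gain in standard deviation units is what lets a result like 0.581 be compared across different measures and studies.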


SREE Spring 2016 Conference Presentations

We are excited to be presenting two topics at the annual Spring Conference of The Society for Research on Educational Effectiveness (SREE) next week. Our first presentation addresses the problem of using multiple pieces of evidence to support decisions. Our second presentation compares the context of an RCT with schools implementing the same program without those constraints. If you’re at SREE, we hope to run into you, either at one of these presentations (details below) or at one of yours.

Friday, March 4, 2016 from 3:30 - 5PM
Roosevelt (“TR”) - Ritz-Carlton Hotel, Ballroom Level

6E. Evaluating Educational Policies and Programs
Evidence-Based Decision-Making and Continuous Improvement

Chair: Robin Wisniewski, RTI International

Does “What Works”, Work for Me?: Translating Causal Impact Findings from Multiple RCTs of a Program to Support Decision-Making
Andrew P. Jaciw, Denis Newman, Val Lazarev, & Boya Ma, Empirical Education

Saturday, March 5, 2016 from 10AM - 12PM
Culpeper - Fairmont Hotel, Ballroom Level

Session 8F: Evaluating Educational Policies and Programs & International Perspectives on Educational Effectiveness
The Challenge of Scale: Evidence from Charters, Vouchers, and i3

Chair: Ash Vasudeva, Bill & Melinda Gates Foundation

Comparing a Program Implemented under the Constraints of an RCT and in the Wild
Denis Newman, Valeriy Lazarev, & Jenna Zacamy, Empirical Education


Learning Forward Presentation Highlights Fort Wayne Partnership

This past December, Teacher Evaluation Specialist K.C. MacQueen presented at the annual Learning Forward conference, alongside Fort Wayne Community Schools’ (FWCS) Todd Cummings and Laura Cain and Learning Forward’s Kay Psencik. The presentation, titled “Implementing Inter-Rater Reliability in a Learning System,” highlighted how FWCS has used the Calibration & Certification Engine (CCE), School Improvement Network’s branded version of Observation Engine™, to ensure equitable evaluation of teacher effectiveness. FWCS detailed the process it used to engage instructional leaders in developing a common rubric vocabulary around its existing teacher observation rubric. Though this step was uncommon and added to the implementation timeline, FWCS prioritized the collaboration and found that it increased both inter-rater reliability and buy-in, with the ultimate goal of helping teachers improve classroom instruction and produce greater student growth.


Feds Moving Toward a More Rational and Flexible Approach to Teacher Support and Evaluation

Congress is finally making progress on a bill to replace NCLB. Here’s an excerpt from a summary of the draft law.

Helps states support teachers – The bill provides resources to states and school districts to implement various activities to support teachers, principals, and other educators, including allowable uses of funds for high quality induction programs for new teachers, ongoing professional development opportunities for teachers, and programs to recruit new educators to the profession.

Ends federal mandates on evaluations, allows states to innovate – The bill allows, but does not require, states to develop and implement teacher evaluation systems. This bill eliminates the definition of a highly qualified teacher—which has proven onerous to states and school districts—and provides states with the opportunity to define this term.

This is very positive. It makes teacher evaluation no longer an Obama-imposed requirement but allows states that want to do it (and there are quite a few of those) to use federal funds to support it. It removes the irrational requirement that “student growth” be a major component of these systems. This will lower the reflexive resistance from unions, because the purpose of evaluation can be more clearly associated with teacher support (for more on that argument, see the Real Clear Education piece). It will also encourage the use of observation and feedback from administrators and mentors. Removing the outmoded definition of “highly qualified teacher” opens up the possibility of wider use of research-based analyses of what is important to measure in effective teaching.

A summary is also provided by EdWeek. On a separate note, it says: “That new research and innovation program that some folks were describing as sort of a next generation ‘Investing in Innovation’ program made it into the bill. (Sens. Orrin Hatch, R-Utah, and Michael Bennet, D-Colo., are big fans, as is the administration.)”