
New Mexico Implementation


Empirical Education and the New Mexico Public Education Department (NMPED) are entering their fourth year of collaboration, using Observation Engine to increase educator effectiveness by improving understanding of the NMTEACH observation protocol and inter-rater reliability among observers using it. During the implementation, Observation Engine has been used for calibration and professional development with over 2,000 educators across the state annually. In partnership with the Southern Regional Education Board (SREB), which is providing training on best practices, the users in New Mexico have pushed the boundaries of what is possible with Observation Engine. Observation Engine was initially used solely for certifying observers prior to live classroom observations. Now, observers rely on Observation Engine’s lesson functionality for professional development throughout the year. In addition, some administrators are using videos and content from Observation Engine directly with teachers to provide them with models of what good instruction looks like.

The exciting news is that the collaborative efforts of NMPED, SREB, and Observation Engine are producing results across New Mexico that are noteworthy, especially when compared to the rest of the nation. In a compilation of teacher performance ratings from 19 states that have reformed their evaluation systems since the seminal Widget Effect report, Kraft and Gilmour (2016) found that in a majority of these states, fewer than 3% of teachers were rated below proficient. New Mexico stood out as an outlier among these states, with 26.2% of teachers rated below proficient, a percentage comparable with more realistic pilots of educator effectiveness ratings. This is likely a sign of excellent professional development, as well as a willingness to realistically adjust the thresholds for proficiency based on data from actual practice, such as the data captured within Observation Engine.

Kraft, M.A., & Gilmour, A.F. (2016). Revisiting the Widget Effect: Teacher Evaluation Reforms and the Distribution of Teacher Effectiveness. Brown University working paper. Retrieved July 21, 2016, from https://scholar.harvard.edu/mkraft/publications/revisiting-widget-effect-teacher-evaluation-reforms-and-distribution-teacher.

2016-12-02

Empirical Education Publication Productivity


Empirical Education’s research group, led by Chief Scientist Andrew Jaciw, has been busy publishing articles that address key concerns of educators and researchers.

Our article describing the efficacy trial of the Math in Focus program, accepted by JREE earlier this year, arrived in print at our Palo Alto office a couple of weeks ago. If you subscribe to JREE, it’s the very first article in the current issue (volume 9, number 4). If you don’t subscribe, we have a copy in our lobby if you’d like to stop by and check it out.

Another article the analysis team has been working on is called “An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials.” It has recently been accepted for publication in Evaluation Review, in an issue that should be printed any day now. The paper grows out of our work on many cluster randomized trials and our interest in the differential impacts of programs. We believe that the question of “what works” has limited meaning without systematic exploration of “for whom” and “under what conditions.” The common perception is that these latter concerns are secondary and that our designs have too little power to assess them. We challenge these notions and provide guidelines for addressing these questions.

In another issue of Evaluation Review, we published two companion articles:

Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach: The Methodology


Applications of a Within-Study Comparison Approach for Evaluating Bias in Generalized Causal Inferences from Comparison Groups Studies

This work further extends our interest in issues of external validity and equips researchers with a strategy for testing the limits of generalizations from randomized trials. Written for a technical audience, the work extends an approach commonly used to assess levels of selection bias in estimates from non-experimental studies to examine bias in generalized inferences from experiments and non-experiments.

It’s always exciting for our team to share the findings from our experiments, as well as the things we learn during the analysis that can help the evaluation community provide more productive evidence for educators. Much of our work is done in partnership with other organizations and if you’re interested in partnering with us on this kind of work, please email Robin Means.

2016-11-18

Pittsburgh Public Schools Uses 20 New Content Suite Videos


In June 2015, Pittsburgh Public Schools began using Observation Engine to calibrate and train their teacher evaluators. They were one of our first clients to use the Content Suite to calibrate and certify classroom observers.

The Content Suite contains a collection of master-scored videos along with thoughtful, objective score justifications for all observable elements of teaching called for by the evaluation framework used in the district. It also includes short video clips, each focused on one particular aspect of teaching. The combination of full-length videos and short clips makes it possible to easily and flexibly set up practice exercises, collaborative calibration sessions, and formal certification testing. Recently, we added 20 new videos to the Content Suite collection!

The Content Suite can be used with most frameworks, either as-is or modified to ensure that the scores and justifications are consistent with the local context and observation framework interpretation. Observation Engine support staff will work closely with each client to modify content and design a customized implementation plan that meets the goals of the school system and sets up evaluators for success. For more information about the Content Suite, click here.

2016-11-10

Arkansas Implements Observation Engine Statewide

BloomBoard’s observation tool, EdReflect, has been used across the state of Arkansas since fall 2014. Last year, the Arkansas Department of Education piloted Observation Engine, an online observation training and calibration tool from Empirical Education Inc., in four districts under the state’s Equitable Access Plan. Accessible through the BloomBoard platform, Observation Engine allows administrators and other teacher evaluators to improve scoring calibration and reliability through viewing and rating videos of classroom lessons collected in thousands of classrooms across the country.

Paired with BloomBoard resources and training, the results were impressive. In one district, the percentage of observers scoring above target increased from 43% to 100%. Not only that, but the percentage of discrepant scores (scores two levels above or below the target) decreased from 9% to 0%. Similar results were found in the other three pilot districts, prompting decision makers to make Observation Engine readily available to districts throughout the state.

“EdReflect has proven to be a valuable platform for educator observations in Arkansas. The professional conversation, which results from the ability to provide timely feedback and shared understanding of effective practice, has proven to ensure a transparency and collaboration that we have not experienced before. With the addition of Empirical Education’s Observation Engine, credentialed teacher observers have ready access to increase inter-rater reliability and personal skill. For the first time this year, BloomBoard Collections and micro-credentials have begun meeting individualized professional learning needs for educators all over the state.”
– Sandra Hurst, Arkansas Department of Education

In July, the Arkansas Department of Education decided to offer Observation Engine to the entire state. About half of all districts in the state opted in to receive the service, with the implementation spanning three groups of users in Arkansas. The Beginning Administrators group has already started pursuing a micro-credential based on Observation Engine. Micro-credentials are a digital form of certification indicating that a person has demonstrated competency in a specific skill set. The Beginning Administrators group can earn the “Observation Skills for Beginning Administrators” micro-credential by using Observation Engine’s online calibration tool to practice and demonstrate their observation skill competencies.

Next month, 26 more districts under the Equitable Access Plan and the remaining Arkansas districts will begin using Observation Engine. We look forward to following and reporting on the progress of these districts during the 2016-17 school year.

2016-11-02

Presenting Research-Based Tools and Resources for Family & Community Engagement at the NIEA Convention

On October 8, 2016, Jenna Zacamy, Erica Plut, and Haidee Williams (AIR) presented a workshop at the National Indian Education Association convention in Reno, Nevada. The workshop, Research-Based Tools and Resources for Family & Community Engagement, provided approximately 50 attendees with research literature on this topic, examples of promising practices to engage families and communities in various contexts gathered from the literature, and a method for exploring local programs and planning for improvements. Conversations were rich as attendees shared their current strategies and discussed new ideas. Implementing research-based practices for family and community engagement specific to Native Americans can help improve engagement and achievement outcomes in these unique communities.

2016-10-18

Presentation at the 2016 Learning Forward Annual Conference

Learning Forward announced that our proposal was accepted for the 2016 annual conference being held in Vancouver, Canada this year. Teacher Evaluation Specialist K.C. MacQueen will join Fort Wayne Community Schools’ (FWCS) Todd Cummings and Learning Forward’s Kay Psencik in presenting “Principals Collaborating to Deepen Understanding of High-Quality Instruction.” They will highlight how FWCS is engaged in a process to ensure equitable evaluation of teacher effectiveness using Observation Engine™. If you or someone you know is attending the annual conference in December 2016, here are the details of the presentation.

  • Day/time: Tuesday, December 6, 2016 from 10AM-Noon
  • Session: I 15
2016-08-02

Report of the Evaluation of iRAISE Released

Empirical Education Inc. has completed its evaluation (read the report here) of an online professional development program for Reading Apprenticeship. WestEd’s Strategic Literacy Initiative (SLI) was awarded a development grant under the Investing in Innovation (i3) program in 2012. iRAISE (internet-based Reading Apprenticeship Improving Science Education) is an online professional development program for high school science teachers. iRAISE trained more than 100 teachers in Michigan and Pennsylvania over the three years of the grant. Empirical’s randomized controlled trial measured the impact of the program on students, with special attention to differences in their incoming reading achievement levels.

The goal of iRAISE was to improve student achievement by training teachers in the use of Reading Apprenticeship, an instructional framework that describes the classroom in four interacting dimensions of learning: social, personal, cognitive, and knowledge-building. The inquiry-based professional development (PD) model included a week-long Foundations training in the summer; monthly synchronous group sessions and smaller personal learning communities; and asynchronous discussion groups designed to change teachers’ understanding of their role in adolescent literacy development and to build capacity for literacy instruction in the academic disciplines. iRAISE adapted an earlier face-to-face version of Reading Apprenticeship professional development, which was studied under an earlier i3 grant, Reading Apprenticeship Improving Secondary Education (RAISE), into a completely online course, creating a flexible, accessible platform.

To evaluate iRAISE, Empirical Education conducted an experiment in which 82 teachers across 27 schools were randomly assigned either to receive the iRAISE professional development during the 2014-15 school year or to continue with business as usual and receive the program one year later. Data collection included monthly teacher surveys that measured the use of several classroom instructional practices, and a spring administration of an online literacy assessment, developed by Educational Testing Service, to measure student achievement in literacy. We found significant positive impacts of iRAISE on several of the classroom practice outcomes, including teachers’ provision of explicit instruction on comprehension strategies, their use of metacognitive inquiry strategies, and their levels of confidence in literacy instruction. These results are consistent with the prior RAISE research study and are an important replication of the previous findings, as they substantiate the success of SLI’s development of a more accessible online version of its teacher PD. After a one-year implementation of iRAISE, we did not find an overall effect of the program on student literacy achievement. However, we did find that levels of incoming reading achievement moderate the impact of iRAISE on general reading literacy, such that lower-scoring students benefit more. The success of iRAISE in adapting immersive, high-quality professional development to an online platform is promising for the field.
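For readers unfamiliar with moderation analyses, the finding that lower-scoring students benefit more corresponds to a treatment-by-pretest interaction. The sketch below illustrates the idea on simulated data; the variable names, coefficients, and sample size are hypothetical and are not the actual iRAISE data or model specification.

```python
import numpy as np

# Hypothetical sketch of a moderation (treatment-by-pretest) analysis;
# all data and coefficients are simulated, not from the iRAISE study.
rng = np.random.default_rng(0)
n = 2000
pretest = rng.normal(0.0, 1.0, n)   # incoming reading achievement (standardized)
treat = rng.integers(0, 2, n)       # 1 = received PD, 0 = business as usual

# Simulate a negative interaction: the treatment effect is larger
# for students with lower pretest scores.
posttest = (0.10 * treat + 0.50 * pretest
            - 0.15 * treat * pretest + rng.normal(0.0, 1.0, n))

# Design matrix: intercept, treatment, pretest, treatment x pretest
X = np.column_stack([np.ones(n), treat, pretest, treat * pretest])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)
# beta[3] estimates the interaction; a negative estimate means the
# benefit of treatment shrinks as pretest scores rise.
```

In practice such a model would also account for the clustered random assignment (teachers within schools), which this simple least-squares sketch omits.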

You can access the report and research summary from the study using the links below.
iRAISE research report
iRAISE research summary

2016-07-01

Empirical’s Impact as a Service Providing Insight to EdTech Companies

Education innovators and entrepreneurs have been receiving a boost of support from private equity investors. Currently, ASU GSV is holding its 2016 Summit to support new businesses whose goal is to make a difference in education. Reach Newschools Capital (Reach) is one such organization, providing early-stage funding, as well as business acumen, to entrepreneurs who are trying to solve the most challenging issues in K-12 education, often with the most challenged populations. Through Empirical Education, Reach is providing research services that examine the demographics of the constituents these education innovators hope to serve. Examining company data from 20 of Reach’s portfolio companies, Empirical provides reports and easy-to-read graphs comparing customer demographic information to national average estimates.

The reports have been well received, providing the kind of information companies need to stay on mission: economically, through goods and services, and as social impact.

“The Edtech industry is trying to change the perception that the latest and greatest technologies are only reaching the wealthiest students with the most resources. These reports are disproving this claim, showing that there are a large number of low-income, minority students utilizing these products,” said Aly Sharp, Product Manager for Empirical Education.

2016-04-19

Math in Focus Paper Published in JREE

Chief Scientist Andrew Jaciw’s paper, entitled Assessing Impacts of Math in Focus, a “Singapore Math” Program, was accepted by the Journal of Research on Educational Effectiveness. The paper reports the results of an RCT conducted in Clark County (Las Vegas, NV) by a team that included Whitney Hegseth, Li Lin, Megan Toby, Denis Newman, Boya Ma, and Jenna Zacamy. From the abstract (available online here):

Twenty-two grade-level teams across twelve schools were randomized to the program or business as usual. Measures included indicators of fidelity to treatment, and student mathematics learning. Impacts on mathematics achievement ranged between .11 and .15 standard deviation units, with no differential impact based on level of pretest [or] minority status.
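For readers unfamiliar with "standard deviation units," the sketch below shows how a standardized effect size of this kind is computed; the scores, group sizes, and group means are simulated for illustration and are not the actual study data or analysis model.

```python
import numpy as np

# Hypothetical illustration of an effect size in standard deviation units
# (simulated data, not the Math in Focus trial's data or model).
rng = np.random.default_rng(42)
control = rng.normal(500, 50, 300)    # simulated scores, business as usual
treatment = rng.normal(506, 50, 300)  # simulated scores, program group

# Standardized mean difference: raw difference divided by the pooled SD
pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
effect_size = (treatment.mean() - control.mean()) / pooled_sd
# An impact of .11-.15 standard deviation units means the program group
# scored about 0.11-0.15 pooled SDs higher than the comparison group.
```

Reporting impacts this way lets results be compared across different tests and scales, which is why education RCTs typically report standardized rather than raw score differences.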

2016-03-30

Five-year evaluation of Reading Apprenticeship i3 implementation reported at SREE

Empirical Education has released two research reports on the scale-up and impact of Reading Apprenticeship, as implemented under one of the first cohorts of Investing in Innovation (i3) grants. The Reading Apprenticeship Improving Secondary Education (RAISE) project reached approximately 2,800 teachers in five states with a program providing teacher professional development in content literacy in three disciplines: science, history, and English language arts. RAISE supported Empirical Education and our partner, IMPAQ International, in evaluating the innovation through both a randomized controlled trial encompassing 42 schools and a systematic study of the scale-up of 239 schools. The RCT found a significant impact on student achievement in science classes, consistent with prior studies. Mean impact across subjects, while positive, did not reach the .05 level of significance. The scale-up study found evidence that the strategy of building cross-disciplinary teacher teams within the school is associated with growth and sustainability of the program. Both components of the evaluation were presented at the annual conference of the Society for Research on Educational Effectiveness, March 6-8, 2016, in Washington, DC. Cheri Fancsali (formerly of IMPAQ, now at Research Alliance for NYC Schools) presented results of the RCT. Denis Newman (Empirical) presented a comparison of RAISE as instantiated in the RCT and scale-up contexts.

You can access the reports and research summaries from the studies using the links below.
RAISE RCT research report
RAISE RCT research summary
RAISE Scale-up research report
RAISE Scale-up research summary

2016-03-09