News from 2011
- Final Report Released on The Efficacy of PCI’s Reading Program
- Research Guidelines Re-released to Broader Audience
- District Data Study: Empirical’s Newest Research Product
- Empirical presents at AERA 2012
- Empirical's Chief Scientist co-authored a recently released NCEE Reference Report
- Expertise Provided for New York Times Front Page Story
- Join Empirical Education at ALAS, AEA, and NSDC
- New Reports Show Positive Results for Elementary Reading Program
- New RFP calls for Building Regional Research Capacity
- Quasi-experimental Design Used to Build Evidence for Adolescent Reading Intervention
- Conference Season 2011
Final Report Released on The Efficacy of PCI’s Reading Program
December 9, 2011
Empirical has released the final report of a three-year longitudinal study on the efficacy of the PCI Reading Program, which can be found on our reports page. This study, the first formal assessment of the PCI Reading Program, evaluated the program among a sample of third- through eighth-grade students with supported-level disabilities in Florida’s Brevard Public Schools and Miami-Dade County Public Schools. The primary goal of the study was to identify whether the program could achieve its intended purpose of teaching specific sight words. The study was completed in three “phases,” or school years. The results from Phases 1 and 2 showed a significant positive effect on students’ sight word achievement, and Phase 2 supported the initial expectation that two years of growth would be greater than one (read more on the results of Phase 1 and Phase 2).
“Working with Empirical Education was a win for us on many fronts. Their research was of the highest quality and has really helped us communicate with our customers through their several reports and conference presentations. They went beyond just outcomes to show how teachers put our reading program to use in classrooms. In all their dealings with PCI and with the school systems they were highly professional and we look forward to future research partnership opportunities.”
- Lee Wilson, President & CEO, PCI Educational Publishing
In Phase 3, the remaining sample of students was too small to conduct any impact analyses, so researchers investigated patterns in students’ progress through the program. The general findings were positive: the exploration confirmed that students continue to learn more sight words with a second year of exposure to PCI, although at a slower pace than the developers expected. Furthermore, findings across all three phases show high levels of teacher satisfaction with the program. Along with this positive outcome, teacher-reported student engagement levels were also high.
Research Guidelines Re-released to Broader Audience
December 5, 2011
The updated guidelines for evaluation research were unveiled at the SIIA Ed Tech Business Forum held in New York City on November 28 - 29. Authored by Empirical’s CEO, Denis Newman, and issued by the Software and Information Industry Association (SIIA), the guidelines seek to provide a standard of best practices for conducting and reporting evaluation studies for educational technologies in order to enhance the quality, credibility, and utility to education decision makers.
Denis introduced the guidelines during the "Meet the authors of SIIA Publications" session on November 29. Non-members will be able to purchase the guidelines from Selling to Schools starting Thursday, December 1, 2011 (with continued free access to SIIA members).
UPDATE: Denis was interviewed by Glen McCandless of Selling to Schools on December 15, 2011 to discuss key aspects of the guidelines. Listen to the full interview here.
District Data Study: Empirical’s Newest Research Product
November 20, 2011
Empirical Education introduces its newest offering: District Data Study. Aimed at providing evidence of effectiveness, District Data Study assists vendors in conducting quantitative case studies using historical data from schools and districts currently engaged in a specific educational program.
There are two basic questions that can be cost-effectively answered given the available data.
- Are the outcomes (behavioral or academic) for students in schools that use the program better than the outcomes of comparable students in schools not using it (or before they used it)?
- Is the amount of program usage associated with differences in outcomes?
The data studies result in concise reports on measurable academic and behavioral outcomes using appropriate statistical analyses of customer data from implementation of the educational product or program. District Data Study is built on efficient procedures and engineering infrastructure that can be applied to individual districts already piloting a program or to veteran clients with longstanding implementations.
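The two questions above lend themselves to straightforward regression analyses. As a rough, hypothetical sketch (the simulated data and variable names below are ours, not part of District Data Study's actual methodology), both can be addressed with ordinary least squares, controlling for a prior-year score to keep groups comparable:

```python
import numpy as np

# Simulated district data -- purely illustrative, not Empirical's data or models.
rng = np.random.default_rng(42)
n = 500
program = rng.integers(0, 2, n)        # 1 = program school, 0 = comparison school
pretest = rng.normal(50, 10, n)        # prior-year score (comparability control)
usage = np.where(program == 1, rng.uniform(5, 25, n), 0.0)  # hours of program use
posttest = 0.8 * pretest + 0.3 * usage + rng.normal(0, 8, n)

def ols(y, cols):
    """OLS coefficients for y regressed on an intercept plus the given columns."""
    X = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Question 1: adjusted outcome difference between program and comparison schools.
program_effect = ols(posttest, [program, pretest])[1]

# Question 2: among program users, is more usage associated with better outcomes?
users = program == 1
usage_slope = ols(posttest[users], [usage[users], pretest[users]])[1]

print(f"program effect: {program_effect:.2f} points; "
      f"per-hour usage slope: {usage_slope:.3f}")
```

A real district analysis would also account for the clustering of students within schools and classrooms; this sketch omits that for brevity.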
Empirical Presents at AERA 2012
November 18, 2011
We will again be presenting at the annual meeting of the American Educational Research Association (AERA). Join the Empirical Education team in Vancouver, Canada from April 13 – 17, 2012. Our presentations will span two divisions: 1) Measurement and Research Methodology and 2) Research, Evaluation and Assessment in Schools.
Research Topics will include:
- Current Studies in Program Evaluation to Improve Student Achievement Outcomes
- Evaluating Alabama’s Math, Science and Technology Initiative: Results of a Three-Year, State-Wide Randomized Experiment
- Accommodating Data From Quasi-Experimental Design
- Quantitative Approaches to the Evaluation of Literacy Programs and Instruction for Elementary and Secondary Students
We look forward to seeing you at our sessions to discuss our research. You can also download our presentation schedule here. As has become tradition, we plan to host yet another of our popular AERA receptions. Details about the reception will follow in the months to come.
Empirical's Chief Scientist co-authored a recently released NCEE Reference Report
November 02, 2011
Together with researchers from Abt Associates, Andrew Jaciw, Chief Scientist of Empirical Education, co-authored a recently released report entitled "Estimating the Impacts of Educational Interventions Using State Tests or Study-Administered Tests". The full report, released by the National Center for Education Evaluation and Regional Assistance (NCEE), can be found on the Institute of Education Sciences (IES) website. The NCEE Reference Report examines and identifies factors that could affect the precision of program evaluations when they are based on state assessments instead of study-administered tests. The authors found that impact estimates were more precise when:
- the same test was used for both the pre- and post-test;
- two pre-test covariates were used, one from each type of test (state assessment and study-administered standardized test); and
- the dependent variable was the simple average of the post-test scores from the two types of tests, which also reduced sample size requirements compared with using post-test scores from only one type of test.
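The intuition behind the covariate finding can be illustrated with a small simulation (the numbers and variable names below are hypothetical, not drawn from the report): when two noisy pre-tests measure the same underlying ability, using both as covariates removes more residual variance and so shrinks the standard error of the estimated impact.

```python
import numpy as np

# Simulated study data -- illustrative only, not the NCEE report's data or models.
rng = np.random.default_rng(1)
n = 1000
treat = rng.integers(0, 2, n)                # random assignment to the intervention
ability = rng.normal(0, 1, n)                # latent skill that both tests measure
pre_state = ability + rng.normal(0, 0.6, n)  # state assessment pre-test
pre_study = ability + rng.normal(0, 0.6, n)  # study-administered pre-test
post = 0.25 * treat + ability + rng.normal(0, 0.6, n)

def treatment_se(covariates):
    """Standard error of the treatment coefficient in an OLS impact model."""
    X = np.column_stack([np.ones(n), treat] + covariates)
    beta, *_ = np.linalg.lstsq(X, post, rcond=None)
    resid = post - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])
    return np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])

se_one = treatment_se([pre_state])             # one pre-test covariate
se_two = treatment_se([pre_state, pre_study])  # covariates from both test types
print(f"SE with one covariate: {se_one:.4f}; with two: {se_two:.4f}")
```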
Expertise Provided for New York Times Front Page Story
October 11, 2011
Empirical’s CEO, Denis Newman, was one of the experts consulted by New York Times reporter Trip Gabriel in his Sunday Times front-page story, “Inflating the Software Report Card.” Newman’s commentary on the first article in this series can be seen here. The article also refers to the guidelines for evaluation research issued by the Software and Information Industry Association (SIIA), which can be found on the SIIA site. In addition, the report referred to in the article—which was not authored by Newman but by a team of company researchers—can be found on our reports page. (Some readers were confused by the misspelling of Newman’s first name as “Dennis”.)
Join Empirical Education at ALAS, AEA, and NSDC
October 10, 2011
This year, the Association of Latino Administrators & Superintendents (ALAS) will be holding its 8th annual summit on Hispanic Education in San Francisco. Participants will have the opportunity to attend speaker sessions, roundtable discussions, and network with fellow attendees. Denis Newman, CEO of Empirical Education, together with John Sipe, Senior Vice President and National Sales Manager at Houghton Mifflin Harcourt, and Jeannetta Mitchell, an eighth-grade teacher at Presidio Middle School and a participant in the pilot study, will take part in a 30-minute discussion reviewing the study design and experiences gathered around a one-year study of Algebra on the iPad. The session takes place on October 13th in Salon 8 of the Marriott Marquis in San Francisco from 10:30am to 12:00pm.
Also this year, the American Evaluation Association (AEA) will be hosting its 25th annual conference from November 2–5 in Anaheim, CA. Approximately 2,500 evaluation practitioners, academics, and students from around the globe are expected to gather at the conference. This year’s theme revolves around the challenges of values and valuing in evaluation.
We are excited to be part of AEA again this year and would like to invite you to join us at two presentations. First, Denis Newman will be hosting the roundtable session on Returning to the Causal Explanatory Tradition: Lessons for Increasing the External Validity of Results from Randomized Trials. We examine how the causal explanatory tradition—originating in the writing of Lee Cronbach—can inform the planning, conduct and analysis of randomized trials to increase external validity of findings. Find us in the Balboa A/B room on Friday, November 4th from 10:45am to 11:30am.
Second, Valeriy Lazarev and Denis Newman will present a paper entitled, “From Program Effect to Cost Savings: Valuing the Benefits of Educational Innovation Using Vertically Scaled Test Scores And Instructional Expenditure Data.” Be sure to stop by on Saturday, November 5th from 9:50am to 11:20am in room Avila A.
Furthermore, Jenna Zacamy, Senior Research Manager at Empirical Education, will be presenting on two topics at the National Staff Development Council (NSDC) annual conference taking place in Anaheim, CA from December 3rd to 7th. Join her on Monday, December 5th from 2:30pm to 4:30pm, when she will talk about the impact of the Alabama Math, Science, and Technology Initiative on student achievement in grades 4 through 8, together with Pamela Finney and Jean Scott from the SERVE Center at UNCG.
On Tuesday, December 6th from 10:00am to 12:00pm, Jenna will discuss prior and current research on the effectiveness of a large-scale high school literacy reform together with Cathleen Kral from WestEd and William Loyd from Washtenaw Intermediate School District.
New Reports Show Positive Results for Elementary Reading Program
September 21, 2011
Two studies of the Treasures reading program from McGraw-Hill are now posted on our reports page. Treasures is a basal reading program for students in grades K–6. The first was a multi-site study, while the second was conducted in the Osceola school district; both found positive impacts on reading achievement in grades 3–5.
The primary data for the first study were scores supplied with district permission by Northwest Evaluation Association from their MAP reading test. The study used a quasi-experimental comparison group design based on 35 Treasures and 48 comparison schools, primarily in the Midwest. The study found that Treasures had a positive impact on overall elementary student reading scores, with the strongest effect observed in grade 5.
The second study’s data were provided by the Osceola school district and consist of demographic information, FCAT test scores, and information on student transfers during the year (between schools within the district and from other districts). The dataset for this time series design covered five consecutive school years from 2005–06 to 2009–10, including two years prior to introduction of the intervention and three years after the introduction. The study included exploration of moderators that demonstrated a stronger positive effect for students with disabilities and English learners than for the rest of the student population. We also found a stronger positive impact on girls than on boys.
Check back for results from follow-up studies, which are currently underway in other states and districts.
New RFP calls for Building Regional Research Capacity
May 11, 2011
The US Department of Education (ED) has just released the eagerly anticipated RFP for the next round of the Regional Education Laboratories (RELs). This RFP contains some very interesting departures from how the RELs have been working, which may be of interest especially to state and local educators.
For those unfamiliar with federal government organizations, the RELs are part of the National Center for Education Evaluation and Regional Assistance (abbreviated NCEE), which is within the Institute of Education Sciences (IES), part of ED. The country is divided into ten regions, each one served by a REL—so the RFP announced today is really a call for proposals in ten different competitions. The RELs have been in existence for decades but their mission has evolved over time. For example, the previous RFP (about 6 years ago) put a strong emphasis on rigorous research, particularly randomized controlled trials (RCTs), leading the contractors in each of the 10 regions to greatly expand their capacity, in part by bringing in subcontractors with the requisite technical skills. (Empirical conducted or assisted with RCTs in four of the 10 regions.) The new RFP changes the focus in two essential ways.
First, one of the major tasks is building capacity for research among practitioners. Educators at the state and local levels told ED that they needed more capacity to make use of the longitudinal data systems that ED has invested in through grants to the states. It is one thing to build the data systems. It is another thing to use the data to generate evidence that can inform decisions about policies and programs. Last month at the conference of the Society for Research on Educational Effectiveness, Rebecca Maynard, Commissioner of NCEE, talked about building a “culture of experimentation” among practitioners and building their capacity for simpler experiments that don’t take so long and are not as expensive as those NCEE has typically contracted for. Her point was that the resulting evidence is more likely to be used if the practitioners are “up close and immediate.”
The second idea found in the RFP for the RELs is that each regional lab should work through “alliances” of state and local agencies. These alliances would cross state boundaries (at least within the region) and would provide an important part of the REL’s research agenda. The idea goes beyond having an advisory panel for the REL that requests answers to questions. The alliances are also expected to build their own capacity to answer these questions using rigorous research methods but applying them cost-effectively and opportunistically. The capacity of the alliances should outlive the support provided by the RELs. If your organization is part of an existing alliance and would like to get better at using and conducting research, there are teams being formed to go after the REL contracts that would be happy to hear from you. (If you’re not sure who to call, let us know and we’ll put you in touch with an appropriate team.)
Quasi-experimental Design Used to Build Evidence for Adolescent Reading Intervention
April 15, 2011
A study of Jamestown Reading Navigator (JRN) from McGraw-Hill (now posted on our reports page), conducted in Miami-Dade County Public Schools, found positive results on the Florida state reading test (FCAT) for high school students in their intensive reading classes. JRN is an online application with internal record keeping, making it possible to identify the treatment group for a comparison design. While the district provided the full student, teacher, and roster data for 9th and 10th grade intensive reading classes, JRN’s computer logs identified the student and teacher users. The quasi-experimental design was strengthened by using schools with both JRN and non-JRN students. Of the 70 schools that had JRN logs, 23 had JRN and non-JRN intensive reading classes and sufficient data for analysis.
Conference Season 2011
February 18, 2011
Empirical researchers will again be on the road this conference season, and we’ve included a few new conference stops. Come meet our researchers as we discuss our work at the following events. If you will be present at any of these, please get in touch so we can schedule a time to speak with you, or come by to see us at our presentations.
This year, the NCES-MIS “Deep in the Heart of Data” Conference will offer more than 80 presentations, demonstrations, and workshops conducted by information system practitioners from federal, state, and local K-12 agencies.
Come by and say hello to one of our research managers, Joseph Townsend, who will be running Empirical Education’s table display at the Hilton Hotel in Austin, Texas, from February 23-25. Joe will be presenting interactive demonstrations of MeasureResults, which allows school district staff to conduct complete program evaluations online.
Attendees of this spring’s Society for Research on Educational Effectiveness (SREE) Conference, held in Washington, DC March 3-5, will have the opportunity to discuss questions of generalizability with Empirical Education’s Chief Scientist, Andrew Jaciw, and President, Denis Newman, at two poster sessions. The first poster, entitled External Validity in the Context of RCTs: Lessons from the Causal Explanatory Tradition, applies insights from Lee Cronbach to current RCT practices. In the second poster, The Use of Moderator Effects for Drawing Generalized Causal Inferences, Jaciw addresses issues in multi-site experiments. They look forward to discussing these posters both online at the conference website and in person.
We are pleased to announce that we will have our first showing this year at the Association for Education Finance and Policy (AEFP) Annual Conference. Join us in the afternoon on Friday, March 25th at the Grand Hyatt in Seattle, WA as Empirical’s research scientist, Valeriy Lazarev, presents a poster on Cost-benefit analysis of educational innovation using growth measures of student achievement.
We will again have a strong showing at the 2011 American Educational Research Association (AERA) Conference. Join us in festive New Orleans, April 8-12, for the final results on the efficacy of the PCI Reading Program, our qualitative findings from the first year of formative research on our MeasureResults online program evaluation tool, and more.
View our AERA presentation schedule for more details and a complete list of our participants.
This year’s SIIA Ed Tech Industry Summit will take place in gorgeous San Francisco, just 45 minutes north of Empirical Education’s headquarters in the Silicon Valley. We invite you to schedule a meeting with us at the Palace Hotel from May 22-24.