blog posts and news stories

Uncovering ARRA’s Research Requirements

Researchers at Empirical Education gave state department of education officials a detailed overview of the research themes and requirements of the ARRA stimulus initiatives during their December 9th webinar, “Meet Stimulus Funds’ Research Requirements with Confidence.” The webinar offered specific examples of how states can begin planning their applications and building research partnerships, along with an overview of ED’s current thinking about building local research capacity. The initiatives discussed included Race to the Top, Enhancing Education Through Technology, Investing in Innovation, Title I School Improvement Grants, and State Longitudinal Data Systems.

2009-12-17

Four Presentations Accepted for AERA 2010

Empirical Education will be heading to the snow-capped mountains of Denver, Colorado next April! Once again, Empirical will have a strong showing at the 2010 American Educational Research Association conference, which will be held at the Colorado Convention Center from April 30 to May 4, 2010. Our presentations will span several divisions, including Learning & Instruction; Measurement & Research Methodology; and Research, Evaluation, & Assessment in Schools. Research topics will include:

  • Examining the Efficacy of a Sight-Word Reading Program for Students with Significant Cognitive Disabilities: Phase 2

  • Matched Pairs, ICCs and R-squared: Lessons from Several Effectiveness Trials in Education

  • Addressing Challenges of Within School Randomization

  • Measuring the Impact of a Math Program As It Is Rolled Out Over Several Years

Following two years of successful AERA receptions, Empirical Education plans to host another “meet and greet” at this year’s conference. Join our mailing list to receive the details.

2009-12-01

Report completed on the effectiveness of MathForward

Empirical Education assisted Richardson Independent School District (RISD) in conducting an evaluation of the MathForward algebra readiness program published by Texas Instruments. RISD implemented MathForward in its 7th and 8th grade general mathematics and 9th grade Algebra I classes. The research employed an interrupted time series design, comparing student achievement scores under MathForward with scores from the three years prior to the program’s introduction.

The results of this four-year study suggest that 7th grade students scored, on average, 11 percentile points higher with MathForward than 7th grade students from the three previous years without it. A similar result for 8th grade suggests that students participating in MathForward scored, on average, 9 percentile points higher. While the trend did not hold for 9th grade, further exploration suggests that 9th grade students whose teachers had three years of experience using MathForward scored higher than 9th grade students whose teachers did not use the program.

The report further illustrates how an interrupted time series design can be used to study a program as it is rolled out over several years. This research will be presented at the 2010 AERA conference in Denver, Colorado (Friday, April 30 through Tuesday, May 4).
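
For readers curious about the mechanics, below is a minimal sketch of the segmented regression commonly used to analyze an interrupted time series. It is not the model from the report: the data values are invented, and the column names (year, post, score) and the use of Python’s statsmodels library are our own assumptions for illustration.

```python
# Minimal segmented-regression sketch for an interrupted time series.
# Uses a small, made-up table of cohort-year average scores: three years
# before the program was introduced and two years after.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "year":  [2004, 2005, 2006, 2007, 2008],
    "post":  [0, 0, 0, 1, 1],          # 1 = year with the program in place
    "score": [48.2, 49.1, 48.7, 55.0, 56.3],
})

# Center time on the last pre-implementation year and count years since
# implementation (0 in all pre-implementation years).
df["time"] = df["year"] - 2006
df["time_since"] = (df["year"] - 2006).clip(lower=0)

# `post` estimates the level shift when the program starts; `time_since`
# estimates any change in trend after implementation.
model = smf.ols("score ~ time + post + time_since", data=df).fit()
print(model.params)
```

A real analysis would use many more observations (for example, student-level scores across multiple cohorts) and adjust for covariates, but the logic of estimating a level shift at the point of implementation is the same.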

2009-11-13

Empirical Education assists the Instructional Research Group (IRG)

Starting this winter, Empirical Education will assist the Instructional Research Group (IRG) with its project, Teacher Quality Study: An Investigation of the Impacts of Teacher Study Groups as a Means to Enhance the Quality of Reading Instruction for First Graders in High Poverty Schools in Two States. The project, which focuses on first-grade reading instruction in high poverty schools with large ethnic minority populations, will be funded by the U.S. Department of Education. In this partnership, Empirical Education will obtain student demographic data from IRG; survey teachers; enter, check, and warehouse data; and provide data files and report summaries to IRG.

2009-11-01

i3 Request for Proposals Calls for New Approaches to Rigorous Evaluation

In the strongest indication yet that the new administration is serious about learning from its multi-billion-dollar experience, the draft notice for the Investing in Innovation (i3) grants sets out new requirements for research and evaluation. While it is not surprising that the U.S. Department of Education requires scientific evidence for programs asking for funds for expansion and scaling up, it is important to note that strong evidence is now being defined not just in terms of rigorous methods but also in terms of “studies that in total include enough of the range of participants and settings to support scaling up to the State, regional, or national level.” This requirement for generalizability is a major step toward sponsoring research that has value for practical decisions. Along the same lines, high quality evaluations are defined as those that include implementation data and performance feedback.

The draft notice also includes recognition of an important research design: the “interrupted time series.” While not acceptable under the current What Works Clearinghouse criteria, this method, which essentially looks for a change in a series of measures taken before and after a new program is implemented, has enormous practical application for school systems with solid longitudinal data systems.

Finally, we note that ED is requiring all evaluators to cooperate with broader national efforts to combine evidence from multiple sources, and that it will provide technical assistance to evaluators to ensure consistency among researchers. ED wants to be sure that, at the end of the process, it has useful evidence about what worked, what didn’t, and why.

2009-10-26

Growth at Empirical Education is More than Just Statistical

We are growing! Empirical Education is excited to announce a recent expansion of our company. Over the summer, we had the pleasure of adding five new members to our team, bringing our total to 30 employees. The addition of four members to our research department is a vital step toward beginning work on our new projects during the 2009-2010 school year. The statistician joining our analysis team will work closely with the engineering team to help streamline our MeasureResults processes.

In addition to enriching our staff, we are also increasing our space. We acquired a third office unit in our Palo Alto building this summer and are busily readying it for move-in. Meanwhile, we remain quite cozy in our two current offices.

We view this expansion as a glimpse of what the future will hold for our company, and we look forward to introducing you to some of our new faces. Feel free to stop by our office anytime for a guided tour.

2009-09-01

Poway District Using MeasureResults

With the new school year approaching, we are excited to announce a partnership with MeasureResults’ newest user, Poway Unified School District (PUSD). With the help of MeasureResults, PUSD educators will design and conduct their own program evaluations while outsourcing the analytics and reporting functions to MeasureResults’ automated analysis engine. Planned study designs include an “interrupted time series,” which compares current achievement levels to levels from several years prior to the introduction of the program under evaluation. Plans also include a comparison group study, which, by matching classrooms that are using the program with similar classrooms that are not, can estimate the difference the new program has made. Special analyses will determine whether the program benefits some subgroups of students (e.g., English language learners) more than others. We anticipate that PUSD’s valuable product feedback and input will enable us to make ongoing improvements to MeasureResults’ functionality and usability.
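
As a rough illustration of the matching idea behind a comparison group study, the sketch below pairs each program classroom with the non-program classroom whose prior achievement is closest and averages the pairwise differences. This is not the procedure used by MeasureResults: the column names, the single matching variable, and the toy numbers are assumptions made purely for illustration.

```python
# A minimal sketch of nearest-neighbor matching on one covariate.
# Each program classroom is paired with the comparison classroom whose
# prior-year achievement is closest; the effect estimate is the mean of
# the matched-pair differences in post-program achievement.
import pandas as pd

classes = pd.DataFrame({
    "class_id":     [1, 2, 3, 4, 5, 6],
    "uses_program": [True, True, True, False, False, False],
    "prior_score":  [51.0, 47.5, 60.2, 50.3, 48.0, 59.1],
    "post_score":   [56.0, 50.1, 63.0, 52.2, 47.9, 60.0],
})

program = classes[classes["uses_program"]]
comparison = classes[~classes["uses_program"]]

effects = []
for _, row in program.iterrows():
    # Position of the comparison classroom with the closest prior score.
    nearest = (comparison["prior_score"] - row["prior_score"]).abs().argmin()
    match = comparison.iloc[nearest]
    effects.append(row["post_score"] - match["post_score"])

print("Estimated effect (mean matched-pair difference):", sum(effects) / len(effects))
```

In practice, matching would typically draw on several covariates (or a propensity score), and the difference would be estimated with a statistical model rather than a simple mean of pair differences.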

2009-08-17

Empirical Education Lends Expertise to the Software Industry

Empirical Education continued its active involvement as an associate member of the Software & Information Industry Association (SIIA), participating in its 9th annual Ed Tech Industry Summit and helping to draft a set of guidelines for provider-sponsored research on educational technology products and services.

At SIIA’s San Francisco conference, held May 3-5, Dr. Denis Newman discussed Empirical Education’s experience in state and local experimental evaluations as a panelist in a session on technology-based assessment tools entitled “Harvesting Information from Assessment to Support the Learner and the Educator.” Other panelists, representing SRI and CTB/McGraw Hill, brought their expertise in testing to a discussion of the differences between formative and summative assessments and how data from such assessments are used to inform decision-making at various levels: teacher, school, district, and state. Dr. Newman’s remarks drew the discussion toward technology trends such as longitudinal data systems, growth scales, and value-added analysis, and the need to apply statistical processes that are not yet commonly part of school data systems.

For the past year, SIIA has also supported a working group of members with interest and expertise in research and evaluation. Dr. Newman was asked to serve as co-chair and has been working with the other group members to complete a set of guidelines for designing scientific evaluations, given the particular characteristics of technology interventions. The company has worked with many SIIA members, assisting them in working with school systems to evaluate their products and services.

2009-06-10

New Directions for Research Discussed at Institute of Education Sciences Conference

The Fourth Annual Institute of Education Sciences (IES) Research Conference convened June 7-9 with an air of anticipation about new directions as John Q. Easton began his term as director. Formerly executive director of the Consortium on Chicago School Research, he brings a new perspective to IES. While Dr. Easton is quoted in Education Week as saying he will retain the rigor that IES has made a priority, the Consortium’s work points to the importance of building local capacity for research to support reform. In a paper published online, he and his colleagues provide a clear and detailed rationale for their approach, which includes the need to combine high quality research with the ability to cut through technical details to communicate both good and bad news to local decision makers.

Three Empirical Education staff members furthered this agenda of building capacity for local school and district evaluations in poster presentations at the conference. Dr. Robert Smith, the company’s vice president of engineering, outlined the company’s progress on MeasureResults™, a web-based evaluation solution for schools and districts. (Funding for the development of MeasureResults is from an IES Small Business Innovation Research grant.)

Dr. Denis Newman, the company’s president, and Andrew P. Jaciw, director of experimental design and analysis, presented their findings on the process of developing low cost, timely, and locally relevant experiments (funded by an IES research grant). Development of MeasureResults continues as the Empirical Education team applies the knowledge gained from this project, entitled “Low Cost Experiments to Support Local School District Decisions,” which guides the team in developing decision makers’ understanding of, and building local capacity for, conducting evaluation research.

2009-06-09

Empirical Education to Design a Multi-Year Evaluation of New GreatSchools Initiative

GreatSchools, a nonprofit provider of web-based resources for parents, has contracted with Empirical Education to design a multi-year evaluation of its initiative to empower parents to participate in their children’s development and educational success.

The initiative is funded by the Gates Foundation, the Robertson Foundation, and the Walton Family Foundation. Bill Jackson, Founder and President of GreatSchools, wrote in his blog:

“More than a year ago, we began to consider: What more could we at GreatSchools do to improve education? How could we do more for our large audience of parents? And what could we do for low-income parents whose children face the steepest climb to college? Our answer: We should leverage the technology of our times to create a comprehensive parent-training program and support group that inspires and guides parents—especially low-income parents—to raise children who are ready for college.”

2009-05-13