Blog Posts and News Stories

Stimulating Innovation and Evidence

After a massive infusion of stimulus money into K-12 technology through the Title IID “Enhancing Education Through Technology” (EETT) grants, also known as “ed-tech” grants, the administration is planning to cut funding for the program in future budgets.

Well, they’re not exactly “cutting” funding for technology, but consolidating the dedicated technology funding stream into a larger enterprise, awkwardly named the “Effective Teaching and Learning for a Complete Education” program. For advocates of educational technology, here’s why this may not be so much a blow as a challenge and an opportunity.

Consider the approach stated in the White House “fact sheet”:

“The Department of Education funds dozens of programs that narrowly limit what states, districts, and schools can do with funds. Some of these programs have little evidence of success, while others are demonstrably failing to improve student achievement. The President’s Budget eliminates six discretionary programs and consolidates 38 K-12 programs into 11 new programs that emphasize using competition to allocate funds, giving communities more choices around activities, and using rigorous evidence to fund what works…Finally, the Budget dedicates funds for the rigorous evaluation of education programs so that we can scale up what works and eliminate what does not.”

From this, technology advocates might worry that policy is being guided by the findings of “no discernible impact” from a number of federally funded technology evaluations (including the evaluation mandated by the EETT legislation itself).

But this is not the case. The White House declares, “The President strongly believes that technology, when used creatively and effectively, can transform education and training in the same way that it has transformed the private sector.”

The administration is not moving away from the use of computers, electronic whiteboards, data systems, Internet connections, web resources, instructional software, and so on in education. Rather, the intention is that these tools be integrated, where appropriate and effective, into all of the other programs.

This does put technology funding on a very different footing. It is no longer in its own category. Where school administrators are considering funding from the “Effective Teaching and Learning for a Complete Education” program, they may place a technology option up against an approach to lower class size, a professional development program, or other innovations that may integrate technologies as a small piece of an overall intervention. Districts would no longer write proposals to EETT to obtain financial support to invest in technology solutions. Technology vendors will increasingly be competing for the attention of school district decision-makers on the basis of the comparative effectiveness of their solution—not just in comparison to other technologies but in comparison to other innovative solutions. The administration has clearly signaled that innovative and effective technologies will be looked upon favorably. It has also signaled that effectiveness is the key criterion.

As an Empirical Education team prepares for a visit to Washington, DC, for the conference of the Consortium for School Networking and the Software and Information Industry Association’s EdTech Government Forum (we are active members of both organizations), we have to consider our message to education technology vendors and school system technology advocates. (Coincidentally, we will also be presenting research at the annual conference of the Society for Research on Educational Effectiveness, also held in DC that week.) As a research company we are constrained from taking an advocacy role; in principle we have to maintain that the effectiveness of any intervention is an empirical issue. But we do see the infusion of short-term stimulus funding into educational technology through the EETT program as an opportunity for schools and publishers. Working jointly to gather the evidence from the technologies put in place this year and next will put schools and publishers in a strong position to advocate for continued investment in the technologies that prove effective.

While it may have seemed so in 1993, when the U.S. Department of Education’s Office of Educational Technology was first established, technology can no longer be considered inherently innovative. The proposed federal budget is asking educators and developers to innovate to find effective technology applications. The stimulus package is giving the short-term impetus to get the evidence in place.

2010-02-14

Poway Completes Study from MeasureResults Pilot

The results are in for Poway Unified School District’s first research study using our MeasureResults online tool. PUSD was interested in measuring the impact of CompassLearning’s Odyssey Reading program in the middle grades. Using an “interrupted time series” design with a comparison group, they found that both 7th and 8th grade students averaged 1 to 2 points higher than expected on the NWEA MAP Literacy assessment. PUSD plans to continue their evaluation of CompassLearning Odyssey in different subject areas and grade levels. Join us in D.C. at this year’s CoSN conference on March 1, 2010, as Eric Lehew, Executive Director of Learning Support Services at PUSD, presents findings and reflections on the process of using MeasureResults to conduct research at the local district level.

Click here to download a copy of the PUSD achievement report.

2010-02-12

Empirical Education Appoints Chief Scientist

We are pleased to announce the appointment of Andrew Jaciw, Ph.D. as Empirical Education’s Chief Scientist. Since joining the company more than five years ago, Dr. Jaciw has guided and shaped our analytical and research design practices, infusing our experimental methodologies with the intellectual traditions of both Cronbach and Campbell. As Chief Scientist, he will continue to lead Empirical’s team of scientists, setting direction for our MeasureResults evaluation and analysis processes as well as basic research into widely applicable methodologies. Andrew received his Ph.D. in Education from Stanford University.

2010-02-05

Webinar: Uncovering ARRA’s Research Requirements

Researchers at Empirical Education provided state department of education officials with a detailed overview of the various research themes and requirements of the ARRA stimulus initiatives during their December 9 webinar entitled “Meet Stimulus Funds’ Research Requirements with Confidence.” The webinar gave specific examples of how states may start planning their applications and building research partnerships, as well as an overview of ED’s current thinking about building local research capacity. The initiatives discussed included Race to the Top, Enhancing Education Through Technology, Investing in Innovation, Title I School Improvement Grants, and State Longitudinal Data Systems.

A follow-up webinar was broadcast on January 20, 2010; it outlined a specific example of a program evaluation design that districts can use with existing data. The presentation can be viewed below. Stay tuned for future webinars covering additional alternative experimental research designs.

2010-01-22

Uncovering ARRA’s Research Requirements

Researchers at Empirical Education provided state department of education officials with a detailed overview of the various research themes and requirements of the ARRA stimulus initiatives during their December 9 webinar entitled “Meet Stimulus Funds’ Research Requirements with Confidence.” The webinar gave specific examples of how states may start planning their applications and building research partnerships, as well as an overview of ED’s current thinking about building local research capacity. The initiatives discussed included Race to the Top, Enhancing Education Through Technology, Investing in Innovation, Title I School Improvement Grants, and State Longitudinal Data Systems.

2009-12-17

Four Presentations Accepted for AERA 2010

Empirical Education will be heading to the snow-capped mountains of Denver, Colorado next April! Once again, Empirical will have a strong showing at the 2010 American Educational Research Association conference, which will be held at the Colorado Convention Center from April 30 to May 4, 2010. Our presentations will span several divisions, including Learning & Instruction; Measurement & Research Methodology; and Research, Evaluation, & Assessment in Schools. Research topics will include:

  • Examining the Efficacy of a Sight-Word Reading Program for Students with Significant Cognitive Disabilities: Phase 2

  • Matched Pairs, ICCs and R-squared: Lessons from Several Effectiveness Trials in Education

  • Addressing Challenges of Within School Randomization

  • Measuring the Impact of a Math Program As It Is Rolled Out Over Several Years

In line with our past two years of successful AERA receptions, Empirical Education plans to host another “meet and greet” at this year’s conference. Join our mailing list to receive the details.

2009-12-01

Report completed on the effectiveness of MathForward

Empirical Education assisted Richardson Independent School District (RISD) in conducting an evaluation of the MathForward algebra readiness program published by Texas Instruments. RISD implemented MathForward in their 7th and 8th grade general mathematics and 9th grade Algebra I classes. The research employed an interrupted time series design comparing existing student achievement scores with MathForward to student achievement scores from the three years prior to the introduction of MathForward.

The results of this four-year study suggest that 7th grade students scored, on average, 11 percentile points higher with MathForward than the 7th grade students from the three previous years without MathForward. A similar result for the 8th grade suggests that students participating in MathForward scored, on average, 9 percentile points higher. While the trend did not hold for 9th grade, further exploration suggests that 9th grade students whose teachers had three years of experience using MathForward scored higher than 9th grade students whose teachers did not use MathForward.

The report further illustrates how an interrupted time series design can be used to study a program as it is rolled out over several years. This research will be presented at the 2010 AERA conference in Denver, Colorado (Friday, April 30 to Tuesday, May 4).

2009-11-13

Empirical Education assists the Instructional Research Group (IRG)

Starting this winter, Empirical Education will assist the Instructional Research Group (IRG) with their project, Teacher Quality Study: An Investigation of the Impacts of Teacher Study Groups as a Means to Enhance the Quality of Reading Instruction for First Graders in High Poverty Schools in Two States. The project, focusing on first-grade reading instruction in high poverty schools with large ethnic minority populations, will be funded by the U.S. Department of Education. The role Empirical Education will play in this partnership is to obtain student demographic data from IRG; to survey teachers; to enter, check, and warehouse data; and to provide data files and report summaries to IRG.

2009-11-01

i3 Request for Proposals Calls for New Approaches to Rigorous Evaluation

In the strongest indication yet that the new administration is serious about learning from its multi-billion-dollar experience, the draft notice for the Investing in Innovation (i3) grants sets out new requirements for research and evaluation. While it is not surprising that the U.S. Department of Education requires scientific evidence for programs asking for funds for expansion and scaling up, it is important to note that strong evidence is now being defined not just in terms of rigorous methods but also in terms of “studies that in total include enough of the range of participants and settings to support scaling up to the State, regional, or national level.” This requirement for generalizability is a major step toward sponsoring research that has value for practical decisions. Along the same lines, high quality evaluations are those that include implementation data and performance feedback.

The draft notice also includes recognition of an important research design: “interrupted time series.” While not acceptable under the current What Works Clearinghouse criteria, this method—essentially looking for a change in a series of measures taken before and after implementing a new program—has enormous practical application for school systems with solid longitudinal data systems.
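The logic of an interrupted time series can be sketched as a segmented regression: fit the pre-implementation trend and estimate the level shift once the program is in place. The sketch below uses simulated data; all numbers and variable names are hypothetical, for illustration only, and real district analyses would involve many more periods, covariates, and attention to serial correlation.

```python
import numpy as np

# Hypothetical illustration of an interrupted time series analysis:
# a segmented regression on simulated annual test-score averages,
# with the program introduced at time t = 5.
rng = np.random.default_rng(0)
t = np.arange(10)                       # 10 yearly score averages
after = (t >= 5).astype(float)          # 1 once the program is in place
# Simulated truth: baseline 50, slow upward trend, 4-point level shift
scores = 50 + 0.5 * t + 4.0 * after + rng.normal(0, 0.5, t.size)

# Design matrix: intercept, underlying time trend, post-implementation indicator
X = np.column_stack([np.ones_like(t, dtype=float), t, after])
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)
intercept, trend, level_shift = coef
print(f"estimated level shift at implementation: {level_shift:.1f} points")
```

The estimated `level_shift` is the “interruption”: how far scores depart from what the pre-existing trend alone would have predicted after the program starts.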

Finally, we notice that ED is requiring all evaluators to cooperate with broader national efforts to combine evidence from multiple sources, and that it will provide technical assistance to evaluators to ensure consistency among researchers. ED wants to be sure that, at the end of the process, it has useful evidence about what worked, what didn’t, and why.

2009-10-26

Growth at Empirical Education is More than Just Statistical

We are growing! Empirical Education is excited to announce a recent expansion in our company. Over the summer, we have had the pleasure of adding five new members to our team, rounding out our number to 30 total employees. The addition of four members to our research department is a vital step in beginning work on our new projects taking place during the 2009-2010 school year. The statistician joining our analysis team will work closely with the engineering team to help streamline our MeasureResults processes.

In addition to enriching our staff, we are also increasing our space. We acquired a third office unit in our Palo Alto building this summer, and we are busily readying it for a move-in. Meanwhile, we are remaining quite cozy in our two current offices.

We view this expansion as a glimpse of what the future will hold for our company, and we look forward to introducing you to some of our new faces. Feel free to stop by our office anytime for a guided tour.

2009-09-01