Blog Posts and News Stories

Research: From NCLB to Obama’s Blueprint for ESEA

We can finally put “Scientifically Based Research” to rest. The term that appeared more than 100 times in NCLB appears zero times in the Obama administration’s Blueprint for Reform, the document outlining its approach to the reauthorization of ESEA. The term was always an awkward neologism, coined presumably to avoid simply saying “scientific research.” It also allowed NCLB to contain an explicit definition to be enforced—a definition stipulating not just any scientific activities, but research aimed at reaching causal conclusions about the effectiveness of some product, policy, or classroom procedure.

A side effect of the SBR focus has been the growth of a compliance mentality among both school systems and publishers. Schools needed some assurance that a product was backed by SBR before they would spend money, while textbooks were ranked in terms of the number of SBR-proven elements they contained.

Some have wondered if the scarcity of the word “research” in the new Blueprint might signal a retreat from scientific rigor and the use of research in educational decisions (see, for example, Debra Viadero’s blog). Although the approach is indeed different, the new focus makes a stronger case for research and extends its scope into decisions at all levels.

The Blueprint shifts the focus to effectiveness. The terms “effective” or “effectiveness” appear about 95 times in the document. “Evidence” appears 18 times. And the compliance mentality is specifically called out as something to eliminate.

“We will ask policymakers and educators at all levels to carefully analyze the impact of their policies, practices, and systems on student outcomes. … And across programs, we will focus less on compliance and more on enabling effective local strategies to flourish.” (p. 35)

Instead of the stiff definition of SBR, we now have a call to “policymakers and educators at all levels to carefully analyze the impact of their policies, practices, and systems on student outcomes.” Thus we have a new definition for what’s expected: carefully analyzing impact. The call does not go out to researchers per se, but to policymakers and educators at all levels. This is not a directive from the federal government to comply with the conclusions of scientists funded to conduct SBR. Instead, scientific research is everybody’s business now.

Carefully analyzing the impact of practices on student outcomes is scientific research. For example, conducting research carefully requires making sure the right comparisons are made. A study that is biased by comparing two groups with very different motivations or resources is not a careful analysis of impact. A study that simply compares the averages of two groups without any statistical calculations can mistakenly identify a difference when there is none, or vice versa. A study that takes no measure of how schools or teachers used a new practice—or that uses tests of student outcomes that don’t measure what is important—can’t be considered a careful analysis of impact. Building the capacity to use adequate study design and statistical analysis will have to be on the agenda of the ESEA if the Blueprint is followed.
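To make the point about statistical calculations concrete, here is a minimal sketch in Python using made-up scores and group sizes of our own choosing (nothing here comes from the Blueprint). It simulates two groups drawn from the same population and shows why reporting a raw difference in averages, without a significance test, is not yet a careful analysis of impact.

```python
# Minimal sketch: why a raw difference in group means is not, by itself,
# a careful analysis of impact. Group sizes and score scale are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulate test scores for two groups drawn from the SAME population,
# i.e., the "program" has no true effect.
program_group = rng.normal(loc=500, scale=40, size=30)
comparison_group = rng.normal(loc=500, scale=40, size=30)

raw_difference = program_group.mean() - comparison_group.mean()
print(f"Raw difference in means: {raw_difference:.1f} points")

# A two-sample t-test asks whether a difference this large could easily
# arise by chance, given the variability and the sample sizes.
t_stat, p_value = stats.ttest_ind(program_group, comparison_group)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# With 30 students per group and a 40-point standard deviation, raw
# differences of several points are common even with no real effect,
# so the raw comparison alone risks a false positive (or, when a small
# real effect exists, a false negative).
```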

Far from reducing the role of research in the U.S. education system, the Blueprint for ESEA actually advocates a radical expansion. The word “research” is used only a few times, and “science” is used only in the context of STEM education. Nonetheless, the call for widespread careful analysis of the evidence of effective practices that impact student achievement broadens the scope of research, turning all policymakers and educators into practitioners of science.

2010-03-17

Conference Season has Arrived

Springtime marks the start of “conference season,” and Empirical Education has been busy attending and preparing for the various meetings and events. We are participating in five conferences (CoSN, SIIA, SREE, NCES-MIS, and AERA), and we hope to see some familiar faces in our travels. If you will be attending any of the following meetings, please give us a call. We’d love to schedule a time to speak with you.

CoSN

The Empirical team headed to the 2010 Consortium for School Networking (CoSN) conference in Washington, DC, at the Omni Shoreham Hotel, February 28–March 3, 2010. We were joined by Eric Lehew, Executive Director of Learning Support Services at Poway Unified School District, who co-presented our poster, “Turning Existing Data into Research” (Monday, March 1, 1:00pm–2:00pm). As exhibitors, Empirical Education also hosted a 15-minute vendor demonstration, “Building Local Capacity: Using Your Own Data Systems to Easily Measure Program Effectiveness,” to launch our MeasureResults tool.

SIIA

The Software & Information Industry Association held its 2010 Ed Tech Government Forum in Washington, DC on March 3–4. The focus this year was “Education Funding & Programs in a (Post) Stimulus World,” and speakers included Secretary of Education Arne Duncan and West Virginia Superintendent of Schools Steven Paine.

SREE

Just as the SIIA Forum came to a close, the Society for Research on Educational Effectiveness held its annual conference, Research Into Practice, on March 4–6, where our chief scientist, Andrew Jaciw, and research scientist, Xiaohui Zheng, presented their poster on estimating long-term program impacts when the control group joins treatment in the short term. Dr. Jaciw was also named on a paper presentation with Rob Olsen of Abt Associates.

Thursday, March 4, 2010
3:30pm–5:00pm, Session 2E (Forum): Research Methodology: Examining State Assessments
Chair: Jane Hannaway, The Urban Institute
“Using State or Study-Administered Achievement Tests in Impact Evaluations”
Rob Olsen and Fatih Unlu (Abt Associates) and Andrew Jaciw (Empirical Education)

Friday, March 5, 2010
5:00pm–7:00pm, Poster Session: Research Methodology
“Estimating Long-Term Program Impacts When the Control Group Joins Treatment in the Short-Term: A Theoretical and Empirical Study of the Tradeoffs Between Extra- and Quasi-Experimental Estimates”
Andrew Jaciw, Boya Ma, and Qingfeng Zhao (Empirical Education)

NCES-MIS

The 23rd Annual Management Information Systems (MIS) Conference was held in Phoenix, Arizona, March 3–5. Co-sponsored by the Arizona Department of Education and the U.S. Department of Education’s National Center for Education Statistics (NCES), the MIS Conference brings together the people who work with information collection, management, transmittal, and reporting in school districts and state education agencies. The majority of the sessions focused on data use, data standards, statewide data systems, and data quality. For more information, refer to the program highlights.

AERA

We will have a strong showing at the American Educational Research Association annual conference in Denver, Colorado from Friday, April 30 through Tuesday, May 4. Please come talk to us at our poster and paper sessions. View our AERA presentation schedule to find out which of our presentations you would like to attend. And we hope to see you at our customary stylish reception Sunday evening, May 2 from 6 to 8:30—mark your calendars!

IES

We will be presenting at the IES Research Conference in National Harbor, MD, June 28–30. View our poster here.

2010-03-12

MeasureResults® to be Launched at CoSN 2010

Empirical Education will launch its web-based educational research solution, MeasureResults, on March 1 at the Consortium for School Networking conference in Washington, DC. MeasureResults is a suite of online tools that makes rigorous research designs and statistical processes accessible to school systems and educational publishers who want to evaluate the effectiveness of products and services aimed at improving student performance.

“MeasureResults will change the way that school districts and product developers conduct rigorous evaluations,” said Denis Newman, Empirical Education President. “Instead of hiring outside evaluators or onsite research experts or statisticians, MeasureResults allows school district personnel to design a study, collect data, and review reports in our user-friendly online platform.”

MeasureResults grew out of a federally funded research project to develop a low-cost method for schools to conduct their own research. The product was developed for commercial distribution under a Small Business Innovation Research grant from the U.S. Department of Education. By moving the educational research processes online, MeasureResults makes school-run evaluations more efficient and less expensive.

2010-02-23

Stimulating Innovation and Evidence

After a massive infusion of stimulus money into K-12 technology through the Title IID “Enhancing Education Through Technology” (EETT) grants, also known as “ed-tech” grants, the administration is planning to cut funding for the program in future budgets.

Well, they’re not exactly “cutting” funding for technology, but consolidating the dedicated technology funding stream into a larger enterprise, awkwardly named the “Effective Teaching and Learning for a Complete Education” program. For advocates of educational technology, here’s why this may not be so much a blow as a challenge and an opportunity.

Consider the approach stated in the White House “fact sheet”:

“The Department of Education funds dozens of programs that narrowly limit what states, districts, and schools can do with funds. Some of these programs have little evidence of success, while others are demonstrably failing to improve student achievement. The President’s Budget eliminates six discretionary programs and consolidates 38 K-12 programs into 11 new programs that emphasize using competition to allocate funds, giving communities more choices around activities, and using rigorous evidence to fund what works…Finally, the Budget dedicates funds for the rigorous evaluation of education programs so that we can scale up what works and eliminate what does not.”

From this, technology advocates might worry that policy is being guided by the findings of “no discernible impact” from a number of federally funded technology evaluations (including the evaluation mandated by the EETT legislation itself).

But this is not the case. The White House declares, “The President strongly believes that technology, when used creatively and effectively, can transform education and training in the same way that it has transformed the private sector.”

The administration is not moving away from the use of computers, electronic whiteboards, data systems, Internet connections, web resources, instructional software, and so on in education. Rather, the intention is that these tools are integrated, where appropriate and effective, into all of the other programs.

This does put technology funding on a very different footing. It is no longer in its own category. Where school administrators are considering funding from the “Effective Teaching and Learning for a Complete Education” program, they may place a technology option up against an approach to lower class size, a professional development program, or other innovations that may integrate technologies as a small piece of an overall intervention. Districts would no longer write proposals to EETT to obtain financial support to invest in technology solutions. Technology vendors will increasingly be competing for the attention of school district decision-makers on the basis of the comparative effectiveness of their solution—not just in comparison to other technologies but in comparison to other innovative solutions. The administration has clearly signaled that innovative and effective technologies will be looked upon favorably. It has also signaled that effectiveness is the key criterion.

As an Empirical Education team prepares for a visit to Washington, DC, for the Consortium for School Networking conference and the Software & Information Industry Association’s EdTech Government Forum (we are active members of both organizations), we have to consider our message to education technology vendors and school system technology advocates. (Coincidentally, we will also be presenting research at the annual conference of the Society for Research on Educational Effectiveness, held in DC that same week.) As a research company we are constrained from taking an advocacy role; in principle, we have to maintain that the effectiveness of any intervention is an empirical question. But we do see the infusion of short-term stimulus funding into educational technology through the EETT program as an opportunity for schools and publishers. Working jointly to gather evidence from the technologies put in place this year and next will put schools and publishers in a strong position to advocate for continued investment in the technologies that prove effective.

While it may have seemed so in 1993, when the U.S. Department of Education’s Office of Educational Technology was first established, technology can no longer be considered inherently innovative. The proposed federal budget is asking educators and developers to innovate to find effective technology applications. The stimulus package is giving the short-term impetus to get the evidence in place.

2010-02-14

Poway Completes Study from MeasureResults Pilot

The results are in for Poway Unified School District’s first research study using our MeasureResults online tool. PUSD was interested in measuring the impact of CompassLearning’s Odyssey Reading program in the middle grades. Using an “interrupted time series” design with a comparison group, they found that both 7th and 8th grade students averaged 1 to 2 points higher than expected on the NWEA MAP Literacy assessment. PUSD plans to continue their evaluation of CompassLearning Odyssey in different subject areas and grade levels. Join us in D.C. at this year’s CoSN conference on March 1, 2010, as Eric Lehew, Executive Director of Learning Support Services at PUSD, presents findings and reflections on the process of using MeasureResults to conduct research at the local district level.

Click here to download a copy of the PUSD achievement report.

2010-02-12

Empirical Education Appoints Chief Scientist

We are pleased to announce the appointment of Andrew Jaciw, Ph.D. as Empirical Education’s Chief Scientist. Since joining the company more than five years ago, Dr. Jaciw has guided and shaped our analytical and research design practices, infusing our experimental methodologies with the intellectual traditions of both Cronbach and Campbell. As Chief Scientist, he will continue to lead Empirical’s team of scientists, setting direction for our MeasureResults evaluation and analysis processes as well as for basic research into widely applicable methodologies. Andrew received his Ph.D. in Education from Stanford University.

2010-02-05

Webinar: Uncovering ARRA’s Research Requirements

Researchers at Empirical Education provided a detailed overview of the various research themes and requirements of the ARRA stimulus initiatives with state department of education officials during their December 9 webinar entitled, “Meet Stimulus Funds’ Research Requirements with Confidence.” The webinar gave specific examples of how states may start planning their applications and building research partnerships, as well as an overview of the ED’s current thinking about building local research capacity. The initiatives that were discussed included Race to the Top, Enhancing Education Through Technology, Investing in Innovation, Title I School Improvement Grants, and State Longitudinal Data Systems.

A follow-up webinar, broadcast on January 20, 2010, walked through a specific example of a program evaluation design that districts can use with existing data. Stay tuned for future webinars on other alternative experimental research designs.

2010-01-22

Uncovering ARRA’s Research Requirements

Researchers at Empirical Education provided a detailed overview of the various research themes and requirements of the ARRA stimulus initiatives with state department of education officials during their December 9 webinar entitled, “Meet Stimulus Funds’ Research Requirements with Confidence.” The webinar gave specific examples of how states may start planning their applications and building research partnerships, as well as an overview of the ED’s current thinking about building local research capacity. The initiatives that were discussed included Race to the Top, Enhancing Education Through Technology, Investing in Innovation, Title I School Improvement Grants, and State Longitudinal Data Systems.

2009-12-17

Four Presentations Accepted for AERA 2010

Empirical Education will be heading to the snow-capped mountains of Denver, Colorado, next April! Once again, Empirical will have a strong showing at the 2010 American Educational Research Association conference, which will be held at the Colorado Convention Center April 30–May 4, 2010. Our presentations will span several divisions, including Learning & Instruction; Measurement & Research Methodology; and Research, Evaluation, & Assessment in Schools. Research topics will include:

  • Examining the Efficacy of a Sight-Word Reading Program for Students with Significant Cognitive Disabilities: Phase 2

  • Matched Pairs, ICCs and R-squared: Lessons from Several Effectiveness Trials in Education

  • Addressing Challenges of Within School Randomization

  • Measuring the Impact of a Math Program As It Is Rolled Out Over Several Years

In line with our past two years of successful AERA receptions, Empirical Education plans to host another “meet and greet” at this year’s conference. Join our mailing list to receive the details.

2009-12-01

Report completed on the effectiveness of MathForward

Empirical Education assisted Richardson Independent School District (RISD) in conducting an evaluation of the MathForward algebra readiness program published by Texas Instruments. RISD implemented MathForward in their 7th and 8th grade general mathematics and 9th grade Algebra I classes. The research employed an interrupted time series design, comparing student achievement scores after the introduction of MathForward to achievement scores from the three years prior.

The results of this four-year study suggest that 7th grade students scored, on average, 11 percentile points higher with MathForward than 7th grade students from the three previous years without the program. A similar result for 8th grade suggests that students participating in MathForward scored, on average, 9 percentile points higher. While the trend did not hold for 9th grade, further exploration suggests that 9th grade students whose teachers had three years of experience using MathForward scored higher than 9th grade students whose teachers did not use MathForward.

The report further illustrates how an interrupted time series design can be used to study a program as it is rolled out over several years. This research will be presented at the 2010 AERA conference in Denver, Colorado (Friday, April 30 through Tuesday, May 4).
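For readers unfamiliar with the approach, the sketch below shows, in Python, what a basic interrupted time series (segmented regression) analysis can look like. The yearly averages, variable names, and single-series structure are illustrative assumptions only; they do not reproduce the RISD data or the actual analysis in the report.

```python
# Hypothetical sketch of an interrupted time series analysis; the data,
# variable names, and simple annual structure are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

# Mean scale scores by year (made-up numbers). The program is introduced
# in year 4, so years 1-3 form the pre-intervention baseline.
data = pd.DataFrame({
    "year":       [1, 2, 3, 4, 5, 6, 7],
    "mean_score": [642, 645, 647, 661, 663, 666, 668],
    "post":       [0, 0, 0, 1, 1, 1, 1],  # 1 = after the program begins
})
start_year = data.loc[data["post"] == 1, "year"].min()
data["years_since_start"] = (data["year"] - start_year).clip(lower=0)

# Segmented regression: a pre-existing trend, a level shift at the
# intervention, and a possible change in slope afterward.
model = smf.ols("mean_score ~ year + post + years_since_start", data=data).fit()
print(model.summary())

# The coefficient on `post` estimates the immediate jump above the
# pre-intervention trend; `years_since_start` estimates any change in the
# yearly trend once the program is in place.
```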

2009-11-13