Blog Posts and News Stories

IES Holds its First Workshop on How States and Districts Can Evaluate Their Interventions

Empirical Education staff members participated in an April 24 workshop in Washington sponsored by the U.S. Department of Education’s Institute of Education Sciences (IES). The goal of the event was to encourage “locally initiated impact evaluations.” The primary presenter, Mark Lipsey of Vanderbilt University, drew on his extensive experience with rigorous evaluations to illustrate and explain the how and why of local evaluations. David Holdzkom of Wake County (NC) provided the perspective of a district whose staff has been conducting rigorous research on its own programs. Empirical Education assisted IES in preparing for the workshop by helping to recruit school district participants and presenters. Of the approximately 150 participants, 40% represented state and local education agencies; the other 60% came from universities, colleges, private agencies, R&D groups, and research companies. An underlying rationale for the workshop was the release of an RFP on this topic. Empirical Education expects to work with a number of school districts on their responses to this solicitation, which are due October 2, 2008.


Two-Year Study on Effectiveness of Graphing Calculators Released

Results are in from a two-year randomized controlled trial of the effect of graphing calculators on algebra and geometry achievement. Two reports are now available for this project, which was sponsored by Texas Instruments. In the first year, we contrasted business as usual in the math classes of two California school districts with classes equipped with sets of graphing calculators and led by teachers trained in their use. In the second year, we contrasted calculator-only classrooms with those also equipped with a calculator-based wireless networking system.

The project tracked achievement through state and other standardized test scores, and implementation through surveys and observations. For the most part, the experiment could not discern an impact of providing the equipment and teacher training. Data from the surveys and observations make clear that the technology was not used extensively (and by some teachers, not at all), suggesting that training, usability, and alignment issues must be addressed when adopting this kind of program. There were modest effects, especially for geometry, but these were often not found consistently across the two measurement scales. In one case, contradictory results for the two school districts suggest that researchers should use caution in combining data from different settings.


At NCES Conference, Empirical Education Explains a Difficulty in Linking Student and Teacher Records

The San Francisco Bay Area, our “home town,” was the site of the 2008 National Center for Education Statistics (NCES) conference. From February 25 to 29, educators and researchers from all over the country came to discuss data collection and analysis. Laurel Sterling and Robert Smith of Empirical Education presented “Tracking Teachers of Instruction for Data Accuracy and Improving Educational Outcomes.” Their topic was the need to differentiate between the teachers who actually provide instruction to students and the teachers with whom those students are officially registered. They explained that in our research we keep track of the “teacher of instruction” as distinct from the “teacher of registration.” Without this distinction, we are unable to properly identify the student clusters or associate student growth with the right teacher. Sharing instructional responsibilities within a grade-level team is common enough to be of concern: in a large experiment involving teachers in grades 4 through 8, 17% reported teaching students who were not assigned to them on the official class roster. The audience was lively and contributed to the topic during the question period. One district IT manager indicated that there is movement in this direction even at the state level. For a copy of the presentation, send an email to Robin Means.
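The record-keeping distinction can be sketched in a few lines of code. The records, teacher names, and numbers below are invented for illustration and are not drawn from the study:

```python
# Illustrative sketch (invented records): distinguishing the "teacher of
# registration" (official roster) from the "teacher of instruction" (who
# actually teaches the student) when clustering students by teacher.
from collections import defaultdict

# (student_id, teacher_of_registration, teacher_of_instruction)
records = [
    ("s1", "t_smith", "t_smith"),
    ("s2", "t_smith", "t_jones"),   # taught by a grade-level teammate
    ("s3", "t_jones", "t_jones"),
    ("s4", "t_jones", "t_smith"),   # taught by a grade-level teammate
]

def clusters(records, field):
    """Group student IDs under the teacher named in the chosen field."""
    index = {"registration": 1, "instruction": 2}[field]
    groups = defaultdict(list)
    for rec in records:
        groups[rec[index]].append(rec[0])
    return {teacher: sorted(students) for teacher, students in groups.items()}

by_roster = clusters(records, "registration")
by_instruction = clusters(records, "instruction")

# Fraction of students whose roster teacher is not their actual instructor
mismatch = sum(r[1] != r[2] for r in records) / len(records)
print(by_roster)       # {'t_smith': ['s1', 's2'], 't_jones': ['s3', 's4']}
print(by_instruction)  # {'t_smith': ['s1', 's4'], 't_jones': ['s2', 's3']}
print(mismatch)        # 0.5
```

In this toy example, half the students would be attributed to the wrong teacher if only the roster were used, so any teacher-level analysis of student growth would be built on the wrong clusters.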


Empirical Education Joins the What Works Clearinghouse Research Team

Mathematica Policy Research, Inc. has subcontracted with Empirical Education to serve as one of the research partners on the new What Works Clearinghouse (WWC) team. This week, Empirical’s research staff joined a seminar to talk through the latest policies and criteria for judging the quality and rigor of effectiveness research.

Last summer, the Department of Education granted leadership of the WWC (formerly led by AIR) to Mathematica, which put together a team consisting of Empirical, RAND, SRI, and a number of other research organizations. This round of work is expected to place a greater emphasis on outreach to schools, industry, and other stakeholders.


Maui Schools Sign Subscription Agreement for Empirical Education Research Services

Empirical Education will be providing research services to the Maui School District through an innovative subscription arrangement for MeasureResults™, an Internet-based research and consulting offering. The initial application of this service will be investigations of the longer-term impact of the Cognitive Tutor program that was implemented under a Math and Science Partnership program grant.

The company’s MeasureResults service is a response to the ever-increasing demands on school systems to base their program and spending decisions on the analysis of solid data. Most districts do not have the staff and facilities to organize the data and run complex statistical analyses. In Maui, the service will take advantage of the sophisticated data warehousing capabilities being put in place statewide. MeasureResults is designed to simplify the technical and logistical steps of conducting experiments by building powerful, verified analytical techniques into an uncomplicated framework. The offering includes consultative services on research design and web-based interfaces to gather data, automate analysis, and generate reports. MeasureResults is bundled with technical support that includes review of all analyses and reports by trained statisticians.


Empirical at 2008 AERA Conference

Acceptance notifications are in! Empirical Education will have a strong showing at the 2008 American Educational Research Association conference, which will be held in New York on March 24–28, 2008. Empirical staff will be presenting their research and findings in several divisions and special interest groups, including School Evaluation and Program Development, Learning and Instruction, Survey Research in Education, and Measurement and Research Methodology: Quantitative Methods and Statistical Theory. Our six presentations will cover a variety of topics.

View pictures of Empirical Education’s reception at The Chocolate Bar in NYC.


Empirical Education Pilots Workshop at REL-NEI Regional Meeting

Providence, RI, was the site of REL Northeast and Islands’ 2007 Regional Meeting on Teacher Quality. Empirical Education staff headed to the East Coast to pilot the “Becoming Good Consumers of Research” workshop for an audience of about 30 education researchers, school administrators, university professors, and other education professionals interested in using research to inform school decisions. Gloria Miller, the company’s director of evaluation design, facilitated group discussions that touched on identifying different types of research and potential sources of bias. Gloria also provided participants with a Critical Reader’s toolkit designed to help readers evaluate the trustworthiness and relevance of various pieces of research. “Consumers of Research” is the first in a planned series of workshops focused on increasing the understanding and use of research in schools. The next round of workshops, scheduled for March 2008, is currently being developed.


Report Released on the Effectiveness of Carnegie Learning’s Bridge to Algebra

Empirical Education released the results of a year-long randomized experiment on the comparative effectiveness of Carnegie Learning’s Bridge to Algebra program. This study, like the one on Carnegie Learning’s Cognitive Tutor for Algebra (see news for May 24, 2007), was conducted in cooperation with the Maui School District in Hawai’i and funded through a grant to Empirical Education from the U.S. Department of Education. The experiment could not discern an overall difference between the program and control groups as measured by either NWEA’s test for general math or the test’s algebraic operations subscale. However, for the algebraic operations outcome (but not for general math), we found that students scoring low before participating in Bridge to Algebra benefited significantly more from the program’s algebraic operations instruction than did students with high initial scores. The district was specifically interested in how the different ethnic groups, particularly the Hawaiian/Part-Hawaiian and Filipino students, performed in the new program. Controlling for pretest, we did not find that the program had a different effect for different ethnicities. The district was also interested in learning whether the program was differentially effective for students taught by certified teachers versus those taught by non-certified teachers. For the overall score (but not the algebraic operations subscale), we found that the program gave the non-certified teachers an advantage. Finally, despite some implementation challenges, teachers reported a generally positive attitude toward the new program.
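The differential effect by prior achievement described above is an example of a treatment-by-pretest interaction. As a minimal sketch of how such an interaction can be summarized, the invented scores below (not the study’s data) show a treatment advantage that is larger for the low-pretest subgroup:

```python
# Invented outcome scores illustrating a treatment-by-pretest interaction.
# These numbers are hypothetical and do not come from the Bridge to Algebra study.
from statistics import mean

# outcome scores keyed by (pretest level, condition)
scores = {
    ("low", "treatment"):  [48, 52, 50, 55],
    ("low", "control"):    [40, 42, 41, 43],
    ("high", "treatment"): [70, 72, 71, 69],
    ("high", "control"):   [68, 71, 70, 67],
}

def effect(level):
    """Treatment-minus-control mean difference within one pretest subgroup."""
    return mean(scores[(level, "treatment")]) - mean(scores[(level, "control")])

low_effect = effect("low")    # 51.25 - 41.5 = 9.75
high_effect = effect("high")  # 70.5 - 69.0 = 1.5

# The interaction is the difference between the subgroup effects: a large
# positive value means low-pretest students benefited more from the program.
interaction = low_effect - high_effect
print(low_effect, high_effect, interaction)  # 9.75 1.5 8.25
```

In practice such an interaction is tested within a regression model controlling for pretest, but the difference-in-differences of subgroup means conveys the same idea.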


REL West Calls on Empirical for Assistance with Experiment on New Economics Program

WestEd, which holds the Regional Educational Laboratory contract for the western region (REL West), has contracted with Empirical Education for the operations of a large randomized experiment involving over 75 high school economics teachers throughout California and Arizona. With a September 2007 start, the project has called for a very rapid start-up by Empirical Education staff, including delivering and processing approximately 40,000 student tests and surveys, acquiring data from over 50 school districts, and conducting web-based surveys with 84 teachers. The Problem-Based Economics program, developed by the Buck Institute for Education, is being tested in the context of a single-semester course. During the 2007–2008 school year, two cohorts of students will take the Test of Economic Literacy (TEL) as well as performance assessments and attitudinal surveys to test the impact of the new program compared to the materials in use in the classrooms of control teachers.


Learning Point and Empirical Partner for Research on Formative Assessment

Learning Point Associates, which holds the contract for the Midwest Regional Educational Laboratory, has contracted with Empirical Education for the operations of a large randomized experiment expected to include more than 100 elementary school teachers when in full swing in the fall of 2008. A pilot experiment, beginning this fall, involves a small number of schools in Illinois. The experiment will test the effectiveness of the Northwest Evaluation Association’s formative assessment and professional development, to be used in fourth- and fifth-grade classrooms. Working with principal investigators Matt Dawson of Learning Point Associates and David Cordray of Vanderbilt University, Empirical will be responsible for recruiting schools, acquiring and warehousing student data, and conducting observations and surveys.