Blog Posts and News Stories

Development Grant Awarded to Empirical Education

The U.S. Department of Education awarded Empirical Education a research grant to develop web-based software tools that support school administrators in conducting their own program evaluations. The two-and-a-half-year project was awarded through the Small Business Innovation Research (SBIR) program, administered and funded by the department’s Institute of Education Sciences (IES). The proposal received excellent reviews in this competitive program. One reviewer remarked: “This software system is in the spirit of NCLB and IES to make curriculum, professional development, and other policy decisions based on rigorous research. This would be an improvement over other systems that districts and schools use that mostly generate tables.”

While current data-driven decision-making systems provide tabular information or comparisons in the form of bar graphs, the software to be developed, an enhancement of our current MeasureResults™ program, will help school personnel create appropriate research designs by following a guided decision process. It will then provide access to a web-based service that uses sophisticated statistical software to test whether a new program produces different results from the school’s existing programs. The reviewer added that the web-based system instantiates a “very good idea to provide [a] user-friendly and cost-effective software system to districts and schools to insert data for evaluating their own programs.” Another reviewer agreed, noting that “The theory behind the tool is sound and would provide analyses appropriate to the questions being asked.” The reviewer also remarked that “…this would be a highly valuable tool. It is likely that the tool would be widely disseminated and utilized.” The company will begin deploying early versions of the software in school systems this coming fall.
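
To illustrate the kind of comparison such a service automates, here is a minimal sketch of testing whether a new program differs from an existing one, assuming hypothetical student-level data with a pretest covariate; the file name, column names, and model are illustrative assumptions, not Empirical Education’s actual implementation.

```python
# Minimal, hypothetical sketch of the comparison described above: does a new
# program produce different results from the existing one, after adjusting for
# a pretest? File and column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student: posttest score, pretest score, and which program they used.
df = pd.read_csv("student_scores.csv")  # columns: posttest, pretest, program

# ANCOVA-style regression: the program coefficient estimates the adjusted
# difference between the new program and the existing one.
model = smf.ols(
    "posttest ~ pretest + C(program, Treatment(reference='existing'))", data=df
).fit()

print(model.summary())
```

An ANCOVA-style adjustment for a pretest is a common choice for this kind of comparison, but the post does not describe the tool’s actual statistical machinery.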

2008-05-22

IES Holds its First Workshop on How States and Districts Can Evaluate Their Interventions

Empirical Education staff members participated in a workshop held in Washington on April 24 and sponsored by the U.S. Department of Education’s Institute of Education Sciences (IES). The goal of the event was to encourage “locally initiated impact evaluations.” The primary presenter, Mark Lipsey of Vanderbilt University, drew on his extensive experience with rigorous evaluations to illustrate and explain the how and why of local evaluations. David Holdzkom of Wake County (NC) provided the perspective of a district whose staff has been conducting rigorous research on its own programs. Empirical Education assisted IES in preparing for the workshop by helping to recruit school district participants and presenters. Of the approximately 150 participants, 40% represented state and local education agencies; the other 60% came from universities, colleges, private agencies, R&D groups, and research companies. An underlying rationale for the workshop was the release of an RFP on this topic. Empirical Education expects to work with a number of school districts on their responses to this solicitation, which are due October 2, 2008.

2008-05-01

Two-Year Study on Effectiveness of Graphing Calculators Released

Results are in from a two-year randomized controlled trial of the effect of graphing calculators on Algebra and Geometry achievement. Two reports are now available for this project, which was sponsored by Texas Instruments. In year 1 we contrasted business-as-usual instruction in the math classes of two California school districts with classes equipped with sets of graphing calculators and led by teachers who received training in their use. In the second year we contrasted calculator-only classrooms with those also equipped with a calculator-based wireless networking system.

The project tracked achievement through state and other standardized test scores, and implementation through surveys and observations. For the most part, the experiment could not discern an impact of providing the equipment and training for the teachers. Data from the surveys and observations make clear that the technology was not used extensively (and by some teachers, not at all), suggesting that training, usability, and alignment issues must be addressed in adopting this kind of program. There were modest effects, especially for Geometry, but these were often not found consistently across the two measurement scales. In one case, contradictory results for the two school districts suggest that researchers should use caution in combining data from different settings.

2008-03-01

At NCES Conference, Empirical Education Explains a Difficulty in Linking Student and Teacher Records

The San Francisco Bay Area, our “home town,” was the site of the 2008 National Center for Education Statistics (NCES) conference. From February 25 to 29, educators and researchers from all over the country came to discuss data collection and analysis. Laurel Sterling and Robert Smith of Empirical Education presented “Tracking Teachers of Instruction for Data Accuracy and Improving Educational Outcomes.” Their topic was the need to differentiate between the teachers who actually provide instruction to students and the teachers with whom those students are officially registered. They explained that in our research we keep track of the “teacher of instruction” vs. the “teacher of registration.” Without this distinction, we are unable to properly identify the student clusters or associate student growth with the right teacher. Sharing instructional responsibilities within a grade-level team is common enough to be of concern: in a large experiment involving teachers in grades 4 through 8, 17% reported teaching students who were not assigned to them on the official class roster. The audience was lively and contributed to the topic during the question period. One district IT manager indicated that there is movement in this direction even at the state level. For a copy of the presentation, send an email to Robin Means.
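
As an illustration of the distinction they describe, the following minimal sketch (with hypothetical tables and column names) links each student to the teacher of instruction when one is reported and falls back to the teacher of registration otherwise.

```python
# Hypothetical example of distinguishing the "teacher of registration" (from the
# official roster) from the "teacher of instruction" (reported by teachers).
import pandas as pd

roster = pd.DataFrame({
    "student_id": [101, 102, 103],
    "teacher_of_registration": ["Teacher_A", "Teacher_A", "Teacher_B"],
})
reported = pd.DataFrame({
    "student_id": [102],                      # student 102 is actually taught by Teacher_B
    "teacher_of_instruction": ["Teacher_B"],
})

linked = roster.merge(reported, on="student_id", how="left")
# Cluster students under the teacher of instruction when reported; otherwise
# fall back to the roster, so growth is attributed to the right teacher.
linked["analysis_teacher"] = linked["teacher_of_instruction"].fillna(
    linked["teacher_of_registration"]
)
print(linked[["student_id", "analysis_teacher"]])
```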

2008-02-29

Empirical Education Joins the What Works Clearinghouse Research Team

Mathematica Policy Research, Inc. has subcontracted with Empirical Education to serve as one of the research partners on the new What Works Clearinghouse (WWC) team. This week, Empirical research staff joined a seminar to talk through the latest policies and criteria for judging the quality and rigor of effectiveness research.

Last summer, the Department of Education granted leadership of the WWC (formerly led by AIR) to Mathematica, which put together a team consisting of Empirical, RAND, SRI, and a number of other research organizations. This round of work is expected to place a greater emphasis on outreach to schools, industry, and other stakeholders.

2008-01-29

Maui Schools Sign Subscription Agreement for Empirical Education Research Services

Empirical Education will be providing research services to the Maui School District through an innovative subscription arrangement for MeasureResults™, an Internet-based research and consulting offering. The initial application of this service will be investigations of the longer-term impact of the Cognitive Tutor program that was implemented under a Math and Science Partnership program grant.

The company’s MeasureResults service is a response to the ever-increasing demands on school systems to validate their program and spending decisions with the analysis of solid data. Most districts do not have the staff and facilities to organize data and run complex statistical analyses. In Maui, the service will take advantage of the sophisticated data warehousing capabilities being put in place statewide. MeasureResults is designed to simplify the technical and logistical steps of conducting experiments by building powerful, verified analytical techniques into an uncomplicated framework. The offering includes consultative services on research design as well as web-based interfaces to gather data, automate analysis, and generate reports. MeasureResults is bundled with technical support that includes review of all analyses and reports by trained statisticians.

2007-12-17

Empirical at 2008 AERA Conference

Acceptance notifications are in! Empirical Education will have a strong showing at the 2008 American Educational Research Association (AERA) conference, to be held in New York on March 24–28, 2008. Empirical staff will be presenting their research and findings in several divisions and special interest groups, including School Evaluation and Program Development; Learning and Instruction; Survey Research in Education; and Measurement and Research Methodology: Quantitative Methods and Statistical Theory. Our six presentations will cover a variety of topics.


View pictures of Empirical Education’s reception at The Chocolate Bar in NYC.

2007-12-15

Empirical Education Pilots Workshop at REL-NEI Regional Meeting

Providence, RI, was the site of REL Northeast and Islands’ 2007 Regional Meeting on Teacher Quality. Empirical Education staff headed to the East Coast to pilot the “Becoming Good Consumers of Research” workshop for an audience of about 30 education researchers, school administrators, university professors, and other education professionals interested in using research to inform school decisions. Gloria Miller, the company’s director of evaluation design, facilitated group discussions that touched on identifying different types of research and potential sources of bias. Gloria also provided participants with a Critical Reader’s toolkit designed to help readers evaluate the trustworthiness and relevance of various pieces of research. “Consumers of Research” is the first in a planned series of workshops focused on increasing the understanding and use of research in schools. The next round of workshops, scheduled for March 2008, is currently being developed.

2007-11-14

Report Released on the Effectiveness of Carnegie Learning’s Bridge to Algebra

Empirical Education released the results of a year-long randomized experiment on the comparative effectiveness of Carnegie Learning’s Bridge to Algebra program. This study, like that of Carnegie Learning’s Cognitive Tutor for Algebra (see news for May 24, 2007), was conducted in cooperation with the Maui School District in Hawai’i and funded through a grant to Empirical Education from the U.S. Department of Education. The experiment could not discern an overall difference between the program and control groups as measured by either NWEA’s test of general math or the test’s algebraic operations subscale. However, for the algebraic operations outcomes (but not for general math), we found that students scoring low before participating in Bridge to Algebra benefited significantly more from the program’s algebraic operations instruction than did students with high initial scores. The district was specifically interested in how the different ethnic groups, particularly the Hawaiian/Part-Hawaiian and Filipino students, performed in the new program. Controlling for pretest, we did not find that the program had a different effect for different ethnicities. The district was also interested in learning whether the program was differentially effective for students taught by certified teachers versus those taught by non-certified teachers. For the overall score (but not the algebraic operations sub-strand), we found that the program gave the non-certified teachers an advantage. Finally, despite some implementation challenges, teachers reported a generally positive attitude toward the new program.
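
The moderator findings above (a pretest-by-program interaction for algebraic operations, and a certification-by-program interaction for the overall score) can be sketched with simple regression models. The file and column names below are assumptions for illustration only; the report’s actual analyses likely also account for the clustering of students within teachers.

```python
# Hypothetical sketch of the moderator analyses summarized above; column and file
# names are assumptions, and the report's models likely also handle clustering.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bridge_to_algebra.csv")
# Assumed columns: alg_ops_post, alg_ops_pre, overall_post, overall_pre,
#                  treatment (1 = Bridge to Algebra, 0 = control), certified (1/0)

# Did lower-scoring students benefit more? A negative pretest-by-treatment
# interaction for algebraic operations would indicate so.
pretest_model = smf.ols("alg_ops_post ~ alg_ops_pre * treatment", data=df).fit()
print(pretest_model.params["alg_ops_pre:treatment"])

# Was the program differentially effective with non-certified teachers on the
# overall score? The treatment-by-certified interaction captures this.
cert_model = smf.ols("overall_post ~ overall_pre + treatment * certified", data=df).fit()
print(cert_model.params["treatment:certified"])
```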

2007-10-15

REL West Calls on Empirical for Assistance with Experiment on New Economics Program

WestEd, which holds the Regional Educational Laboratory contract for the western region (REL West), has contracted with Empirical Education to handle the operations of a large randomized experiment involving over 75 high school economics teachers throughout California and Arizona. With a September 2007 start, the project has called for a very rapid start-up by Empirical Education staff, including delivering and processing approximately 40,000 student tests and surveys, acquiring data from over 50 school districts, and conducting web-based surveys of 84 teachers. The Problem-Based Economics program, developed by the Buck Institute for Education, is being tested in the context of a single-semester course. During the 2007-2008 school year, two cohorts of students will take the Test of Economic Literacy (TEL) as well as performance assessments and attitudinal surveys to test the impact of the new program compared with the materials in use in the classrooms of control teachers.

2007-09-06