New report and model legislation offer a positive alternative to today's poor use of student data and punitive approach to accountability
EAST LANSING, Mich. (Oct. 22, 2013) – Can we reverse the wrong course on data and accountability? Data-driven improvement and accountability strategies in U.S. education are generally inferior to those of other countries, say Boston College professors Andy Hargreaves, the Thomas More Brennan Professor of Education in the Lynch School of Education, and Henry Braun, the Boisi Professor of Education and Public Policy in the Lynch School of Education. A new brief released today shows how most U.S. education metrics are imposed from above with the threat of punitive consequences, and are harmfully unbalanced and disconnected from the main goals of schooling.
In their report, Data-driven Improvement and Accountability, published by the National Education Policy Center (NEPC) with funding from the Great Lakes Center for Education Research and Practice and the Ford Foundation, the authors begin by describing the now-familiar pattern of narrowly defined, high-stakes measures that create "perverse incentives" for educators to narrow the curriculum, teach to the test and allocate their efforts disproportionately to students who yield the quickest test score gains, rather than those with the greatest needs. They then offer a series of recommendations for how data can be used more effectively and more humanely.
Data-driven improvement and accountability (DDIA) strategies in schools and school systems are now widespread. When used thoughtfully, DDIA provides educators with valuable feedback on their students' progress by pinpointing where the most useful interventions can be made. Thoughtful uses of DDIA also give parents and the public accurate and meaningful information about student learning and school performance.
However, in the United States, measures of learning are usually limited in number and scope, and the consequences for schools and teachers of apparently poor performance are often punitive. The result is double jeopardy: narrow measures compounded by harsh consequences.
The flawed use of DDIA in much of U.S. education has significant ramifications. "When accountability is prioritized over improvement, DDIA neither helps educators make better pedagogical judgments nor enhances educators' knowledge of, and relationships with, their students. Instead of being informed by the evidence, educators become driven to distraction by narrowly defined data that compel them to analyze dashboards, grids and spreadsheets in order to bring about short-term improvements in results," Hargreaves and Braun caution.
"Schools are driven to place too much emphasis on test scores that capture what can be easily measured. In doing so, they neglect other important skills and qualities that are difficult to quantify. Numerical data should also be combined with the collective professional judgment of teachers if effective decisions are going to be made that truly promote students' learning and development," Hargreaves said.
To ensure that student improvement becomes the main driver of DDIA – and not simply an afterthought to accountability concerns – Hargreaves and Braun offer two key recommendations.
The report's authors draw comparisons with strong and weak uses of data in business and in professional sports. Thoughtful application of DDIA in those sectors, employing a wide range of measures that are valued by all participants and that are both defined and discussed in a spirit of individual and collective responsibility, has given rise to sustainable, high-level performance of enterprises and sports teams. The authors point out the implications for public education.
"Policymakers must take responsibility for assuring the public not only that the range and quality of the indicators used for improvement and/or accountability are adequate to these tasks, but also that educators have sufficient resources and support to effectively implement improvement strategies," Braun said.
Hargreaves and Braun conclude in their report, "Expertise has no algorithm. Wisdom does not manifest itself on a spreadsheet. Numbers must be the servant of professional knowledge, not its master. Educators can and should be guided and informed by data systems; but never driven by them."
A companion report, Model Legislative Language for Comprehensive Assessment and Accountability, authored by attorney Kathy Gebhardt, is also included.
Both reports were produced by the National Education Policy Center (NEPC) with funding from the Great Lakes Center for Education Research and Practice. The Ford Foundation provided additional funding for the policy report.
Find the legislative brief and model legislation on the Great Lakes Center website:
The brief and model legislation can also be found on the NEPC website:
The mission of the Great Lakes Center for Education Research and Practice is to support and disseminate high quality research and reviews of research for the purpose of informing education policy and to develop research-based resources for use by those who advocate for education reform.
Find us on Facebook at: https://www.facebook.com/GreatLakesCenter.