Using Research Findings and Practice-based Insights: Guidance for Policy, Practice, and Future Research

2008 
The call for research-based findings to guide policy and practice is growing louder. At the same time, there is heavy, though often publicly unacknowledged, reliance on practice-based insights to guide policy and practice. The KMD process was originally designed to identify what we know and how well we know it in relation to a small number of key topics, purposively drawing on both research-based findings and practice-based insights. Particular approaches, for example to deepen teachers’ content knowledge, that knowledge from both research and practice supported as feasible and effective under a given set of conditions would be prime candidates for dissemination and widespread implementation. Approaches recommended by expert practitioners but not yet systematically studied would be prime candidates for research, and further unpacking would be needed where research and practice reached divergent conclusions. Although this original design for KMD work appeared conceptually sound, it did not play out as intended. We have found that much of the empirical literature on our focus topics is more appropriately characterized as program evaluation than as research. For example, in many cases a study found that a multi-faceted program aimed at deepening teacher content knowledge was effective, but the study did not make it possible to tell how much any individual design feature contributed to the outcome. Practitioner insights could be gathered at a much smaller grain size, but without the assurances of validity that systematic study would provide. A typical conclusion of our work is that a particular piece of practitioner advice is consistent with, rather than strongly supported by, the available empirical evidence.
As a result, we have added a component to our work that goes beyond disseminating what is known and how well it is known to include ideas for enhancing the systematic accumulation of knowledge on key problems of policy and practice. Building on existing studies to advance the field's understanding depends in part on attending to methodological issues. Substantively, many important questions remain open for the broader field to address, but decisions must first be made about where to focus investigations. Which features of interventions are most important to study? What aspects of the area should be measured, and by what means? What background and experience characteristics of teachers might matter most and should therefore be accounted for in study designs? What attributes of the context might influence the generalizability of findings or the replicability of programs? Insights from experienced practitioners can inform decisions about which hypotheses to prioritize. Strategies that experienced practitioners have found effective for particular purposes, often with some sense of why they work, are strong candidates for study. Similarly, practitioners who have worked in a variety of programs often have a sense of the participant characteristics that shape which strategies do and do not work; these variables are strong candidates for inclusion in research models. Finally, practitioners who have worked in multiple settings, or have observed programs scale up to new sites, can suggest features of the context in which interventions take place that might influence the success of a strategy (e.g., rural versus urban settings, extended summer experiences versus academic-year experiences).