Evaluating early intervention programmes: six common pitfalls, and how to avoid them

Authors:
Martin, Jack, et al.
Publisher:
Early Intervention Foundation
Publication year:
2018
Pagination:
30
Place of publication:
London

High-quality evidence on ‘what works’ plays an essential part in improving the design and delivery of public services. The guide outlines six of the most common issues in evaluation design and execution that can undermine confidence in a study’s findings, and explains how they can be avoided or rectified. These are: no robust comparison group, a high drop-out rate, excluding participants from the analysis, using inappropriate measures, using small sample sizes, and a lack of long-term follow-up. For each issue, case studies and a list of useful resources are included. The guide will be useful for evaluators and programme providers, and also aims to help policymakers, practitioners and commissioners to make informed choices. It draws on over 100 in-depth assessments, carried out by the Early Intervention Foundation, of the evidence for the effectiveness of programmes designed to improve outcomes for children. (Edited publisher abstract)
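As an illustration of the ‘small sample sizes’ pitfall named in the abstract, the sketch below shows a simple a priori power calculation for a two-arm evaluation. It is a minimal example using Python’s statsmodels library; the effect size, significance level and power values are illustrative assumptions, not figures taken from the guide.

# Minimal sketch: how many participants per arm are needed to detect a given
# effect with 80% power? Effect size (0.3) and alpha (0.05) are illustrative
# assumptions, not figures from the EIF guide.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_arm = analysis.solve_power(effect_size=0.3, alpha=0.05, power=0.8,
                                 alternative='two-sided')
print(f"Participants needed per arm: {n_per_arm:.0f}")  # roughly 175 per arm

A sample much smaller than this would leave an evaluation underpowered to detect an effect of that size, which is the kind of design weakness the guide warns against.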

Subject terms:
early intervention, evaluation, research methods, research design
Content type:
practice guidance
Location(s):
United Kingdom