
The Planning and Evaluation Corner

The programs and practices in NREPP have been rigorously evaluated. Before a program or practice can be evaluated, however, it must be planned and designed, and before it can be planned and designed, the needs it will address must be identified. This Corner of the Learning Center discusses best practices in needs assessment, planning, and evaluation, and provides extensive additional resources on each. Each of these topics has a large associated literature; from those bodies of research, analysis, and guidance, this Corner offers a selection of materials judged highly relevant to needs assessment, planning, and evaluation for behavioral health programs and practices.

Click on the links below for additional resource materials relevant to planning and evaluation methodology:

Needs Assessment
Needs Assessment and Cultural Competence: Questions to Ask
These questions will help to ensure that your assessment process is inclusive and culturally relevant.

Planning for Effective Program Evaluation
This module provides information on developing logic models and conducting a needs assessment.

Criteria for Analyzing Assessment Data
When setting prevention priorities, communities often find it helpful to analyze their assessment data according to five criteria.
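Whatever the specific criteria, prioritization of this kind usually reduces to rating each problem against each criterion and ranking by a weighted total. A minimal sketch in Python; the criterion names, weights, and ratings below are hypothetical illustrations, not taken from the resource:

```python
# Hypothetical prioritization sketch: rate community problems (1-5) against
# a set of assessment criteria, then rank by weighted total score.
# Criterion names and weights are illustrative only.

CRITERIA_WEIGHTS = {
    "magnitude": 0.3,      # how many people are affected
    "severity": 0.3,       # how serious the consequences are
    "trend": 0.2,          # whether the problem is growing
    "changeability": 0.1,  # how responsive it is to intervention
    "capacity": 0.1,       # community readiness and resources
}

def priority_score(ratings):
    """Weighted sum of 1-5 ratings keyed by criterion name."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())

problems = {
    "underage drinking": {"magnitude": 4, "severity": 3, "trend": 4,
                          "changeability": 3, "capacity": 4},
    "opioid misuse":     {"magnitude": 3, "severity": 5, "trend": 5,
                          "changeability": 2, "capacity": 3},
}

ranked = sorted(problems, key=lambda p: priority_score(problems[p]),
                reverse=True)
```

Here `ranked` lists the problems from highest to lowest priority, making the trade-offs among criteria explicit rather than implicit.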

BEST Training on Evidence Based Practice
This training is ultimately aimed at strengthening the link between research knowledge and practice. The process involves posing specific, well-structured questions; searching effectively and efficiently for the best evidence; evaluating the evidence you identify against research standards and your own agency and professional judgment; and, finally, taking action based on your assessment.

Tools for Implementing an Evidence-Based Approach in Public Health Practice
Easily accessible and time-efficient tools for implementing an evidence-based public health (EBPH) approach to improve population health.

Identifying Needs
A community needs assessment will help you gather information to use for program selection and planning.

Evidence-Based Practice Skills Assessment for Criminal Justice Organizations
The Evidence-based Practice Skills Assessment (EBPSA) is a self-report measurement tool designed to gauge the extent to which correctional staff demonstrate the skills necessary to successfully implement Evidence-based Practices (EBP). This guide will summarize how the EBPSA enhances an organization’s ability to become a more effective evidence-based organization.


The Hexagon Tool
Helps states, communities, and agencies evaluate new and existing interventions via needs, fit, resource availability, evidence, readiness for replication, and capacity to implement.  

Willing, Able, Ready: Basics and Policy Implications of Readiness as a Key Component for Implementation of Evidence-Based Interventions
This brief is one in a series exploring issues related to the implementation of evidence-based interventions.

Centers for Disease Control and Prevention Assessment & Planning Models, Frameworks & Tools

Applying the Strategic Prevention Framework
The SPF reflects a community-based approach to prevention efforts and helps states, tribes, and jurisdictions build the infrastructure necessary for successful outcomes.

Identifying and Selecting Evidence-Based Interventions
The purpose of this guidance is to assist State and community planners in applying the Substance Abuse and Mental Health Services Administration’s (SAMHSA’s) Strategic Prevention Framework (SPF) to identify and select evidence-based interventions that address local needs and reduce substance use problems.

Scaling-up Brief: Exploration Stage
Exploration Stage processes are designed to assure mutually informed agreement to proceed with use of an innovation; both the Implementation Team and the organization understand what is to be done, how it will be done, and the resources and timelines for doing it.

Scaling-up Brief: Cascading Logic Model
The Cascading Logic Model helps states define and operationalize the infrastructure needed for effective statewide implementation of EBPs.

ImpleMap: Exploring the Implementation Landscape
When creating implementation capacity in an organization new to active implementation, the first task is to map the current implementation landscape. The ImpleMap interview process assists implementation specialists in collecting information to inform active implementation planning and development in the organization.

Stages of Implementation Analysis: Where Are We?
When creating Implementation Teams to provide supports that are effective, integrated, efficient, and sustainable, the first task is to map the current implementation landscape. The goal is to build on current strengths and to collect information that informs planning the best path toward developing implementation capacity in the provider organization.

LINKS Syntheses
A synthesis of information from experimentally evaluated programs in the LINKS (Lifecourse Interventions to Nurture Kids Successfully) database provided in fact sheets on Program Population, Program Outcome, and Program Approach.

Psychosocial Interventions for Mental and Substance Use Disorders
A framework for establishing evidence-based standards.

What Research Says About Readiness
Creating readiness for change.

Organizational (Staff) Assessments
Measures aspects of organizational readiness for change.

Using Logic Models, a Key Building Block of Results-Focused Programs
The interview provides an overview of why a clear program description is important, what logic models are and how they’re used, how the CDC is using logic models to clarify grantee proposals, and advice to program leaders and public managers about using logic models.


How to Design Performance Measures to Better Measure Impact
Public leaders can design performance measures to better reflect impact, including through the use of regression-adjusted performance measures.
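The idea behind regression adjustment is to compare each unit's actual outcome to the outcome predicted from factors outside its control, so that units serving harder cases are not penalized for raw numbers alone. A minimal sketch with a single covariate; the site data and variable names are hypothetical:

```python
# Sketch of a regression-adjusted performance measure: compare each site's
# raw outcome to the outcome predicted from a factor outside its control
# (here, average client risk). All data below are made up for illustration.

def ols_fit(x, y):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return my - slope * mx, slope

# Per-site client risk (covariate) and raw success rates (outcome).
risk    = [0.2, 0.4, 0.6, 0.8]
outcome = [0.75, 0.62, 0.55, 0.43]

intercept, slope = ols_fit(risk, outcome)

# Adjusted measure: actual outcome minus the outcome predicted given risk.
# Positive values mean a site outperformed expectations for its caseload.
adjusted = [y - (intercept + slope * x) for x, y in zip(risk, outcome)]
```

Ranking sites by `adjusted` rather than by raw `outcome` can change which sites look strongest, since high-risk caseloads no longer drag a site down mechanically.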

Understanding Evidence
Explains the purpose and meaning of the Continuum of Evidence of Effectiveness, a tool that was developed to facilitate a common understanding of what the Best Available Research Evidence means in the field of violence prevention. This Continuum also provides common language for researchers, practitioners, and policymakers in discussing evidence-based decision making.

Performance Management and Program Evaluation
This video overview is designed to help public leaders and program managers to better understand performance management and program evaluation — their differences and their synergies — so they can more closely integrate these efforts.

Strengthening Evaluation Capacity Within Agencies
The Director of the Office of Planning, Research and Evaluation within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services speaks about improving the effectiveness and efficiency of ACF programs.

Evaluation Tools and Resources
These tools, aids, tip sheets, and other resources are available to support the planning and management, implementation, and data analysis of evaluations.

Logic Model Development Guide
Designed to assist organizations in creating programmatic logic models.

Developing a Logic Model to Guide Program Evaluation
Shows how logic models can be used to inform program planning, implementation, and evaluation.
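A logic model is, at its core, a chain of columns linking resources to results. As an illustration only, the conventional inputs, activities, outputs, and outcomes columns can be sketched as a plain data structure; the example entries are hypothetical:

```python
# Minimal sketch of a program logic model as a plain data structure.
# Column names follow the common inputs -> activities -> outputs ->
# outcomes convention; the entries themselves are hypothetical.

logic_model = {
    "inputs": ["trained facilitators", "curriculum materials", "grant funding"],
    "activities": ["deliver 10 weekly skills sessions", "train school staff"],
    "outputs": ["sessions delivered", "students reached"],
    "short_term_outcomes": ["improved refusal skills"],
    "long_term_outcomes": ["reduced substance use initiation"],
}

def describe(model):
    """Render the logic model as a readable chain, column by column."""
    return " -> ".join(f"{col}: {', '.join(items)}"
                       for col, items in model.items())
```

Writing the model down this explicitly makes gaps visible: an output with no activity feeding it, or an outcome with no output, signals a planning hole before evaluation begins.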

Evaluation Steps
Six connected steps can be used as a starting point for tailoring an evaluation to a particular public health effort at a particular point in time. The steps follow an order; in general, the earlier steps provide the foundation for subsequent progress.

Lessons from Evaluators’ Experiences with Scale
The Harvard Family Research Project spoke with three evaluators to discover how evaluation can inform and assess scaling efforts. They shared lessons from their experiences in evaluating programs as they went to scale.

W.K. Kellogg Foundation Evaluation Handbook
This handbook provides a framework for thinking about evaluation as a relevant and useful program tool. It was written primarily for project directors who have direct responsibility for the ongoing evaluation of W.K. Kellogg Foundation-funded projects.

A Checklist for Building Organizational Evaluation Capacity
The purpose of this checklist is to provide a set of guidelines for organizational evaluation capacity building (ECB), i.e., for incorporating evaluation routinely into the life of an organization.

Criteria for Selection of High-Performing Indicators
The checklist includes practice-based criteria to be considered in the selection of indicators for use in monitoring and evaluation.

Evaluating the Initiative
This toolkit aids in developing an evaluation of a community program or initiative.

Some Methods for Evaluating Comprehensive Community Initiatives

Evaluation Questions and Designs

Change Tool
This page provides all eight action steps and the resources needed to complete the CHANGE tool.

Communicating Evaluation Results
This module will address the process for reviewing and interpreting the data and end with different strategies to communicate results to your stakeholders.

2013 CASEL Guide: Effective Social and Emotional Learning Programs—Preschool and Elementary School Edition
The CASEL Guide provides a systematic framework for evaluating the quality of social and emotional programs and applies this framework to identify and rate well-designed, evidence-based SEL programs with potential for broad dissemination to schools across the United States. The guide also shares best-practice guidelines for district and school teams on how to select and implement SEL programs. Finally, it offers recommendations for future priorities to advance SEL research and practice.

2015 CASEL Guide: Effective Social and Emotional Learning Programs—Middle and High School Edition
The CASEL Guide provides a systematic framework for evaluating the quality of social and emotional programs and applies this framework to identify and rate well-designed, evidence-based SEL programs with potential for broad dissemination to schools across the United States.