Outcome Accountability Process

What is the Outcome Accountability Process?

Outcome accountability is a process of multiple steps, briefly summarized below. Understanding the process will help you plan and implement an evaluation that works for your program.

Identifying Your Service Strategy

What is it you’re planning to do? Are you implementing an evidence-based model with well-established protocols and guidelines for each activity? Or are you innovating, trying out a new set of activities for which no ‘road map’ exists? Maybe your service strategy combines a variety of activities, some with established curricula and some of your own invention. Gather all the information you can and consider every relevant aspect of what you are planning. (Hint: If you are implementing an evidence-based program, there are likely Logic Models already available to help with your preparation.) Be clear about everything that belongs in your program.

When you have worked through this step, you are ready to begin developing your Logic Model. You will likely revise it several times as you work through the steps below.

Developing a Logic Model

Your program’s logic model is the vital conceptual tool that links the needs of the people you work with, what you know from research and experience, the results (or outcomes) you want to achieve, the way you will work together to get those results, and the methods you will use to find out whether those results are being achieved. For more detail, see our section What is a Logic Model?

Choosing Evaluation Methods & Tools

Once you have identified your service or program model (strategy) and the desired results, you need to choose the methods you will use to track activities and progress. Measurement tools can be as simple as staff observation forms and participant satisfaction surveys, or they may include more complex methods such as standardized tests. A variety of surveys and measurement tools are described on the Compendium of Annotated Measurement Tools page.

Consider collecting both quantitative and qualitative data. Quantitative data, which come in the form of numbers, graphs, derived scores, and statistics, tell you whether or how often something occurred; qualitative data, which come in the form of spoken or written words or photographs, tell you why and how it occurred. To learn more about qualitative evaluation, see the guide Using Qualitative Data in Program Evaluation: Telling the Story of a Prevention Program.

Developing an Evaluation Implementation Plan

Once a logic model has been developed and your evaluation methodology and tools have been carefully selected, it is important to have a clear plan that details the steps involved in implementing your evaluation. This is a complex process that may take a great deal of time and effort. A solid plan is usually the result of a team effort that involves stakeholders, including parent-consumers.

Your evaluation plan should answer – at a minimum – the following questions:

Documenting Program Implementation

As part of evaluating your program, you need a clear picture of what is actually happening in its implementation. Documenting your program’s activities and services means being able to identify clearly which steps were taken during implementation, and in what manner and sequence. Where did the steps conform to your plan, and where did they differ?

Documenting how your program was implemented is important to knowing whether it “worked.” A program can fall short of its desired outcomes for a variety of reasons, but two are especially common: either your ideas about the cause of the problem and the best strategies for solving it were wrong, or they were right but the program was not implemented as intended. This issue is particularly important for evidence-based programs, where achieving the predicted results depends on delivering the program with fidelity to the model. But it also holds for other programs: how do you know what to attribute your successes to if you are not clear on what actually happened? Well-documented implementation is key to an evaluation that can tell you not only whether you are achieving the desired results, but also how to continually improve your services.

Writing Reports and Telling Success Stories

Once you have collected and analyzed your data and begun to interpret the findings, you will start to see a picture of what progress has occurred. An important next step is to write up the results and report back to stakeholders – the people, such as staff, participants, funders, and community members, who have a reason to care about your program and its outcomes.

Pausing, Reflecting, and Beginning Again

To complete the cycle, use the feedback from your evaluation process to identify and begin working toward program improvements. What you have learned about your outcomes will help you secure funding and community support, and will also inform both future program plans and the next cycle of your outcome evaluation process.
