Putting Data to Use

Where do I begin in analyzing my data?

Once you have chosen the right evaluation tools for your prevention program, trained your staff to use the tools, and collected data from your program participants, you will find yourself faced with the challenging task of analyzing your data. Programs collect data for a variety of reasons, but most importantly, data should be gathered to inform program improvement efforts. You should never ask participants to fill out surveys, participate in focus groups, or be the subject of observational assessments if the resulting data are not used in a meaningful way.

Knowing how to make the best use of data will not only advance the knowledge, skills, and abilities of your staff and improve program practice; it can also help make a powerful case for additional funds and support for prevention.

A good first step is to construct graphs and tables that present your data in ways that are both understandable and compelling. At a minimum, your tables and charts should state the number (N) of subjects who participated in the evaluation and the average scores (Mean), when applicable. When reporting survey scores, it is useful to know and report whether the differences between pre- and post-test scores were statistically significant; in other words, whether the differences between means could have happened by chance or were the result of services. The results of a t-test can give you that information. Microsoft Excel has functions for creating graphs and generating basic descriptive statistics, including t-tests*.
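If you prefer to script the calculation, or want to double-check Excel's output, the same paired comparison can be run in a few lines of Python. The sketch below is a minimal, hypothetical example: the pre- and post-test scores are made-up placeholders, and a real analysis would load your program's actual survey data.

```python
# Paired t-test on pre/post survey scores (hypothetical data).
# Requires SciPy: pip install scipy
from statistics import mean
from scipy.stats import ttest_rel

# Made-up scores for the same eight participants, before and after services.
pre =  [2.1, 2.8, 3.0, 2.5, 2.2, 3.1, 2.7, 2.4]
post = [2.9, 3.2, 3.4, 2.6, 3.0, 3.5, 3.1, 2.8]

print(f"N = {len(pre)}")                # number of participants
print(f"Pre mean  = {mean(pre):.2f}")   # average pre-test score
print(f"Post mean = {mean(post):.2f}")  # average post-test score

# ttest_rel compares paired observations; a small p-value (commonly < .05)
# suggests the pre/post difference is unlikely to be due to chance alone.
t_stat, p_value = ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

In Excel itself, the equivalent paired comparison is available through the T.TEST worksheet function or the Analysis ToolPak's paired two-sample t-test.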

Next, you and your team should be prepared to discuss what you see, especially if some of your results vary from what you desired or expected. A few of the questions you might ask include:

Was the Evaluation Process Effective?

  • Were the evaluation instruments appropriate for measuring the targeted outcomes?
  • Were the instruments appropriate for the population we served?
  • Did we administer instruments according to established protocols?
  • Were there data-entry errors?
  • Did we collect quantitative and qualitative data from other sources? Were the findings similar?

Was Implementation Effective?

  • Did we implement the program with fidelity? If not, was fidelity weak or were adaptations made?
  • Were staff selected, trained, and supported to provide the services as intended?
  • Was there sufficient intensity and dose of service to produce the desired outcomes?
  • Did participants report satisfaction with services?

Were Expectations Realistic?

  • Were we serving the population the program was designed for (and tested on)?
  • Did the program model we used address the problems we were trying to solve?
  • Were our benchmarks too high? Too low? What information did we use to set benchmarks?

This line of inquiry should begin to reveal what is working, what is not, and why. The conversation should result in recommendations and plans for how the program will adjust services to improve results. This is part of the Continuous Quality Improvement (CQI) process. Outcome reporting and CQI should inform and shape practice and program decisions, including not only what models you are using but how you are delivering them. A fuller discussion of CQI can be found here.

Finally, you will need to disseminate your findings. Share what you have learned with interested stakeholders and those who make policy and funding decisions. The prevention story needs to be told, and well-presented evaluation reports can make a powerful case to gain additional funds and support for prevention.

Where can I find information on reporting evaluation results?

FRIENDS has developed resources in various areas about data utilization and reporting. The FRIENDS Online Learning Center includes courses on both data management and CQI. To learn more, visit http://friendsnrcelearning.org/. CBCAP State Leads can access technical assistance in Data Utilization by contacting their TA Coordinator.

Collect and Examine Both Qualitative and Quantitative Data

Through your evaluation efforts, you may collect data in many different forms using a variety of methodologies and tools. For example, you may have data that come from surveys, assessments of skill development or developmental change, face-to-face interviews, or observations. Regardless of how the data were collected, they will fall into one of two categories: qualitative data or quantitative data.

Qualitative Data

Qualitative data are descriptive; they are expressed in words.

Sources of qualitative data include such things as progress reports, policy and program documents, and comments provided in response to structured, semi-structured, or open-ended questions on surveys or other sources such as focus groups. Qualitative data may also be collected through observation.

Quantitative Data

Quantitative data are expressed numerically.

Sources of quantitative data include surveys with Likert scales and assessments or reports whose results are expressed in numbers. Examples of quantitative data include survey scores, counts of the number of people served, and the percentage of participants who maintained stable housing, gained employment, etc.
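As a small illustration of how such figures are derived, the sketch below turns a handful of made-up participant records into the counts and percentages described above; the record fields (stable_housing, employed) are hypothetical.

```python
# Summarizing hypothetical participant records into quantitative measures.
participants = [
    {"id": 1, "stable_housing": True,  "employed": True},
    {"id": 2, "stable_housing": True,  "employed": False},
    {"id": 3, "stable_housing": False, "employed": True},
    {"id": 4, "stable_housing": True,  "employed": True},
]

n = len(participants)  # count of participants served (N)
housed = sum(p["stable_housing"] for p in participants)
employed = sum(p["employed"] for p in participants)

print(f"N served: {n}")
print(f"Maintained stable housing: {housed / n:.0%}")
print(f"Gained employment: {employed / n:.0%}")
```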

What is the value in having both qualitative and quantitative data?


Combining compelling stories of individual or community change with statistics can provide a powerful portrait of a program. While you may organize data in any way that is most effective, it is essential to think critically about statistics and stories gathered through evaluation to make sure that they present the picture that most accurately represents the program and participants. Remember, the context within which data were gathered must always be considered when reporting statistics and stories as fact.

FRIENDS offers a guide to reporting qualitative data, Using Qualitative Data in Program Evaluation: Telling the Story of a Prevention Program. The guide includes examples of data-collection activities and reporting, a glossary of terms, and other practical content.

* This web page does not go into depth about statistical analysis. As your program grows and your budget allows, you may want to consult with an external evaluator for more rigorous evaluation. The following resources are available to help you understand the statistical properties of your data:

Seeing Statistics. A web-book resource for learning statistics. Provides a searchable table of contents, glossary, and search function. Available for free. Website: http://www.seeingstatistics.com/

Social Research Methods. Extensive resources for those involved in applied social research and evaluation. The Knowledge Base provides detailed information on all aspects of research including measurement and analyses. Information is appropriate for the advanced or knowledgeable reader. Available for free. Website: http://socialresearchmethods.net/
