Measuring Outcomes & Evaluation

A quality program has a system for measuring outcomes and using that information for ongoing program planning, improvement, and evaluation.


A quality program has clearly defined goals and has identified specific outcomes to measure progress towards its goals. Goals and outcomes are aligned with the essential elements of a quality program, and the program has a plan for regularly gathering data and evaluating performance against its chosen outcomes. Evaluations should include assessment of program activities, staff performance, and student engagement. Evaluations should be based on quantitative data collection as well as qualitative feedback from staff, participants, families, and other key stakeholders. Evaluation findings should be used to shape plans for future program improvement and professional development.


Indicators & Performance Levels

A quality program:

Performance Levels
Rate your program on each of the indicators using the following system:

1 Must Address and Improve / Standards Not Met
2 Some Progress Made / Approaching Standard
3 Satisfactory / Meets Standards
4 Excellent / Exceeds Standards

Organizations are expected to strive for a satisfactory performance level (3) on all of the quality indicators.

Over time, programs should continue to strive for an excellent performance level (4).
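For programs that want to keep a running record of their self-assessment, the sketch below shows one possible way to tally ratings and flag indicators that fall below the satisfactory level. It is a minimal, hypothetical example; the indicator labels, scores, and threshold are illustrative assumptions, not part of the tool itself.

# Hypothetical sketch: tally self-assessment ratings on this element's
# indicators and flag any that fall below the satisfactory level (3).
# Indicator labels and scores are illustrative assumptions.

SATISFACTORY = 3

ratings = {
    "1. Measurable goals aligned with mission": 3,
    "2. Evaluation plan with qualitative and quantitative data": 2,
    "3. Measures participant progress": 4,
    # ...ratings for the remaining indicators would follow
}

def summarize(ratings):
    """Print each rating, flag indicators below satisfactory, and report the average."""
    for indicator, score in ratings.items():
        flag = "" if score >= SATISFACTORY else "  <-- must address and improve"
        print(f"{score}  {indicator}{flag}")
    average = sum(ratings.values()) / len(ratings)
    print(f"Average rating: {average:.1f}")

summarize(ratings)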

1. * Has measurable program goals and intended outcomes that are aligned with the organizational mission, vision, and identified needs.

Performance Level 1

The program goals and intended outcomes are unclear, or the relationship between the program goals and intended outcomes and the organizational mission, vision, and identified needs is unclear. As a result, the goals are not measurable. Staff members are unaware of the mission and the program goals.

Performance Level 2

Broad program goals and intended outcomes generally relate to the organization’s mission. Goals are vague and difficult to measure. The relationship of the goals and intended outcomes to identified needs is vague or unknown. Staff members are aware of the mission, vision, goals, and intended outcomes of the program, but are not clear how their work contributes to meeting them.

Performance Level 3

Program goals and intended outcomes are developed based on the needs of participants. Goals and intended outcomes are specific, measurable, and aligned with and support the organization’s mission. Staff members meet to discuss the goals and intended outcomes of the program and to ensure the program activities work towards meeting the goals and intended outcomes.

Performance Level 4

Program goals and intended outcomes are developed based on the identified strengths and needs of program participants. Goals and intended outcomes are specific, measurable, achievable, relevant, and time-bound (SMART), and are clearly aligned with the organization’s mission. Activities are designed to support both short- and long-term goals. Staff members, participants, families, and other stakeholders are actively engaged in developing, assessing, and evaluating goals and intended outcomes.

2. * Develops and/or plans for program evaluation that includes gathering both qualitative and quantitative data.

Performance Level 1

Program evaluation may occur but is not a planned process. Instead, evaluations are informal and occur irregularly. Evaluations are not consistent over time. Therefore, there is no comparable data for program stakeholders to review.

Performance Level 2

The site director develops a plan for program evaluation. The plan includes collecting only qualitative or quantitative data. The collection methods used are informal, and the site director is solely responsible for gathering data.

Performance Level 3

The site director develops a plan for program evaluation with input from staff and stakeholders. The plan includes collecting both qualitative and quantitative data, and includes all data necessary to report to funders, parents, and other stakeholders. The site director and other staff members use surveys and observations to gather data. The evaluation plan includes a system for using evaluation results, which includes reviewing results prior to and during program planning and while shaping management and operational practices.

Performance Level 4

The site director partners with staff members and stakeholders to develop a plan for ongoing program evaluation. The plan includes collecting both qualitative and quantitative data, and includes all data necessary to report to funders, parents, and other stakeholders, as well as youth development outcomes, academic and cognitive development outcomes, and observable and non-observable aspects of program management and operations. The site director and other staff members use surveys, observations, self-assessment, and other means to gather data, which is then stored in electronic and paper files. The evaluation plan includes a system for using evaluation results to improve the program and inform program decisions.

3. Measures participant progress using quantitative and qualitative data to identify outcomes.

Performance Level 1

Participants’ progress is assessed informally through anecdotal information from participants and/or program staff. Data is rarely recorded.

Performance Level 2

The program has a participant self-report method for measuring participants’ progress. Data is captured, but its accuracy is unknown. Staff members receive anecdotal information verbally from participants, but it is not always recorded.

Performance Level 3

The program measures participants’ progress in a few ways, which may include participant self-report, staff observation, pre-year and post-year surveys, parent surveys, and other tools. Assessment is integrated into the program and informs the development of future activities. Participants and families are informed regularly of their progress. Staff members record anecdotal information received verbally.

Performance Level 4

The program measures each participant’s progress in a variety of ways, including participant self-report, staff observation, pre-year and post-year surveys, and parent surveys. The program is in regular communication with the school and families about the participant’s progress, in accordance with FERPA. Assessment is integrated into the program and informs the development of future activities. Participants and families are informed regularly of their progress. Staff members also ask families and other stakeholders to submit written anecdotal information, which is kept in participants’ files.

4. Identifies and shares promising practices.

Performance Level 1

The site director and staff members do not meet to discuss their work and do not participate in professional development events, so they are unaware of which practices are effective.

Performance Level 2

The site director and staff members meet occasionally to plan and discuss the program activities. Promising practices and related information are shared informally and irregularly. The site director and staff members occasionally participate in professional development events.

Performance Level 3

The site director and staff members meet regularly to discuss program activities and track promising practices by writing successful curricula and activity guides so that those practices can be replicated. Staff members share these practices among themselves and occasionally with colleagues from other sites. The site director and staff members regularly participate in professional development events.

Performance Level 4

The site director and staff members monitor and track promising practices by writing up curricula and activity guides. Staff members regularly share these practices at staff meetings. Staff members also share their promising practices with colleagues from other sites through meetings, listservs, and conferences. The site director and staff members regularly and frequently participate in professional development events.

5. Makes summaries of evaluations and/or other collected data available to the general public.

Performance Level 1

The site director does not alert program stakeholders when an evaluation is conducted. The evaluation summary and related data are not made available.

Performance Level 2

The site director alerts some program stakeholders through informal conversations when an evaluation is conducted. The evaluation summary is available only upon request.

Performance Level 3

The site director alerts all program stakeholders through a formal method of communication, such as an e-mail or newsletter, when an evaluation is conducted. The evaluation summary and related data are posted, and copies are available upon request.

Performance Level 4

The site director alerts all program stakeholders through multiple formal methods of communication, such as e-mail, meeting minutes, and newsletters, when an evaluation is conducted. The evaluation summary findings and related data, including both strengths and challenges, are communicated. The entire evaluation or an executive summary is clearly posted, and copies are distributed to all participants, families, partner organizations, members of the Board of Directors, local principals, and other stakeholders.

6. Creates an internal method for assessing program activities.

Performance Level 1

Occasional feedback is received through informal conversations with participants, families, and other stakeholders to assess program activities.

Performance Level 2

The site director has created or located an internal method for assessing program activities. The method uses one type of assessment (e.g., surveys) and is implemented irregularly. Only the site director reviews the information collected. Sometimes the information is used to inform modifications in program design.

Performance Level 3

The site director, with input from staff members, has created or located an internal method for assessing program activities. The method uses several types of assessment (e.g., surveys, focus groups, verbal feedback) and is implemented regularly. The site director always reviews the information collected, and staff members are encouraged to review the information as well. The information is used to inform modifications in program design.

Performance Level 4

The site director, in collaboration with other staff members, participants, and other program stakeholders, has created or collaboratively decided upon an internal method for regularly assessing program activities. The method uses several types of assessment (e.g., surveys, focus groups, verbal feedback) and is implemented regularly. The site director, staff members, and participants always review the information collected. The information is used to inform regular modifications in program design and delivery. All information collected is stored in paper and electronic files to enable the site director, staff members, and participants to review program progress over time.

7. Creates an internal method for assessing staff performance.

Performance Level 1

The site director occasionally observes staff members’ performance and gives them verbal feedback.

Performance Level 2

The site director has a simple internal method for assessing staff performance. The method uses a one-way assessment (e.g., observation) and is implemented irregularly. The site director shares the collected information with staff members only verbally. Sometimes the information is used to inform modifications in program management and operations.

Performance Level 3

The site director, with input from staff members, has created an internal method for assessing staff performance. The method uses both one-way assessment (e.g., external observation) and two-way assessment (e.g., self-assessment) and is implemented regularly. The site director shares the information collected with staff members. The information is used to inform staff members’ goals for the coming year and to gather suggestions for professional development opportunities.

Performance Level 4

The site director, in collaboration with other staff and program stakeholders, has created an internal method for assessing staff performance. The method uses several types of assessment (e.g., observation, self-assessment) and is implemented regularly. The site director shares the information collected with staff members and asks them to reflect on their own performance. The information is used to inform staff members’ goals for the coming year and to gather suggestions for professional development opportunities. If a staff member receives a negative review, a corrective action plan is developed. All information collected is stored in paper and electronic files to enable the site director to review program progress over time.

8. Creates an internal method for assessing participant engagement levels.

Performance Level 1

The program requires medical forms. No tracking is done to ensure that completed forms are received from all participants. Forms that are submitted are kept on file but rarely used. Therefore, staff members are not always aware of the special health needs of participants.

Performance Level 2

The program requires medical forms, and tracking is done to ensure all forms are received. Forms are kept on file and are reviewed if there is a medical concern or emergency. No review of forms is done to make staff aware of special needs. Staff members may only become aware of an issue during an emergency that prompts them to review a participant’s form.

Performance Level 3

The program requires medical forms and receives them from each participant. Forms are reviewed by staff members and special health needs are flagged; forms are then kept on file. Staff members are informed of relevant special health needs of participants, such as food allergies, at the beginning of each year. Adjustments are made to the program design as necessary based on participants’ health needs. Any information shared with staff members is handled in accordance with confidentiality rules.

Performance Level 4

The program requires medical forms and receives them from each participant. Forms are reviewed by staff members and by a nurse or health specialist, and special health needs are flagged; forms are then kept on file. Staff members are informed of relevant special health needs of participants, such as food allergies, at the beginning of each year and again in the middle of the year. Adjustments are made to the program design as necessary based on participants’ health needs. The site director or other staff members maintain relationships with school nurses to receive updates on participants’ health needs as they change. Any information shared with staff members is handled in accordance with confidentiality rules.

9. Includes feedback from stakeholders in the program evaluation.

Performance Level 1

Feedback from program stakeholders, such as participants, staff members, families, and community leaders, is not included in program evaluation. Stakeholders are not involved in the evaluation process.

Performance Level 2

Feedback from a few program stakeholders, such as participants and staff members, is included in program evaluation on an ad hoc basis, if they volunteer to speak with the site director or evaluator.

Performance Level 3

Feedback from several program stakeholders, such as participants, staff members, families, and community leaders, is included in program evaluation. Stakeholders are invited to be involved in the evaluation process, and are given the opportunity to speak with the staff leading the evaluation or the evaluator. There is a section in the evaluation dedicated to stakeholder feedback.

Performance Level 4

Feedback from all program stakeholders, including participants, staff members, families, and community leaders, is a critical component in program evaluation and is collected on an ongoing basis. Multiple ways to include stakeholder feedback are included as a part of the evaluation design. Stakeholders have the opportunity to communicate directly with the staff members leading the evaluation and/or the evaluator, and their feedback is embedded throughout the evaluation. Stakeholders also have multiple opportunities throughout the year to review and provide feedback on progress evaluations.

10. Uses evaluation findings for continuous program improvement.

Performance Level 1

Evaluation is conducted on an infrequent basis or not at all. When evaluation is conducted, the site director does not share the findings with staff members. The site director rarely considers the evaluation findings when designing program activities and policies.

Performance Level 2

Evaluation is conducted on an infrequent and/or irregular basis. The site director shares findings with staff members who ask to see them. The site director sometimes reviews the evaluation findings before designing program activities and policies.

Performance Level 3

Evaluation is conducted regularly. The site director shares findings with staff members and program stakeholders. The site director always reviews evaluation findings before designing program activities and policies. The evaluation findings are reflected in changes made to the program design.

Performance Level 4

Evaluation is ongoing and evaluative feedback is collected throughout the year. The staff and program stakeholders are involved in all stages of the process. The site director shares findings and feedback with other staff members and program stakeholders. They discuss and brainstorm ways in which to make improvements to the program throughout the year. The site director and staff members always use the evaluation findings to design program activities and policies. The evaluation findings drive the changes made to the program design.

Suggested Stakeholders

The following stakeholder groups may be appropriate to involve in surveys and focus group discussions around this element:

  • Program Administrators
  • Program Staff
  • Program Participants
  • Parents
  • School Teachers
  • School Guidance Counselors
  • School Principals
  • Staff of Partner Programs
  • Other
Taking Action

RIGHT NOW: ADDRESSED WITHIN THE FIRST 30-60 DAYS OF ASSESSMENT.
The program director meets with staff to revisit the organization’s vision and goals, assess how activities are aligned with the goals, and determine what evidence of success is available.

THIS YEAR: ADDRESSED BY THE END OF THE PROGRAM YEAR.
The director works closely with staff to clarify program goals and to begin to define how they will be measured. Internal methods of assessing program success and staff performance and capturing promising practices are developed. Findings are shared with staff and key stakeholders and inform a plan for continuous program improvement.

NEXT YEAR: ADDRESSED AT THE BEGINNING OF THE NEW PROGRAM YEAR.
The director meets with staff and stakeholders to develop a plan for how to gather and use information to encourage continuous learning and improvement of programming. Staff are trained in evaluation methods (such as data collection, program observations, or interviews) and are involved in designing the evaluation questions. The director shares key findings of the self-assessment process with an external evaluator. The director and staff meet periodically with the evaluator to provide feedback and ask questions.

Try This!

The following activity can help your team suggest priority issues for evaluation and contribute to the evaluation design and/or data collection. You will need plenty of chart paper, markers, and copies of your organization’s mission and goals. Allow about 1 1/2 to 2 hours for this exercise.

  • As a full group, review your organization’s mission and goals.
  • In small groups, have participants brainstorm a list of key aspects of the program to be evaluated. Ask the groups to also think of possible evaluation methods for each program aspect, and ways in which various stakeholders (e.g., youth, school administrators, parents) could be involved in the evaluation process.
  • Record the responses and post each group’s answers. Conduct a gallery walk so participants can read each other’s responses and add ideas. Return to the full group and finalize the lists by eliminating duplicate ideas. Ask the group to prioritize the most important aspects of the program to be evaluated.
  • Debrief the exercise and determine next steps for implementing an evaluation.
Tips for Success

Being Prepared for Evaluation

Having your programs evaluated is less daunting when you make an ongoing effort to be prepared. Here are tips for easing the burden when it’s time to measure your success.

1. Have a Plan 
You can’t measure performance if your goals aren’t clear from the beginning. All of your program’s stakeholders should be clear on what you’re aiming to achieve and how you plan to meet your objectives. There shouldn’t be any surprises when your programs are being evaluated!

2. Create a Logic Model
A logic model is a visual representation of your goals. Logic models have four main parts: inputs, activities, outputs, and outcomes. Inputs are what you need to provide services or create products, such as staff and time. Activities are actions taken, which require inputs, with the goal of fulfilling the objectives set out in your mission statement. Outputs are the direct results of your activities. Outputs are often numerical; for example, 10 families attended a program. Outcomes are the goals you plan to achieve. They should link directly to your outputs. For example, if 10 families attended a program, the outcome is that those families’ literacy has increased. When evaluating your programs, a logic model will provide the outcomes to be measured.
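As a sketch only, the four parts of a logic model can also be captured in a simple data structure so that the outcomes list is ready when you plan an evaluation. The class name and the family-literacy entries below are illustrative assumptions drawn from the example above, not a prescribed format.

# Minimal sketch of a logic model as a data structure with the four parts
# described above. The example entries are illustrative assumptions based
# on the family-literacy example in the text.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)      # resources needed, e.g. staff and time
    activities: list = field(default_factory=list)  # actions taken to fulfill the mission
    outputs: list = field(default_factory=list)     # direct, often numerical, results
    outcomes: list = field(default_factory=list)    # goals you plan to achieve and measure

family_literacy = LogicModel(
    inputs=["program staff", "weekly session time", "reading materials"],
    activities=["family literacy workshops"],
    outputs=["10 families attended the program"],
    outcomes=["participating families' literacy has increased"],
)

# When evaluating the program, the outcomes list supplies what to measure.
for outcome in family_literacy.outcomes:
    print("Outcome to measure:", outcome)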

3. Collect Data (every day!)
While it sounds obvious, collecting data year-round should be viewed as a priority for your program’s success. Investing time in collecting key information, such as daily attendance records, will save you from feeling pressured when it comes time to report statistics. By having records organized and centrally located, you will always be ready for a program evaluation. One possible approach is sketched below.
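The sketch below shows one way daily attendance could be appended to a single, centrally located file so that records stay organized for reporting. The file name, column names, and participant names are assumptions for illustration, not a required format.

# Hypothetical sketch: append each day's attendance to one central CSV file
# so records are organized and ready when it is time to report statistics.
# The file name and column names are assumptions for illustration.
import csv
from datetime import date
from pathlib import Path

ATTENDANCE_FILE = Path("attendance.csv")

def record_attendance(participants_present):
    """Append one row per participant present today, creating the file if needed."""
    new_file = not ATTENDANCE_FILE.exists()
    with ATTENDANCE_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "participant"])
        for name in participants_present:
            writer.writerow([date.today().isoformat(), name])

record_attendance(["A. Rivera", "J. Chen", "M. Okafor"])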