This is a companion guide to the report,
Putting It All Together: Guiding
Principles for Quality After-School Programs Serving Preteens.
Principle 6: Continuous Program Improvement
Programs strengthen quality through
an ongoing and integrated process of targeted staff training, coaching
and monitoring, and data collection and analyses.
"Continuous Program Improvement" is the glue that holds all of
the other guiding principles together. Programs that are continually
striving to strengthen quality need to engage in three key practices:
1) continuous and targeted staff training, 2) monitoring and coaching
to support implementation on the ground and 3) data collection and
analysis of program strengths and weaknesses.
Staff training must be an ongoing process that is built
into an organization's culture. While many organizations offer introductory
training to new staff or "refresher training" at the beginning of
the program, providing training throughout the program cycle allows
staff to build upon their prior knowledge and develop further competencies.
Training is not effective unless some form of monitoring confirms
that what staff have learned is being implemented effectively.
Monitoring and coaching fill a gap that sometimes exists
between training and program improvement. Monitoring includes conducting
program observations of staff "in action" and documenting the findings
in a way that allows the information to be shared quickly with staff
for "real time" program improvement.
Program directors and staff must review data from program
observations, attendance records, surveys or other sources for trends
in program strengths and weaknesses. When used well, data collection
can increase the effectiveness of direct-service programs. After
the program has been up and running long enough, staff can begin
to collect and analyze outcome measures. Such measures help staff
think about the program in terms of the benefits participants gain,
rather than the activities offered, and help an agency understand
which practices are more effective than others. Agencies that choose
to collect outcome measures must limit the number of outcome measures
collected and only survey participants if there are resources to
collect and analyze the data (several resources are provided in
the list below for agencies interested in surveying youth about
outcome measures common to youth development programs).
An organization committed to quality will continually assess its
programming in order to improve. Several tools, ranging from self-assessment
tools that can be done by program staff to instruments for external
observers, have been developed to help agencies accomplish this
task. An overview of some of the assessment tools that may be of
most use to after-school programs serving preteens is provided at
the end of this section.
Examples of this Principle in Action
- Staff are supported through ongoing staff development and training.
- Staff are able to translate what they have learned in workshops into practice.
- Program observations are conducted and the information they
yield informs staff training and program improvement.
- Staff development and training is clearly connected to and builds
on the program’s vision, purpose and goals.
- Staff retention is high.
- Program goals and objectives are measurable and meaningful.
- Youth outcome measurement data is collected regularly and used
to influence program improvements.
- Staff evaluations are regularly conducted with staff and used
to guide staff development and program improvement decisions.
- The organization regularly assesses program quality through self-assessment
or outside evaluation.
Tools to Assist with Continuous Program Improvement
The after-school field has begun to define quality through generally
agreed upon guiding principles, including the ones in this guide.
Assessment tools associate those standards with specific indicators.
There are a multitude of different assessment tools because there
are many different types of programs, the nomenclature for the standards
varies, organizations are looking to use tools in different ways and
the tools have been developed simultaneously by different groups.
Although the number of assessment tools can be overwhelming, a
program can benefit from the variety by choosing an assessment tool
that works well with the needs of the organization. While none of
the assessment tools speaks specifically to the six principles in
this guide, most tools will cover all of the principles and allow
an organization to begin the continuous program improvement process.
This section provides a link to a comprehensive review of assessment
tools and a short description of some of the tools likely to be
of the most use to after-school programs serving preteens, including
the Assessing Afterschool Program Practices Tool (APT), the Program
Quality Self-Assessment Tool (QSA), the Quality Assurance System
(QAS) and the Youth Program Quality Assessment (YPQA).
Nicole Yohalem and Alicia Wilson-Ahlstrom put together a comprehensive
guide, Measuring Youth Program Quality: A Guide to Assessment Tools,
which reviews nine assessment tools. They chose only tools that:
1) include setting-level observational measures of quality, 2) are
applicable in a range of school- and community-based program settings
and 3) include a focus on social processes within programs.
This guide compares the purpose, structure, content and technical
properties of the following youth program quality assessment tools:
- Assessing Afterschool Program Practices Tool (APT) developed by the National Institute on Out-of-School Time
- Out-of-School Time Observation Tool (OST) developed by Policy Studies Associates, Inc.
- Program Observation Tool (POT) developed by the National AfterSchool Association.
- Program Quality Observation (PQO) developed by Deborah Lowe Vandell and Kim Pierce.
- Program Quality Self-Assessment Tool (QSA) developed by New York State Afterschool Network.
- Promising Practices Rating Scale (PPRS) developed by the Wisconsin Center for Education Research & Policy Studies Associates, Inc.
- Quality Assurance System (QAS) developed by Foundations Inc.
- School-Age Care Environment Rating Scale (SACERS) developed by Frank Porter Graham Child Development Institute & Concordia University, Montreal.
- Youth Program Quality Assessment (YPQA) developed by High/Scope Educational Research Foundation.
Following is a brief description of four of the nine assessment
tools reviewed in the Yohalem and Wilson-Ahlstrom guide. These four
are highlighted because they do not rely on outside evaluators,
they can be conducted by program staff and they are applicable for
the preteen age group. A fifth assessment tool that is still in
draft form is also highlighted as a reference because of its relevance
for California after-school programs.
Neither the Foundation nor P/PV is endorsing any of
these tools. The following information is intended only as a resource
to agencies that are seeking a tool.
- Assessing Afterschool Program Practices Tool (APT)
The APT was developed by the National Institute on Out-of-School
Time (NIOST) and the Massachusetts Department of Education to
assess 21st Century Community Learning Center Programs. The tool
is designed for well-established programs that serve a stable
group of youth (grades K-8) over an extended period of time.
The APT measures positive program climate, supportive relationships,
practices that support individualized needs and interests, and
practices that stimulate engagement and skill-building. The tool
helps programs identify strengths and weaknesses and set
staff-development and program-improvement goals. There
are two tools within the APT: an observational tool that focuses
on observable program practices and a questionnaire that examines
those aspects of program quality that occur "behind the scenes."
The instrument is designed for users to make observations in one
full program session (afternoon) and is flexible in terms of administration,
use of scales, number of observations, etc. NIOST recommends administering
the tool two or three times throughout the program.
While the tool has been primarily used by the Massachusetts Department
of Education, the developers have recently made it more widely
available. The tools themselves are free and downloadable once
a program has completed necessary training. Training is available
through an online tutorial or in person. For more information,
visit http://www.niost.org/Training-Descriptions/the-assessment-of-program-practices-tool-apt or call 781-283-2546.
- Program Quality Self-Assessment Tool (QSA)
The Program Quality Self-Assessment Tool (QSA) was developed
by the New York State Afterschool Network (NYSAN) with the 21st
Century Community Learning Centers in mind. It was designed for
school- and community-based programs offering a broad range of
services (serving K-12), rather than those offering a single activity.
After the instrument was completed in 2005, New York State began
requiring that all 21st CCLC-funded programs use it twice a year
for self-assessment purposes. While the instrument can be downloaded
from the Internet, it is intended to be printed and used on paper.
It was designed for self-assessment purposes only, not for external
assessment or evaluation. It can be used as a planning tool or
for ongoing program improvement. The tool is organized into 10
essential elements of effective after-school programs: 1) Environment/Climate,
2) Administration/Organization, 3) Relationships, 4) Staffing/Professional
Development, 5) Programming/Activities, 6) Linkages Between Day
and After-School, 7) Youth Participation/Engagement, 8) Parent/Family/Community
Partnerships, 9) Program Sustainability/Growth and 10) Measuring
Outcomes/Evaluation. Each element has a list of standards of practice
that provide guidelines for ensuring quality outcomes. It is intended
that program staff rate each indicator based on their observations
of the program.
The tool is free and downloadable from: www.nysan.org/section/quality.
Programs can call or email NYSAN for more information on using
the QSA Tool and to receive technical assistance: Jennifer Siaca,
Coordinator, NYSAN, 646-943-8672, email@example.com.
- Quality Assurance System (QAS)
The Quality Assurance System was developed by Foundations,
Inc. to help after-school programs conduct quality assessment
and continuous improvement planning. It was designed to be general
enough for use in a range of school- and community-based programs
serving grades K-12.
It focuses on quality at the site level. Programs using the QAS
start with an initial assessment from which observers identify
areas in need of improvement and develop specific improvement
strategies. Programs then do follow-up assessments. Foundations,
Inc. recommends administering the QAS twice a year: an initial
assessment at the beginning of the year and a follow-up assessment
toward the end of the year. Training observers should take about
two to three hours, and the observations take place in a single visit.
The QAS can be filled out online. A summary report and charts
and graphs are automatically generated from the data. The assessment
tool can be tailored to the actual goals and activities of the
program. It is based on seven "building blocks" that Foundations,
Inc. considers fundamental features that underlie effective after-school
programming: 1) program planning and improvement, 2) leadership,
3) facility and program space, 4) health and safety, 5) staffing,
6) family and community connections and 7) social climate. In
addition, there are three "program focus building blocks" for
users to select from: 1) academics, 2) recreation and 3) youth
development. Programs vary and may offer one, two or all three
focus areas and are assessed accordingly.
There is a $75 annual site license fee. For more information,
contact Foundations, Inc. at firstname.lastname@example.org
or go to http://qas.foundationsinc.org/start.asp?st=1.
See an article by the Harvard Family Research Project to find
out more about the QAS: http://www.hfrp.org/var/hfrp/storage/original/application/c2dda25c82b71415c1ea669ffbf55925.pdf.
- Youth Program Quality Assessment (YPQA)
The YPQA was developed by the High/Scope Educational Research
Foundation. It was designed to evaluate the quality of youth programs
and identify staff-development needs for structured programs serving youth.
The overall purpose of the YPQA is to encourage individuals, programs
and systems to focus on the quality of the experiences young people
have in programs and the corresponding training needs of staff.
The instrument may be used by a trained independent rater such
as an outside consultant, researcher, program evaluator or agency
administrator. It may also be completed by a team of program staff,
such as supervisors, youth workers, teachers, social workers,
curriculum directors and parents. It focuses primarily on those
features of a program that can be observed and that staff can
change. The seven major areas covered include: 1) Engagement,
2) Interaction, 3) Supportive Environment, 4) Safe Environment,
5) Youth-Centered Policies and Practices, 6) Policies and Practices
and 7) High Expectations and Access. Information is gathered through
observation and interviews and used to score the quality indicators.
Item scores are combined to create an overall program quality score.
There are three pieces to the tool:
an administration manual, Form A (which focuses on youth experiences
during program sessions) and Form B (which assesses the organization's
infrastructure). High/Scope has several training options, ranging
from one- to three-day trainings, designed to support different
uses of the instrument. To order the tool, call 800.40.PRESS.
For more information, see http://www.highscope.org/Content.asp?ContentId=117.
A meeting report written by the Forum for Youth Investment discusses
the YPQA: http://www.forumfyi.org/node/330
Karen Pittman, Executive Director of the Forum for Youth Investment,
describes why the YPQA makes such a critical contribution to youth
programs nationwide: http://www.forumfyi.org/node/113
- California Afterschool Program Quality Self-Assessment Tool
The California Afterschool Program Quality Self-Assessment Tool was developed by
the After School Programs Office of the California Department
of Education (CDE) and the California Afterschool Network. It was designed for after-school programs
in California as they strive to achieve and maintain quality programs
and meet the statutory requirements (program, fiscal, monitoring
and evaluation) that are unique to the state. It is an internal self-assessment tool that helps program staff and stakeholders assess the degree to which the program is being effectively implemented and work collaboratively to improve policies, procedures and practices.
The QSA Tool is organized into 11 quality elements: 1) Program Design & Assessment, 2) Program Administration & Finance, 3) Community Partnerships & Collaboration, 4) Alignment & Linkages with the School Day, 5) Program Environment & Safety, 6) Youth Development, 7) Staff Recruitment & Professional Development, 8) Family Involvement, 9) Nutrition & Physical Activity, 10) Promoting Diversity, Access, Equity, & Inclusion, and 11) Effectively Supporting English Learners. The indicators in each content area include research-based
"quality indicators" that guide programs toward quality outcomes,
as well as indicators that will support programs in meeting statutory,
fiscal and audit program requirements. The tool uses a four-point
scaled rating system for each indicator. The tool is free and downloadable from http://www.afterschoolnetwork.org/qsatool.
Questions should be directed to the After-School Partnership Office.
If none of the assessment tools detailed above is appropriate for
your program, other tools are available. The Harvard Family Research
Project has compiled several lists of assessment tools in recent
years. The most recent list can be found in: Little, Priscilla
M. Harvard Family Research Project. June 2007. The Quality of School-Age
Child Care in After-School Settings. Child Care & Early Education:
Research Connections. No. 7. http://www.researchconnections.org/childcare/resources/12576
A more detailed chart, "After-School Program Quality Assessment
Breakdown," including the assessment tool, the tool developer, the
age targeted, the format and some notes, was created during a Harvard
Family Research Project after-school conference in 2005. It has
information for most of the tools listed in the more recent article
cited above. http://www.hfrp.org/var/hfrp/storage/fckeditor/File/summit-2005-breakdown.pdf
Where to Go for More Information
Achieving Quality in the DYCD Out-of-School Time Initiative: Strategies from 15 Programs (2010)
In September 2005, New York City's Department of Youth and Community Development launched a large-scale initiative to provide young people throughout the city with access to after-school programming. Basic attendance and program data were collected for thousands of students who attended more than 100 participating programs. In-depth methods, including surveys, observations and interviews, were employed at 15 program sites. Over the course of the evaluation, researchers identified program features that were positively associated with high levels of program participation and with desirable social and academic outcomes, including: (1) rich program content and exposure to new experiences; (2) opportunities for youth to interact positively with peers and with staff; and (3) effective supports for staff that equip them to meet the needs of participating youth. The report also describes successful program strategies for implementing these features. For example, in one program in which researchers observed a marked improvement in staff-to-student relationships over two years, the program had begun offering intentional opportunities for youth to express their feelings and concerns. For more information on the DYCD initiative, visit http://www.nyc.gov/html/dycd/html/afterschool/ost_evaluation.shtml.
Author/Publisher: Russell, Christina, Karen Walking Eagle and Monica B. Mielke. Policy Studies Associates.
Final Report on the Palm Beach Quality Improvement System Pilot: Model Implementation and Program Quality Improvement in 38 After-School Programs (2008)
This report reviews the results of a pilot Quality Improvement System (QIS), created by a consortium of out-of-school-time providers and an intermediary in Palm Beach County, Florida. The consortium first identified clear goals, then developed a research instrument (the Quality Program Assessment) to measure areas in which programming lived up to or fell short of these goals. The assessments fed directly into the creation of a Program Improvement Plan, a unique procedure developed by each site to target its specific areas of weakness. A total of 38 after-school programs participated in the pilot round of the QIS, serving youth from elementary through high school. Researchers found that, over the two-year pilot program, quality scores generally increased from the baseline to the post-pilot, with the greatest improvements observed in areas targeted for improvement. Researchers identified several strategies that have important implications for the design and deployment of quality improvement initiatives, including intense participation from staff, a low-stakes approach to accountability, and a focused link between performance data and guided planning.
Author/Publisher: Smith, Charles, Tom Akiva and Juliane Blazevski. The David P. Weikart Center for Youth Program Quality, HighScope Educational Research Foundation.
Supporting Success: Why and How to Improve Quality in After-School Programs (2008)
Sheldon and Hopkins use the evaluation of the Communities Organizing Resources to Advance Learning (CORAL) initiative, a multisite after-school program in California, to discuss using continuous program improvement to support staff and improve the quality of services. Reading this report will help a practitioner gain understanding of how a program can use staff training, monitoring and coaching, and data analysis together to improve program quality. A similar version of this piece, with more extensive data analysis, can be found in the American Journal of Community Psychology (2010) 45: 394-404.
Author/Publisher: Sheldon, Jessica and Leigh Hopkins. Philadelphia: Public/Private Ventures.
Inside the Black Box: Assessing and Improving Quality in Youth Programs (2010)
There is growing evidence that the level of program quality is directly related to youth outcomes at after-school programs. In the past few years, several tools have been developed to allow program staff and/or researchers to assess program quality in key areas. This paper compares and contrasts different tools that are currently being used in the field, including a description of the tools' intended purposes, how quality is defined within the tools, and different aspects of their structure and methodology. This paper helps practitioners, researchers and policymakers interested in quality assessment to identify the most appropriate tool for a given purpose and context. The authors report more extensively on this topic in Measuring Program Quality: A Guide to Assessment Tools, which can be found at http://www.forumforyouthinvestment.org/node/297
Author/Publisher: Yohalem, Nicole, and Alicia Wilson-Ahlstrom. American Journal of Community Psychology 45: 350-357. (Note: This publication is not available for free online. It may be available through research library databases.)
Promising Practices in Out-of-School Time Professional Development (December 2007)
This brief encourages after-school programs to take a more intentional approach to professional development in order to align staff training with program goals. The authors encourage program directors to be thoughtful about professional development goals, incorporate staff feedback into training, use professional development standards from other fields to frame a long-term strategy, and incorporate methods to evaluate the effect of professional development on program quality. Though professional development in the after-school field is often synonymous with workshops, other approaches can make it a richer endeavor, including peer mentoring, small learning communities and observation. The document includes references to other tools that may be useful in developing a professional development strategy.
Author/Publisher: Out-of-School Time Resource Center.
Outcome Evaluation: A Guide for Out-of-School Time Practitioners (January 2008)
Many after-school programs may want to consider undertaking an outcome evaluation to learn whether changes occur for participants and if these changes are associated with a specific feature of the program. This document reviews what information might be gained from an outcome evaluation, when such an evaluation should be considered and what the basic steps include. This piece provides a useful introduction to the topic, though extensive follow-up reading will be needed before a successful outcome evaluation can be attempted.
Author/Publisher: Allen, Tiffany, and Jacinta Bronte-Tinkew. Child Trends.
Compendium of Assessment and Research Tools (C.A.R.T.) for Measuring Education and Youth Development Outcomes
The Compendium of Assessment and Research Tools (C.A.R.T) is an online database that provides information on instruments that measure aspects of youth development programs. The database offers research tools in specific areas of interest. The instruments are grouped under the following categories: program characteristics, program quality, teachers' role, student engagement, parental/guardian/family support, internal sources of support, external sources of support, school profile, organizational structure, quality of work life, community characteristics, family involvement in school, family characteristics, attention to/awareness of service-learning, national/state/local policies, academic, rationality/connectedness, social role/status, social competence, group interaction/teamwork, school effectiveness, school climate, community involvement, cultural pluralism, student acceptance, school involvement, service-learning program outcomes, reciprocal learning program outcomes, reciprocal learning relationships, social empathy, self-concept, resilience, career exploration/readiness, and citizenship.
Created with support from The Star Center and the W.K. Kellogg Foundation's Learning In Deed Initiative.
Quality Time After School: What Instructors Can Do to Enhance Learning (2007)
This report examines youth experiences in five of Philadelphia's Beacon Centers, which are school-based community centers providing a range of services to all community members and emphasizing after-school opportunities for youth. Grossman et al. found that: 1) group management was one of the most important factors in promoting youth engagement, learning, enjoyment and regular participation; 2) positive adult support was critical to enhancing youth learning and engagement; and 3) the more input or voice participants felt they had to shape an activity, the more engaged they felt and the more they liked the activity. This report provides some concrete suggestions on staff training that are likely to improve programming.
Author/Publisher: Grossman, Jean, Margo Campbell and Becca Raley. Philadelphia: Public/Private Ventures.
The Evaluation Exchange (Fall 2002)
This issue of The Evaluation Exchange examines the use of evaluation for continuous improvement. It incorporates advice from experts, outlines innovative evaluation practices, and provides insights into the evaluations of a wide range of initiatives.
Author/Publisher: Harvard Family Research Project, Vol. VIII, No. 2.
Good Stories Aren't Enough: Becoming Outcomes-Driven in Workforce Development (2006)
This report examines six nonprofit workforce development organizations that have adopted a culture of continuous program improvement. The report looks at how organizations got started and institutionalized the process. It highlights how data can be used and what precautions to take when deciding how to collect information.
Author/Publisher: Miles, Martha A. Philadelphia: Public/Private Ventures.
Using Quality Assessment Tools to Evaluate OST Linkages (Fall 2006)
This article discusses how out-of-school-time (OST) programs are using quality assessment tools to evaluate and promote linkages with families, schools and communities.
Author/Publisher: Westmoreland, Helen. In The Evaluation Exchange, Harvard Family Research Project, Vol. XII Nos. 1 & 2.
Building Quality Improvement Systems: Lessons from Three Emerging Efforts in the Youth-Serving Sector (2007)
The report looks at quality improvement efforts in three large-scale networks: Girls Incorporated, Michigan 21st Century Community Learning Centers, and YouthNet of Greater Kansas City. The report provides information about the kinds of quality improvement processes being designed and implemented, and the consequences of the design choices.
Author/Publisher: Wilson-Ahlstrom, A., & N. Yohalem, with K. Pittman. Washington, DC: The Forum for Youth Investment.
The California After School Program Quality Self Assessment Tool (August 2008)
This tool was developed collaboratively by the California Department of Education and the California Afterschool Network based on research on program quality assessments. The tool can be used by program directors or staff to assess program quality in nine areas: 1) program design and accountability; 2) environment; 3) administration and finance; 4) alignment with the school day; 5) youth development; 6) family involvement; 7) community partnerships; 8) staff development; and 9) provision of diversity and equity. The tool is designed to identify areas for immediate action, stimulate discussion among staff, and lead to an action plan for improvement.
Author/Publisher: California Department of Education and California Afterschool Network.
After-School Initiative's Toolkit for Evaluating Positive Youth Development (2004)
This toolkit includes evaluation questions that after-school program staff could use to assess youth outcomes. The questions cover 45 youth outcomes in the following eight areas: 1) academic success; 2) arts and recreation; 3) community involvement; 4) cultural competency; 5) life skills; 6) positive life choices; 7) positive core values; and 8) sense of self. In addition to questions, the toolkit provides tips on developing and administering surveys.
Author/Publisher: Denver, CO: The Colorado Trust.