
title. Designing and Implementing an Outcomes and Evaluation Framework and System for Hudson Guild

date. 2015 - 2017

city. New York City

Role: Outcomes and Evaluation Manager

Challenge: Hudson Guild is a multi-service community development agency in Manhattan. I was hired as the first full-time staff member to spearhead the design and implementation of an Outcomes Framework for all five departments (Adult Services, Mental Health, Early Childhood, Youth Development and Education, and Arts) and to roll out a data collection system with a centralized database housing outcomes information organization-wide. An Outcomes Framework, or more broadly a Theory of Change, allows the organization to map the goals of each program and collect data to measure progress toward those outcomes. Outcomes, in this context, are defined as the changes we see in target participants' lives that the program intends to contribute to.

Process: I worked with staff across all levels of the agency, from front-line implementers to senior management, to co-create and define appropriate outcomes and corresponding outcome indicators for each program. I provided methodological expertise while they contributed content knowledge. I guided the teams in creating and iterating on data collection tools to capture those outcomes, and in developing methods for gathering information that would inform program improvements. While piloting data collection, I worked with IBM, our software vendor, to configure and enhance the information management system for our programs' needs, and subsequently trained program managers, department heads, and case managers in using the system.

Outcome: The organization approved the outcomes framework I presented, with buy-in from Senior Management, Program Staff, and the Board. We piloted and revised collection tools that capture the outcomes of programs within Youth Development and Education, Arts, and Mental Health. We also rolled out the information management system for the Youth Development and Education Department, capturing a full year of data.

Understanding the Context:

Because my role sat within "Administration" and I operated, in a sense, as my own department, I had to familiarize myself with the programs and operations of the organization I would be designing a new system for. I spent the first two months of my role as a "participant observer" in many of the programs offered: I attended community wellness activities, joined guided tours of art exhibitions, sat in on after-school and early childhood classes, and had lunch with seniors at the senior center. Acting as a participant observer allowed me to become familiar with the activities of each program and how they aligned with its intended goals, and to empathize with the challenges each program faced.

Because I would be implementing a data collection system alongside our new outcomes framework, I also interviewed program managers and department heads to understand their current data collection efforts and challenges. This process helped me understand their staff capacity and how new data collection efforts could be combined with or embedded within existing ones. I also reviewed the data collection tools they were using, including current assessments and reports.

Co-creating and Mapping Outcomes:

Defining outcomes was an ongoing conversation: different stakeholders saw the meaning of "success" in a program differently, and the feasibility of measuring that success also played a role. To begin mapping our outcomes, the corresponding indicators, and the activities aligned with those outcomes, I held workshops with teams within each department and walked them through the process of mapping their frameworks. The workshops asked staff participants to work collaboratively to articulate their program's outcomes and to assess together whether the corresponding outcome indicators met the criteria of being SMART (Specific, Measurable, Attainable, Relevant, and Time-Bound). The products of these conversations were mapped on large posters and 'featured' as part of a walk-through exhibit where staff could review them, comment, give feedback, or ask questions. Outcomes were subsequently revised as we held further discussions with senior management and considered how the relevant data would be gathered.

The working version of the Outcomes Framework Map was presented to the whole organization through a Prezi presentation and other visuals. This allowed each program to see how it fit into the organization's larger goals.

Formalizing Data Collection Tools:

With the outcomes framework in place and a clear sense of what data would reflect programmatic outcomes, I co-designed data collection instruments with program managers. In some cases this meant revising previous surveys, reports, or assessments. In others, it meant searching for validated assessments that were relevant and feasible to implement. We reviewed a compendium to select the most appropriate Social-Emotional Learning assessment for our after-school program, found a screening tool for the Mental Health Clinic that could double as an assessment, and created participant feedback surveys for the Arts program. We also reorganized case management processes and attendance reports to capture indicators that would lead to or reflect outcomes.

After each round of piloting data collection with the programs, we revised the process and tools based on how the process was working for staff, its level of ease, and the time it took. Some departments faced many pain points, which meant we had to rethink the data collection procedures in light of the outcomes. For example, with the housing program, we reduced the indicators we collected to the two that were most basic, telling, and easy to capture.

Configuring the Database, or IMS:

As we continued to formalize and pilot data collection, we needed the centralized database to streamline the process rather than duplicate work. I worked with our partner, IBM, which was deploying the SafetyNet software, to suggest enhancement requests that would smooth the user experience and capture our data needs. Many times this required us to improvise solutions and workarounds. This remained a work in progress as I trained staff on the database and learned of gaps the system needed to address.
