Understanding and Improving Performance Measurement in Private Child and Family Service Agencies

Final Report

Prepared for the Joint Research Fund by Chapin Hall at the University of Chicago and The University of Chicago, School of Social Administration


Acknowledgements This is the final report for the research project, Understanding and Improving Performance Measurement, funded by the Joint Research Fund at Chapin Hall and the University of Chicago. The Joint Research Fund supports projects that take advantage of multidisciplinary perspectives and expertise from both Chapin Hall and The University of Chicago. Joint Research Fund [email protected]

The project investigators and report authors are: Nathaniel Israel, PhD Policy Fellow Chapin Hall at the University of Chicago 1313 East 60th Street Chicago, IL 60637 [email protected] www.chapinhall.org

J. Curtis McMillen, PhD David and Mary Winton Green Professor The University of Chicago School of Social Service Administration 969 E. 60th Street Chicago, IL 60637 [email protected] www.uchicago.edu

The authors would like to thank the informants at the nine agency sites and the research assistants on the project: Dani Adams; Savanna Felix, A.M.; Stephanie Ortega; and Sarah Young.

Suggested citation: Israel, N. & McMillen, J.C. (2017). Understanding and improving performance measurement in private child and family service agencies. Chicago: University of Chicago and Chapin Hall.


Executive Summary This project was developed to understand the developmental and workforce challenges private U.S. child and family service agencies face as they build performance management systems. These agencies have been encouraged to build systems that help them monitor their agency performance and use data to improve performance.

Methods Nine agencies were studied to help understand how agencies were mounting, growing, aligning, and sustaining performance measurement efforts. Case studies were written for each agency, based on agency visits, key informant interviews, and reviews of agency documents. We sought agencies of various sizes that were at different stages of maturity in their performance measurement programming. Fourteen volunteer agencies, recruited through personal contacts and emails from associations that serve these agencies, were screened for inclusion. We completed the case study process at nine agencies, conducting 77 informant interviews with some 80+ agency staff. Some of these agencies were in the Midwest region of the U.S. Others were from the U.S. West Coast. Most analyses to date were conducted on the written cases, although all 94 interview transcripts have been coded for later analyses. This final report contains all nine case studies. Pseudonyms are used for agency names and sometimes for agency systems.

Findings All agencies were struggling to achieve their desired ends in performance management, especially the ability to improve client outcomes. But most had developed performance management processes in which they had confidence and that met important agency needs. While the agencies faced common challenges in building out performance management, the nine agencies' performance management efforts at this snapshot in time looked very different from one another. Foundational decisions driven by immediate internal and external demands shaped the trajectory of each agency's performance management work. These decisions included whether to invest in people, data systems, or both; whether to build their own data systems or rely on vendor database solutions; and whether to focus initially on compliance, payments, employee processes, outputs, or outcomes. These foundational decisions led to new challenges that agencies had to address, often to balance earlier decisions.

Challenges that arose across sites included:

- how best to invest in data infrastructure,
- increase buy-in to performance measurement by increasing program fit and reducing burden,
- ways to get fresher data out of systems, and
- ways to use data to “move the dial” on important outcomes.

The case studies identified numerous strategies agencies were using to address common performance management challenges, including ways to train performance management staff, systematize performance management activities, develop agency-specific data solutions, and drive performance improvement. Properly catalogued, these strategies may provide an important resource for agencies. The study also identified developmental triggers for performance management changes, related to changes in bureaucratic oversight, lost contracts, lost monies, and changes at the board level. Some agencies were able to leverage special funds to improve their performance management functions, but most struggled to finance the unfunded mandate of performance management from the thin margins of government service contracts. Several also shared a common vision of performance management: the ability to call up on demand, from their desktops, fresh if not real-time data on their most important metrics, filtered as needed by key variables such as program and site.

Continuing Work. A rich qualitative data set has been created to enhance continuing work. We will further analyze the data for publication and strive to better understand the implications of the findings. Policy experts have been engaged to provide ongoing input on next steps. A proposed policy summit will proceed with other funds after we have a better understanding of the policy levers at play. A briefer version of this report will be generated in the near future for wider dissemination. What agencies might find most helpful is a performance management road map that helps them understand the implications of key decisions in the maturation of their capabilities in this area. This could facilitate technical assistance to agencies.


Four Regions Child and Family Services AGENCY DESCRIPTION When a lone pastor decided 120 years ago to lead local efforts to help children, he could not have imagined the statewide Four Regions (4Reg) Child and Family Services agency as it is today. From its headquarters in an urban glass office tower, its senior administrators lead four regional administrative teams serving sites in 40 different counties with 800 employees.

PROGRAMS 4Reg runs several large lines of service across the state:

- foster care programs – traditional, relative, and specialized for medical and emotional and behavioral health,
- early center-based childhood education, such as Head Start programs,
- in-home services to families referred for reasons of child abuse and neglect, such as family preservation programs, and
- in-home visitation programs for families at risk of abuse and neglect, including a doula program, Healthy Families America, Parents as Teachers, and Nurse Family Partnerships.

It also runs a variety of community-based programs that are site-specific, such as community schools, parent education classes, and more. Its website notes 70 programs in total.

CASE STUDY PROCESSES Two interviewers (McMillen and Park) visited the 4Reg headquarters on two days, interviewed a total of 15 employees, and observed two meetings where data was the topic of discussion. Some of the employees were interviewed together: two regional directors, the COO and CEO, and program analysts with their supervisor. A team of seven administrators participated in the debriefing session, which occurred a few months following the interviews.

EXTERNAL FACTORS 4Reg operates in a state with performance-based contracting for some programs and an emphasis on data, including online dashboards operated by the state for two service lines. Their largest program, foster care, is rated monthly by the state on a scale from 1 (best) to 4 (worst and in danger of losing the contract). This rating system is moving to new program lines soon. “I was at a meeting the first time they passed out the performance list by rank and all the CEOs saw where they fit against the other organizations in the state,” said one senior administrator. “That’s when things started to change.” This external influence is a major driver of efforts at the agency. The agency also has a history of board involvement in looking at data, using data, and encouraging 4Reg to become more sophisticated in its databases and uses of data. While other forces – accreditors, United Way, etc. – also apply pressure for some level of accountability, these were the two external factors most frequently mentioned during our interviews with staff.

DEVELOPMENT OF PERFORMANCE MEASUREMENT CAPACITY 4Reg as an agency has a unique history of purposefully targeting the development of its performance measurement capacity and using that capacity as part of its branding and marketing to funders. This initiative began about seven years ago. “Before that, we collected a butt load of data,” said one administrator, “but we didn’t do anything with it.” What was done with data in the past also had something of a punitive focus, using poor outcomes and peer reviews to punish poor performance. The agency did have a number of quality processes in place, including peer reviews, operating under a Director of Quality.

The push at 4Reg to develop greater performance measurement capacity came from a board member who encouraged this path and supported it with two donations of $100,000. The board member then had some of the information systems staff from her company look at the agency’s data structure and help interview for a Director of Information Systems. The agency signaled to staff a new era of commitment to using data through an agency data summit, an event still widely discussed among program directors. At that time, the expressed performance measurement ideal for many was a drillable dashboard on the desktop computer of every employee, targeted to their program and level of responsibility. “When I first started here, that was the dream,” said the Director of Information Systems. “Maybe because of me that hasn’t happened. It is a huge amount of effort for a huge amount of cost. I don’t think our social service agencies have the data that makes it easy to do something like that.”

Instead, the agency developed and branded a process of determining key indicators to measure in each of the agency’s program lines and then beginning to collect data using those measures. In most cases, this involved an examination of the research literature related to these services, using this literature and iterative conversations with program staff to build logic models behind their interventions, and then collaborating with agency staff to determine what to measure based on these logic models. They hired master’s level staff with data familiarity to conduct this work, developed a four-phase diagram of this process, and created a name for it – Awesome Results Methodologies, or ARM¹ – now widely used internally and in their external marketing.

Even with these processes in full operation, the agency staff we interviewed realized that 4Reg was still not using data to improve services and move outcomes. Administrators searched for a known method they could apply to their agency and decided to use a process used in Wisconsin (KidStat) and in Colorado (C-Stat) in children’s programming. It is based on the use of community policing data pioneered in east coast cities and shown on the TV series The Wire. They sent two staff members out of state to observe the process and bring it back to 4Reg. They have now branded this process as well, as MaxIt!¹ MaxIt! was designed to fix several problems: 1) To limit focus. In this process, “the maximum number of metrics is four, not 104, which is what we had on some of these programs,” said one administrator. 2) To involve more people in problem solving. “It isn’t the program director’s job to figure out how to make it better. It is everyone’s,” said the same administrator. 3) To “move the dial.” The agency, several people told us, is still waiting for the meeting that shows they have changed an important outcome through this process.

PERSONNEL The 4Reg quality team consists of a Vice President, who also oversees HR and other agency functions; a Director of Information Systems, who has a background in database work from the corporate sector; three master’s-level Data Analysts for the program side of the agency plus one for financial matters; a Director of Quality, who has been with the agency for many years; a Programmer Analyst with an MS in data architecture; and five Quality Associates. The COO is also very involved in many of these efforts, especially the MaxIt! process. The Data Analysts and the Programmer Analyst are assigned databases for which they have primary responsibility and programs for which they have ARM and MaxIt! responsibilities. The Data Analysts are master’s level professionals from varied degree programs. When asked what the agency looks for in hiring for the Data Analyst position, one manager said: “a business analyst mindset,” someone “extremely bright” who “can pick up on things,” with “a passion for social service work,” and “someone who has worked in some sort of a basic data system.” The Programmer Analyst position is filled by a person with a master’s degree in data architecture. It was explained that this person was not trained as a programmer, but is learning programming along the way.

STRUCTURES AND PROCESSES DATABASES. The agency maintains multiple home-grown databases. One is a SQL Server “outcomes management system” for two large programs. Since it tracks placements for children in out-of-home care, “it has a crucial role in billing capacity for us,” said one member of the data team, but it is also used for demographic reporting. 4Reg also developed and maintains a database that contains information on peer record reviews and other information related to quality improvement. Data analysts are assigned to each of these home-grown systems and are responsible for maintaining them and creating the many reports that are useful to program staff as well as to the larger data efforts at the agency, like MaxIt!

¹ A pseudonym.


4Reg’s data team pulls information from at least six proprietary, program-specific vendor database programs, such as Child Plus, Teaching Strategies Gold, and Healthy Families America. When possible, the 4Reg data team pulls data out of these systems and pushes it into a web-based database solution called ETO, from the database vendor Social Solutions. “We had a grand plan of putting all of this data into ETO. That was our grand plan. It ended up becoming really complex pretty quickly,” said the agency’s Director of Information Systems. They ended up limiting the “pull and push” into ETO to one line of programming. The agency also contributes data to three systems maintained by state funders. They receive reports from these systems (for two programs, in the form of a drillable online dashboard), but have no capacity to pull data from these systems on their own to analyze or, as they would prefer, to merge into the databases they maintain. These efforts to pull data out and push it into other systems are not yet automated. “There is no automation at all,” said the Director of Information Systems. “It is a manual ‘Oh, crap! It’s time to go back into the data’ kind of system.”
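The manual “pull and push” described above is essentially a small extract-and-load job. As a hedged illustration only, the sketch below shows what a minimal scheduled version of that step could look like, using hypothetical nightly CSV exports and a local SQLite staging table; it does not use the real ETO or vendor interfaces, which are not documented in this report, and all file and column names are invented.

```python
# Illustrative sketch only: a minimal nightly "pull and push" job using
# hypothetical CSV exports and a local SQLite staging table. It does NOT
# use the real ETO or vendor APIs, which are not described in this report.
import csv
import sqlite3
from pathlib import Path

EXPORT_DIR = Path("vendor_exports")   # hypothetical folder of nightly CSV exports
STAGING_DB = "staging.db"             # hypothetical staging database

def load_export(conn: sqlite3.Connection, csv_path: Path) -> int:
    """Append one vendor export file into a shared staging table."""
    with csv_path.open(newline="", encoding="utf-8") as f:
        rows = [
            (csv_path.stem, r.get("client_id", ""), r.get("service_date", ""), r.get("outcome", ""))
            for r in csv.DictReader(f)
        ]
    conn.executemany(
        "INSERT INTO staging (source, client_id, service_date, outcome) VALUES (?, ?, ?, ?)",
        rows,
    )
    return len(rows)

def run_nightly_load() -> None:
    conn = sqlite3.connect(STAGING_DB)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging "
        "(source TEXT, client_id TEXT, service_date TEXT, outcome TEXT)"
    )
    for export in sorted(EXPORT_DIR.glob("*.csv")):
        print(f"{export.name}: loaded {load_export(conn, export)} rows")
    conn.commit()
    conn.close()

if __name__ == "__main__":
    run_nightly_load()  # could be scheduled (cron, Task Scheduler) rather than run by hand
```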

PEER RECORD REVIEWS. The agency regularly conducts peer record reviews, led by the Quality Associates under the direction of the Director of Quality. The results from these peer reviews are entered into the designated database. The agency pointed with some pride to documents that showed that they had reduced the number of deficient items on their peer review forms.

RISK MANAGEMENT REVIEWS. 4Reg considers their Risk Management Review meetings part of their performance measurement effort. The VP for Quality conducts these reviews when there is an “unusual incident” – usually a tragedy such as a client death. Each review includes a record review and a meeting of administrative and program staff.

SURVEYS. Client and family surveys are conducted in SurveyMonkey. They are not imported or merged at this time with other agency data sources. They were not a focus of much conversation in our interviews.

ARM PROCESS. The ARM process as designed moves in four phases. 1) In the first phase, a data analyst looks to the research literature on a service line and then engages program staff in discussions about how their program is intended to work. They have explicit conversations about theories of change operating in the program, create a logic model, and then try to examine what outcomes flow from that model. The product is a program definition. They also examine what data is being captured. At the time of our visit, 12 of 15 programs planned for ARM had completed phase one.


2) In phase two, the program staff and analyst begin to think through what data could be captured and then negotiate with program staff on issues like rating scales, burden of data collection, and so on. This phase was described to us as lasting from 2 to 6 months, often depending on how independently the service lines had been operating across regions. The product is a measurement plan.

3) In phase three, 4Reg programs work to build out the phase two plan. As one Data Analyst described it, they tackle the following kinds of issues: “How are we going to design the system? How are we going to design data collection if we are doing a new instrument? What kind of training do we need?” This informant described the efforts in a recent program whose phase two plan required large changes. “They moved from paper billing to electronic billing. They trained 100 people in ETO. That was a big, big project. They involved a lot more people than they had previously.” “There was a dashboard. They did some changes to the dashboard to make it more user friendly for the staff.” As an example of a more typical effort, she described a program that began using goal attainment scaling, training the team on that, and using ETO to enter some new data. “It’s a pretty lean ETO build out. It’s doing case notes, one intake form, one closing form, and the goal attainment scale.”

4) In phase four, they collect data and begin analyzing it, often using the MaxIt! processes.

Program staff report they were wary of the process before it started, but have been impressed with the respectful nature of the interactions. One regional administrator said it was not fun, but “It's a necessary process that we go through these first phases, to define our intervention and what we want to accomplish and what our measurements are and all of that. It's very important that we went through the process.” Some informants pointed to a counseling program as one where the process created a mind shift. “It was an ‘Aha’ moment for a lot of the clinicians. They couldn't sit there and say, ‘Well, I'm spending an hour with Susie once a week, so clearly, she's doing better.’ It really forced that group of people to be much more scientific about the work that they did and how they knew if they were helping people or not.” But the change from the ARM process most mentioned across program staff was the streamlining of many peer review forms. “In some cases, these forms were 18 pages long. I am not exaggerating!” Staff across program and administrative levels recognized that the pay-off for ARM would not come until they were using data to make programs better. As one regional director said, “So what? We have all this information now. What are we going to do with it? [MaxIt!] has given us that next step.”
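Phase three mentions training a team on goal attainment scaling. The report does not describe how 4Reg scores it, so the sketch below is only a point of reference: the conventional Kiresuk–Sherman goal attainment T-score, with illustrative ratings, default equal weights, and the customary inter-goal correlation of 0.3.

```python
# Sketch of the conventional Kiresuk-Sherman goal attainment T-score.
# Ratings use the usual -2..+2 scale; weights and the inter-goal correlation
# rho (0.3 by convention) are illustrative, not 4Reg's actual values.
from math import sqrt

def gas_t_score(ratings, weights=None, rho=0.3):
    if weights is None:
        weights = [1.0] * len(ratings)      # equal weighting by default
    wx = sum(w * x for w, x in zip(weights, ratings))
    denom = sqrt((1 - rho) * sum(w * w for w in weights) + rho * sum(weights) ** 2)
    return 50 + 10 * wx / denom             # 50 = attainment exactly as expected across goals

# Example: one goal met as expected (0), one exceeded (+1), one fallen short (-1)
print(round(gas_t_score([0, +1, -1]), 1))   # 50.0, since the gain and shortfall cancel
```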

MAXIT! We observed one MaxIt! meeting. A program director ran the meeting, in that he was responsible for presenting the data that was discussed. Other program staff, other program directors, data analysts, the 4Reg Director of Information Systems, and the COO were present. The CEO appeared later. Eventually, the plan is that consumers and more staff will be invited, but 4Reg considers the process fragile at the moment. The process we observed was largely the presentation of data on four key metrics, with questions being posed by the rest of the participants. Some were process questions: “What happens when…” Some were personnel questions: “Does it matter who is on duty?” Some were questions about the availability of data. In general, the assembled members were digesting the data and looking for it to tell them what to do next, but the data – metrics over time – were not designed to do that.

In the MaxIt! meeting we observed, the program in question had not yet “moved the dial” on its key performance metrics, and as a result, the program director looked nervous as he presented a flat line on his metrics. He was forced to explain why the metric had not improved despite this being the focus of a prior meeting. We asked a separate program director about her recent first MaxIt! meeting. “I was nervous,” she said. “But it was the best meeting of my life.” She felt supported by the administrators and staff present, she explained, and energized that others had assembled to help her solve program problems.

Some of the data requests we observed seemed ill-matched to the data available. For example, one person asked to see adverse event data for each configuration of staff members working a shift. But the program employed many shift workers working in a large number of possible combinations. To the observers, this request seemed infeasible, but the data staff in the meeting said they would look at it and report back. The Director of Information Systems said after the meeting we observed that the goal of the meetings at this time is to get people interested in data and to get good data in front of them; later, they will work on training those present to ask better questions of the data.

COMMITTEE STRUCTURE. 4Reg uses a two-tier committee structure for its performance measurement work. Quality Committees organized by service line review data from quarterly reports and discuss the need for, and progress of, new practices. Chairs of these committees serve on an Agency Quality Council, which reports to the executive leadership team.

REPORTS. The quality team produces an agency-wide quarterly report with a large number of components. They include sections for the number of clients served, measures used for the MaxIt! process, output from peer record reviews, current corrective action plans in operation, unusual incident reports, ongoing risk management issues, ongoing research projects, fiscal performance, and staff turnover. The report we saw for the last quarter was 15 pages in length. It used ample color, with green-yellow-red used to designate performance levels. No data was charted over time.
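The quarterly report colors each indicator green, yellow, or red but charts nothing over time. As an illustration with invented thresholds and invented quarterly values, the sketch below shows how the same indicator could carry both a color status and a simple direction of trend.

```python
# Sketch with invented thresholds and data: assign a green/yellow/red status
# to the most recent quarter and report the direction of the trend, rather
# than showing a single point-in-time color.
def rag_status(value, green_at=0.90, yellow_at=0.80):
    if value >= green_at:
        return "green"
    return "yellow" if value >= yellow_at else "red"

def trend(values):
    if len(values) < 2 or values[-1] == values[-2]:
        return "flat"
    return "improving" if values[-1] > values[-2] else "declining"

# Hypothetical quarterly values for one metric, e.g. "visits completed on time"
quarterly = [0.78, 0.81, 0.84, 0.83]
print(rag_status(quarterly[-1]), trend(quarterly))   # -> yellow declining
```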

STRENGTHS AND OPPORTUNITIES 4Reg operates with an abundance of strengths in its performance measurement efforts. Three are especially prominent. 1) It has serious buy-in from the executive leadership, which expects results from its investment in data structures and personnel. The CEO and COO are active cheerleaders for performance measurement. They are so confident that they are innovative leaders in this area that they have branded and trademarked their processes. The resulting pressure keeps performance metrics in the forefront among competing demands. 2) 4Reg has invested in different types of personnel: importing from the corporate world a Director of Information Systems, hiring a cadre of bright, data-savvy folks they call Data Analysts, and hiring a person with a background in data architecture. This group has an outstanding knowledge of their data structures, even if they cannot always use the data in the ways they hope.


3) The agency has done serious, respectful work in aligning data with program logics, but doing so has been a large effort involving lots of time, retraining, and data system investment. 4Reg has an opportunity to lead the field in innovative uses of data if it can craft workable processes for its MaxIt! meetings. In our view, they need to build out the process by finding effective cheerleading statements, cataloguing effective ways to interrogate the data presented, and tying the MaxIt! meetings more tightly to formal process improvement efforts. These meetings may need more lower-tier staff involved, to build improvements around street-level knowledge of agency processes. But right now, the pressure is on both program and performance measurement staff to show that they can use data to move outcomes. The agency administrators want a big win, to keep momentum up and resources flowing into their performance measurement efforts.


Grand Lakes Agency for Social Services (GLASS) INTRODUCTION The three sites visited at Grand Lakes Agency for Social Services – out of a total of 40 in the agency – were physically quite different. One building was a slick, revitalized corporate center, perched near downtown but still clearly in the lower-rent district. Another was standard office park fare, with shrubbery hiding it from the nearby interstate. The third had some quaintness, some softness, to it, a small oasis in a distressed town. The attire of the staff at GLASS was equally varied. One administrator wore a power suit. Another wore jeans. Maybe the character of GLASS is too large, varied, and complex to quickly characterize.

AGENCY DESCRIPTION Although its main goal is “serving more clients,” GLASS is already one of the largest nonprofit service providers in its state, with interests in skilled nursing and senior care as well as a full range of child and family services. It is a large agency, with 2,000 employees and an annual budget near $100 million. According to federal tax forms, it raises over $10 million annually in private money. GLASS grew out of the service work of a Protestant denomination, incorporating in the 1930s. To this date, GLASS continues this religious affiliation, although agency administrators worry that it is a limiting factor in its commitment to growth.

PROGRAMS. GLASS provides services across the life span and multiple service sectors, including skilled nursing care, assisted living, home health, criminal justice re-entry programming, housing programs, and more. In his case study interviews, Dr. McMillen asked about and visited only programs related to children and family services. These services include foster care, independent living programming, services for unaccompanied minors, adoption from state care, and home-based services for families reported for maltreatment.


CASE STUDY PROCESS Dr. McMillen visited GLASS offices in three cities over two days. He interviewed seven GLASS employees. Three of the seven were requested by the study team and four were chosen as informants by GLASS’s Director of Quality. An exit interview with three of these interviewees was conducted on site before Dr. McMillen left the headquarters.

EXTERNAL FACTORS Grand Lakes’ informants reported external pressure for performance measurement numbers from other sectors, but in “child welfare, there’s very little pressure there externally,” a senior administrator said. With child welfare, it’s about reputation, with programs judged on questions like the following: “Are you responsive to me? Do you treat your clients well? Do you do your job? Do you take the kids when I call you? Do you get your reports in on time? Are you a good partner to me and help me out when I need?” The state has no performance-based contracting and is not regularly looking at outcomes. Instead, GLASS has seen a greater emphasis on performance management from private funding agencies. Their private funders want information about outcomes, whereas their state funder wants outputs. Their accreditor, the Council on Accreditation (COA), was not seen as an influence on how Grand Lakes does performance management. “We use COA as a guide,” one employee said. “And we are COA accredited. We don't have a problem with how COA does things, but I guess we've done it for so long. I've moved beyond it.”

INTERNAL FACTORS The drive to increase performance measurement capacity at Grand Lakes is internal. “The culture of our organization is growth and opportunity,” said a senior administrator. It has a board-led initiative on “leveraging growth.” “But the other part of our culture is that we hold people accountable to fiscal management.” The focus on financial accountability was pervasive in answers to numerous types of interview questions. When asked for instances when data had been used, the examples were often financial. When asked to prioritize the kind of data most examined, the answer was financial. When asked about the metrics that were used to evaluate programs, the answers were financial. A program director told of a time when she noted in her data that a number of cases were preparing for termination. “I pulled in my program manager…What I can project is that in three to four months, these numbers are going to drop off our census, which is our budget, because we get paid per diem per child. We sat down and looked at the numbers together.” They came up with a client recruitment plan to ensure new referrals would be forthcoming.


This is from a different program director.

Interviewer: Can you tell me the story of the last time you went to [your data system] to find out something?

Program Director: It provides me access really quick to my financial status, to really know how I am producing. We were providing the state some information about where we were in our spending and I said, “Oh!” We had to provide an estimated bill for the month and when I went to Harmony … I realized I had already spent the money that was there.

This is from a different program director.

Interviewer: What are the metrics your programs are evaluated on?

Program manager: Financial goals probably come in number one in my seat, second is quality.

This is from an agency administrator.

Interviewer: In your job, on a daily, weekly basis, what data are you looking at?

Administrator: I’d say most of my data is data that relates to financial performance of the division, first. Second, I would say compliance… I am looking at things like our daily census. How do our budgeted number of children versus actual [numbers] for each center across the state.

DEVELOPMENT OF PERFORMANCE MEASUREMENT CAPACITY Little was revealed about turning points in the development of performance measurement capacity at Grand Lakes. It was described as “a gradual evolution” by one member of the performance measurement team. Prior agency Presidents and CEOs were viewed as champions of performance measurement by some employees, at least in terms of garnering resources to build out a performance measurement team. Grand Lakes had a long-serving Vice President for Quality, according to multiple sources, and during this person’s tenure the emphasis was on collecting data and meeting accreditation standards. One agency administrator said that upon this person’s retirement, they sought “an upgrade,” a person who could create more of a vision for performance measurement efforts. This person, however, was not yet an expert on performance measurement, but was hired for intellectual acumen. Another factor in the development of Grand Lakes’ performance measurement strategies comes from its involvement in the health and senior services sector. “It’s the naughtiest thing to say,” said one administrator. “But we are not learning it from the child welfare world… I hate to say it, but sometimes fields that aren’t driven by social workers are a little more advanced in the data world.” Some administrators started attending a Health Information Management conference, at the suggestion of an IT director who had worked in health care. “They have some pretty advanced things,” this administrator said. This has contributed to a current focus on “data integration,” “business analytics,” and “business intelligence.” Grand Lakes’ President is ready to move forward fast with improvements in performance measurement and lists four main goals: fresh data, data at the fingertips, integrated data, and predictive data. The President’s thoughts on each are below.


Fresh data: “All of the data we have is stale. My financials are completed the 18th day of the following month, so I’m looking back 20 days later. My quality data is on the quarter.”

Data at the fingertips: “The problem for our team is that our staff do not have data at their fingertips to make decisions.” The reports I get, the President said, “are manual. They come to me weekly from my VP, but they are manual reports that they track and update.” Someone has to “attach it as an email and send it. Hopefully, as we move toward what we think is business intelligence, those reports will become automated and our key performance indicators will be in some dashboard.”

Integrated data: “We just had a twenty team meeting about the fact that my HR data is not integrated with my financial data. I have to build a bridge by IT to get that data to match.” “If I wanted it every day, I could get a report on how many kids I have in care. But I would never know – if I have 100 kids in care, do I have 100 workers too? It doesn’t match. It isn’t integrated.”

Predictive data: “Ultimately, I hope to have data that would actually be before the outcome, so we could have some predictions that we are moving in the wrong direction, or need to apply more pressure, or so I can move my resources someplace else.” The President wants indicators “of progress and probability.”

GLASS has invested in a software system – Logi Analytics (www.logianalytics.com) – designed to serve as an integrative data tool across programs and to bring drillable dashboards across programs to staff at multiple levels. Senior staff have heard about both the vision and Logi Analytics and are excited. One administrator became animated as he talked about the data he might be getting. “I’d love to have something that gives me real-time data and a summary of how my programs are performing in key metrics: financial, compliance and outcome related. That would be daily, real time.” He continued, “Even better, if I could drill down to the staff level that would be my dream… Let’s pretend one of my key metrics is length of stay. Our goal is to average no more than nine months’ length of stay in out-of-home care. So I know, as a Division, how we’re doing. If I see that number start to go off track, well what if I could drill down and see instantly how each of my centers is performing so I know who is dragging down the average. I could focus on that center and drill down even more and say which supervisor team is driving down that average. I could pinpoint where the greatest amount of action and correction needed or support is.”

This person also looked forward to the predictive portion of the vision. “We’re starting to talk about predictive analytics in child welfare. Could the dashboard tell me, based on how we are trending, ‘You are going to end up missing this outcome or this budget target or this compliance requirement if you don’t do something right away.’”

The vision of getting business analytic products at the fingertips of staff was not yet on track at the time of the site visit. Agency data personnel were having trouble getting data out of their many databases and into Logi Analytics. A date several months later was set as the roll-out of Logi Analytics in non-child-welfare services.
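The drill-down the administrator describes (division average length of stay, then by center, then by supervisor team) is, at bottom, a grouped aggregation over client-level records. The sketch below illustrates that query with fabricated cases and invented center names; it is not GLASS's data model or the Logi Analytics product.

```python
# Sketch of the drill-down the administrator describes, using fabricated
# records: overall average length of stay, then the same average by center,
# to see which center is pulling the division's number off target.
from collections import defaultdict
from statistics import mean

cases = [  # hypothetical closed cases: (center, supervisor_team, length_of_stay_months)
    ("North", "Team A", 7), ("North", "Team B", 8),
    ("South", "Team C", 13), ("South", "Team C", 12), ("South", "Team D", 9),
]

overall = mean(months for _, _, months in cases)
by_center = defaultdict(list)
for center, _, months in cases:
    by_center[center].append(months)

print(f"Division average: {overall:.1f} months (target: 9.0)")
for center, values in sorted(by_center.items()):
    print(f"  {center}: {mean(values):.1f} months over {len(values)} cases")
```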

STRUCTURES AND PROCESSES


PERSONNEL. GLASS organizes its performance management function along three service lines: senior services, children and families, and home and community. A Director of Quality for GLASS reports to the President. “My focus is more about organizational effectiveness,” this person told me. “My job responsibilities are the obvious things like accreditation and privacy and corporate compliance officer and all of those things. But it’s also about helping the programs and support services integrate better to get goals met, which means integrating their data, which means integrating the ways they work with each other and communicate with each other.” Three Directors of Quality for different service lines report to the GLASS Director of Quality. Together, they compose a quality steering team. Each service line quality director also reports to a program vice president. Each of the quality directors in a service line has people who report to them. “Not very many, but they have people who work for them.” They may conduct chart reviews, do data entry, or conduct surveys. Altogether, the agency budgets $1.3 million annually for the performance measurement function. The Children’s Services line has a data manager who performs work solely for this line of services but is considered part of the IT team. She was highly valued for her ability to tweak the system as needed, provide the reports most desired, and train staff on how best to use the system.

REPORTS. The service line quality directors create quarterly Performance Management Plans, or PMPs, that are, in effect, quarterly report cards on specific programs. The VP for Children’s Services, for example, receives 21 PMPs per quarter, all at about the same time, from the Quality Director in his line.

MEETINGS. GLASS has instituted monthly meetings – QAPI meetings – where data is discussed. This is an attempt to move toward the use of fresher data (from quarterly to monthly). QAPI stands for Quality Assurance and Performance Improvement, an idea and acronym borrowed from the U.S. Centers for Medicare and Medicaid Services that was designed for skilled nursing facilities. The meetings had been in use at GLASS’s senior care facilities and were recently imported into its child welfare division. The hope is that the quarterly reports will soon be monthly and reviewed at each monthly QAPI meeting for each service.

DATABASES. The Children’s Services programs use a vendor-provided product called Harmony, which houses client-level data. The child welfare staff liked this system and were distressed that the state’s efforts to implement a new SACWIS (Statewide Automated Child Welfare Information System) had de-emphasized Harmony. Here a program manager describes how life used to be with Harmony and how it is now with SACWIS. “What I used to have in my data was a three-page report that said, here’s the sup[ervisor]. Here’s the case worker. Here are their 15 cases.” “Now, I pull a case load report and I get out a 200-page thing that is broken down by worker and then birth parent.”


Then, she described how worker reports are routed through the two systems. “They enter their report to have their supervisor review that report and approve it. It becomes five different routings and two different times you have to print. If the supervisor clicks the wrong thing, the whole thing can close. It is just what used to be a click approval in our system and print so we can submit the report; it now takes two days.” Not only was SACWIS considered more burdensome than their prior system, but also less accurate. Not even the state trusts its data. “We are being bombarded with spreadsheets from the state, saying ‘SACWIS says this. What does your data say?’” Finally, the SACWIS system is not delivering on its promise to provide comparative data across providing agencies. “That’s the crux,” said one program director. “I don’t know where I stand to my partners. We’ve asked this from the state. I’d like to be able to go, ‘Hey, let’s look at this and see how we use this data in a different way. So, what is this agency doing? Who has the same contract…’ Like I said, I feed information to the state, but very little circles back.”

Therefore, GLASS staff were often double-entering data, into Harmony and into SACWIS. Typically, a case worker would complete a form, and then support staff would enter the data into both Harmony and the state system. Ultimately, GLASS wants data in SACWIS and other children’s services databases to feed into the Logi Analytics software to provide workers with dashboards at their fingertips. But so far, GLASS is struggling to get any meaningful data out of SACWIS, let alone integrate it with other systems.
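The state’s “SACWIS says this. What does your data say?” spreadsheets imply a basic reconciliation between the agency’s system and the state’s. The sketch below shows that comparison with hypothetical client-ID sets; real Harmony and SACWIS extracts would have their own identifiers and fields.

```python
# Sketch of a basic reconciliation between two systems, using hypothetical
# client-ID extracts; real Harmony and SACWIS extracts would of course have
# different fields and identifiers.
harmony_ids = {"C101", "C102", "C103", "C105"}   # clients open in the agency system
sacwis_ids = {"C101", "C102", "C104", "C105"}    # clients the state system shows as open

only_in_harmony = harmony_ids - sacwis_ids
only_in_sacwis = sacwis_ids - harmony_ids

print(f"Open in both systems: {len(harmony_ids & sacwis_ids)}")
print(f"In agency data but not SACWIS: {sorted(only_in_harmony)}")
print(f"In SACWIS but not agency data: {sorted(only_in_sacwis)}")
```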

FUNCTIONS. The Director of Quality in the children’s services line is responsible for peer record reviews, annual surveys, and the use of administrative data to create PMPs for the programs she serves. Peer record reviews were described variously. “Great tools,” said a program director, but “sometimes they hinder.” She went on to describe a state-required peer record review form that was 15 pages long. A second program director had the same complaint and shared a state-required form with over 250 items. The Quality Director in the children’s service line conducts regular surveys of a large number of stakeholders: foster parents, birth parents, independent living clients, and workers who refer clients to GLASS services. These used to be done on an annual basis, but in recent years they were timed to service milestones to increase utility; for example, right after discharge or right after a foster home is certified. When surveys are not anonymous and results are very positive, the Quality Director will notify a program director that a specific worker in a specific service received high marks from a client or foster parent so this person can be congratulated for good work. They also started surveying birth parents in the office instead of via mail or online to increase response rates. Current response rates on surveys were 20–30%, which one employee called “fabulous” because these rates were much higher than in previous years. There were a few examples of using the quality directors to fix problems. A Vice President said his first action when he recognizes a problem is to send in his Director of Quality. This is, he said, not just because she has skills related to quality, “but as a clinician and because she has been in the field a long time.” He trusts her.


This director provided an example of one program fix. The agency had received a new contract and the requirements were complex. “It was a nightmare. No one understood what it was. It didn’t make sense. … [They] gave you this training and it was two pages of policy and this PowerPoint training that you have to read through. Who’s going to remember that? There are 50 things. So we created a guide that talked about what they need to do and a checklist for certain things.” This person reported that checklists were a common fix. “We put down every step of what needs to be in this report, what needs to be in this packet, whether it is an adoption packet or a licensing packet. It really is a checklist. It has a little box here. It has a list of what needs to be done.”

STRENGTHS AND OPPORTUNITIES GLASS has committed to building a performance measurement system that works for it and has invested substantial money and effort into these efforts. Despite low external pressure for such systems, it envisions a high-level business intelligence system that integrates data across functions and brings fresh data to the fingertips of GLASS administrators, program directors, and supervisors. It is thinking big. It has also systematized a number of important performance measurement processes. It has quarterly reports on performance by program. It also envisions new ways – through its QAPI meetings – to bring people together to use data. Finally, it has some history of using its best staff to address problems, not just identify them.

It is not clear, however, how and when GLASS will be able to implement its performance measurement vision. It is working to build out this system with in-house staff who have not previously implemented such a thing. In the corporate world, this is the kind of implementation for which big consulting firms are contracted. There are many ways that current efforts can be bolstered. GLASS is not effectively using visuals with data over long time frames to identify trends. It does not yet have a system to tie identified problems to improvement efforts. GLASS also needs to move beyond a focus on financial and compliance indicators to a focus on outcomes and quality service. Finally, many of its staff want GLASS to hire a data expert to work exclusively with the SACWIS system to help them get useful data in and out of this statewide system.


Helping Our Youth Today AGENCY DESCRIPTION There was a time when social institutions were built outside of city centers, seeking the healing properties of pastoral living. That explains why the lake view from the conference room at the exurban headquarters of Helping Our Youth Today (HOYT) would make a housing developer salivate. From this peaceful setting, HOYT operates child and family service programming in almost 40 sites across four states. Its origins lie in two turn-of-the-century religiously based orphanages that merged some 80 years later, with rapid geographic and program expansion in the 1990s and 2000s through state and federal contracts.

PROGRAMS HOYT’s largest contracts are for foster care case management in multiple regions of two states. It also operates:

- the former orphanage facilities as residential treatment facilities,
- over 10 early child education centers,
- in-home, intact family services for families from the child welfare system in three states,
- adoption programs, and
- a large number of counseling and community based programs.

HOYT is proud to run programs using named evidence-based protocols including Multisystemic Therapy (MST) and Functional Family Therapy.

CASE STUDY PROCESSES Dr. McMillen visited two of HOYT’s campuses, in two different regions of one state. He spoke with 11 staff members individually, some by phone, and held a debriefing meeting with four staff members at the end of the second day of visits. Four of the interviews were with individuals requested by the research team. HOYT’s Vice President for Quality chose the other individuals to be interviewed. They included program and Performance and Continuous Quality Improvement (PCQI) staff. Interviews with staff members from other states were conducted by phone.

EXTERNAL FACTORS Several members of the PCQI team named the Council on Accreditation (COA) as an important shaper of their performance measurement efforts and said that they have worked to stay up to date with COA’s changing expectations. On a more regular basis, state child welfare authorities appear to place the most pressure on HOYT to prove performance. HOYT lost a foster care contract three years ago in one state, and that loss served as something of a wake-up call, at least for the agency’s board of directors. “That was a turning point,” said an agency administrator, “because I don't think the board at that point and time had recognized that we weren't tracking as well as they thought we were in [that program].” In another state, monitors from the child welfare authority routinely meet with providers to go over performance indicators. It is an agency priority to improve those numbers to ensure that these contracts are secure.

INTERNAL FACTORS The agency’s performance measurement efforts benefited from strong support from a particular agency CEO in the 2000s, a person whom current PCQI staff considered their “booster.” The agency has also benefited from board expertise in performance measurement, with a quality professional from a manufacturing field now chairing the board’s program committee. The CEO reports that the board has been involved in reviewing performance measures for several years, and he was able to list many that the board had examined recently, from kindergarten readiness to staff turnover. He said it would be common to look at about three key indicators per board meeting.

DEVELOPMENT OF PERFORMANCE MEASUREMENT CAPACITY HOYT has been purposefully building performance measurement capacity for almost two decades. In 1999, a COA accreditation effort led to the implementation of a large number of PCQI changes in short order. Said one administrator: “All the processes were developed in isolation, to get ready for COA in 1999. It was like, ‘You have to do this. You have to do that. You have to do this. You have to do that,’ in order to be in compliance with COA standards.” According to this administrator, “we had to go back after that to build buy-in.” It was at that point that the agency began to more thoughtfully build the performance measurement and quality improvement processes in place today. The first thing the agency tried to get right was doing peer record (chart) reviews. Plus, there were efforts to change the culture. “I remember there was a lot of police attitude about QI,” said the same administrator. “There was a lot of culture change, that I tried to help instill… to help say, ‘Hey, we're here to be partners and help make a change,’ and not so much of the ‘You did something wrong,’ only.”

In 2003, HOYT moved heavily into operations in a new state, based on perceived opportunities in that state’s foster care privatization movement. This was perceived as a heavy lift by the PCQI team, who helped with the start-up efforts. In 2005, HOYT’s then CEO purposely imported additional PCQI expertise into the agency with an outside hire to lead their PCQI efforts. This person’s first priority, according to current staff, was to train the entire agency on the importance of PCQI efforts and how everyone could participate. This led to the operationalization of what HOYT calls Quality Improvement Teams (QITs). Every HOYT staff member participates in at least one team that meets three times a year. These teams are described below. Next, the new Vice President “helped solidify our other processes,” said one informant. These ended up being called the “14 mandated tasks” of PCQI, a list and term still used in the agency over a decade later. “It really helped define us as a team and help people understand their role in them,” said a current administrator.

In 2012, the agency lost the aforementioned foster care contract. In 2013, a new agency CEO decided he wanted graphical dashboards to better convey performance. The board’s program committee worked hand in hand with the PCQI team to determine priorities for the dashboards and even the types of figures they would contain. It was decided that the program committee of the board would review full program dashboards and that the board itself would see something less. The goal is to move from 15-page program reports to 3-page reports.

[Figure: sample doughnut chart showing performance at 75% of standard]

In 2015, the current structure and format of the reports were formalized, with a heavy emphasis on doughnut charts (see figure) used to show the degree to which performance was measuring up to a standard. Now, with a new CEO in place, PCQI staff wonder about his commitment to the PCQI processes that have been developed and to the investment in PCQI professionals deployed throughout the agency.

STRUCTURES AND PROCESSES Many of the processes of the PCQI team are institutionalized and calendared over a year in advance, such as peer review teams and QIT meetings. The advance calendaring comes with an intended message – these meetings are not optional and staff are expected to schedule around them. One thing that sets HOYT apart from its peer agencies is its use of the QITs. These teams exist at various levels in the organization – service teams, programs, directors, vice presidents – and virtually every employee sits on a QIT. Each QIT meets three times a year with a PCQI staff member. In this meeting, several things are expected to happen: 1) a review of performance data, including those from peer review teams; 2) creation of action plans to improve services; 3) review of lessons learned from priority reviews from other parts of the agency; and 4) identification of issues to be raised up through the organizational hierarchy. Usually these are things the local team cannot solve on their own without additional resources. These can be varied. Examples of issues resolved through this process ranged from getting the lawn of a residential center leveled to reduce injury risk to the development of a more humane bereavement leave policy. One PCQI Director described his role in QIT meetings as “coach, it's to be there to, if they get stuck, to help them move along.” Some teams need more coaching than others, this person noted. Others added that the risk of these meetings is that they become complaint sessions, and staff have to be diligent in moving the meetings toward more productive efforts. The program reports used in the QITs are compiled by the performance measurement team and are referred to as “risk management reports,” a term long used in the agency, although the purpose of these reports seems to have changed over time. They include data from peer review meetings, vendor databases, and reports sent by program staff to the PCQI staff’s data analyst, who creates the reports using Microsoft Excel.

Peer review teams meet quarterly to review a random sample of client charts. These meetings are also staffed by a CQI professional. CQI professionals may staff 12 or so QITs and peer reviews, which consumes a good deal of their time. Peer review results are immediately entered into SurveyMonkey for the CQI data analyst to compile into a report using Microsoft Excel. Incident reports are also entered into SurveyMonkey from paper copies sent to a PCQI team member, and that data is periodically downloaded into an Excel spreadsheet. HOYT also considers its priority review meetings to be a core performance measurement process. These are meetings held after a critical incident. They are staffed by members of the PCQI team and are designed to yield lessons learned that might be applicable across programs. One CQI Director said a recent lesson from a tragic event was “timely, thorough, documentation… Every minute that goes by between the event, and what you are going to write about it, is time that dims in your memory.” The HOYT performance measurement team meets with program staff to attempt to determine the best indicators at the least burden for each program. These are also discussed with the PCQI leadership and the program committee of the board. Although the performance measurement team lamented that this work sometimes gets pushed back to meet other demands, the program staff Dr. McMillen talked with appreciated the opportunities to shape the indicators used to measure their performance. An indicator mentioned by several employees as a meaningful one was the percentage of children served who were kindergarten ready (from the early education programs). Several people mentioned how helpful it would be if each program had a measure as clearly important and communicable as that. Foster care programs were monitored on permanency outcomes. Some PCQI staff wished they used more standardized measures of client progress.
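Peer review results go into SurveyMonkey and are compiled into Excel reports by the data analyst. As a hedged sketch of that compilation step, the code below computes per-item compliance rates from a hypothetical CSV export with one row per chart reviewed; HOYT’s actual export layout and item names are not described in this report.

```python
# Illustrative sketch only: compile per-item compliance rates from a
# hypothetical CSV export of peer review results (one row per chart
# reviewed, "Yes"/"No" item columns). Field names are invented.
import csv
from collections import Counter

def item_compliance(csv_path):
    met = Counter()
    reviewed = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for item, answer in row.items():
                if item == "chart_id":        # skip the identifier column
                    continue
                reviewed[item] += 1
                if answer.strip().lower() == "yes":
                    met[item] += 1
    return {item: met[item] / reviewed[item] for item in reviewed}

# Example usage with a hypothetical export file:
# for item, rate in sorted(item_compliance("peer_review_export.csv").items()):
#     print(f"{item}: {rate:.0%} of charts compliant")
```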

DATA MANAGEMENT Much of the performance measurement function is based in Excel. "I live in Excel," said the performance measurement team's designated data analyst. HOYT has what staff referred to as a client management database, created for the agency in the 1990s; it is used today primarily as a case note system. The agency has no other central database in use for performance measurement and no ability at the corporate level to generate basic statistics on client demographics within or across programs. When this information is needed by funders or other stakeholders, the PCQI staff have to report that they simply are not able to produce that kind of data. The agency also has little database expertise among its current staff, although at least one team member is eager to learn these skills.

"Outcomes currently are collected on paper," in the aggregate by program directors, explained one employee. Program directors are sent a Microsoft Word form at specified time points, with specified deadlines, to complete using whatever raw data they have on hand. "And I enter them into an Excel spreadsheet to be able to calculate how we're doing," one employee said. "We don't have a database that collects outcomes on an individual client-level basis. It's hard to do more higher level statistical analysis when all you have is an outcome that essentially tells you that 30 percent of clients that closed achieved this outcome. I can't connect that fact to which clients those are."

Individual programs use a number of database systems. The foster care programs use their states' SACWIS systems, and the early childhood education programs use the Ready, Set, Goals data program mandated for federally funded Head Start centers. For some of these program-specific databases maintained by others,
the CQI team can pull data out. For others, it cannot. When data are pulled out, they are maintained in Excel spreadsheets. The members of the PCQI team Dr. McMillen spoke to were aware that the agency would benefit from better data systems and, eventually, more expertise in data management. Several employees spoke about the piloting of a new off-the-shelf system in the near future. One staff member spoke about an eventual desire to have drillable dashboards on the desktop. "We need to go to real-time data," this person said. "What my vision is, and I know it's out there, we just have to get there, is that case managers, the most important people who have the most connection to our families, they need data today."
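To make concrete what client-level records would add, the sketch below is a hypothetical illustration, not a description of HOYT's systems; the column names and values are invented. It shows how the same aggregate figure a program director now reports on paper could be recomputed, and then segmented by program or quarter, once individual-level records exist.

```python
# Hypothetical sketch: why client-level rows allow more than an aggregate percentage.
# Column names ("program", "quarter", "outcome_met") and all values are invented.
import pandas as pd

clients = pd.DataFrame({
    "client_id":   [1, 2, 3, 4, 5, 6],
    "program":     ["foster care", "foster care", "early ed", "early ed", "foster care", "early ed"],
    "quarter":     ["2016Q3", "2016Q3", "2016Q3", "2016Q4", "2016Q4", "2016Q4"],
    "outcome_met": [True, False, True, True, False, True],
})

# The single aggregate figure a program director might report on a paper form:
overall_rate = clients["outcome_met"].mean()            # about 0.67

# With client-level rows, the same data can be segmented on demand: the
# drill-down that an aggregate percentage alone cannot support.
by_program = clients.groupby("program")["outcome_met"].mean()
by_quarter = clients.groupby(["program", "quarter"])["outcome_met"].mean()

print(overall_rate)
print(by_program)
print(by_quarter)
```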

PERSONNEL HOYT has an Executive Vice President for Quality who reports to the CEO. This person has been at the agency for many years. Under this position are a Research Director, three Directors of Quality for specific lines of service, four Quality Coordinators, a Coordinator who handles Medicaid compliance, and a Data Analyst. The Directors of Quality and the Data Analyst have master's degrees in human services fields, and the Directors have previous experience in PCQI.

The team was built through substantial investment in the human resources side of performance measurement. Quality Coordinators are hired without prior PCQI experience. They are then sent to a COA training and given "hands-on" training by the Director of Quality to whom they report. It takes about a year for a new Coordinator to learn the role, according to one administrator. "I feel like it's a good year. In six months, you get the basics, you get an understanding. You don't really get your confidence for a good year."

There is substantial experience on the performance measurement team. One member of this team, who had worked at other agencies, said, "This is the first place I've worked where I didn't know the most about CQI." An administrator said that at times she thinks she has "a dream team" of PCQI professionals. The team has helped train other PCQI professionals in one of its states, is looked to as a leader in that state, and considers itself a leader in the PCQI movement within private child and family service agencies nationally.

This team has been built from both internal and external hires. A former Executive Vice President of Quality was an external hire, while the current person in that role moved up within the organization. Two members of the team noted that because of the collective PCQI expertise, it may be difficult to move up the hierarchy. Both thought they were ready for more responsibility, yet did not see those opportunities available at HOYT and were aware that their knowledge and skills would be valued elsewhere. The program staff interviewed appreciated their relationships with their dedicated PCQI staff and viewed them as helpful, if not vital, in their jobs. They appreciated having PCQI personnel devoted to their service line.

The performance measurement team has been located in multiple places on HOYT's organizational chart: first under the training department, then under finance, and finally under its own Executive Vice President. The team has also been organized by region and now by service function. It settled on its current structure in 2013, with PCQI personnel assigned to a program line of service.


STRENGTHS AND OPPORTUNITIES HOYT has a wealth of performance measurement strengths. First among them is a dedicated, experienced team of PCQI professionals who know the agency and have experience with many PCQI tasks. They are likely sufficient in number and work well with program staff. In turn, HOYT has program staff who are actively using data to manage performance. They use the risk management reports and, in the case of foster care services, data and data dashboards from their states' child welfare data systems. These staff were valuing data, using data, looking for both quality performance and quality problems, and acting on this information. Both program staff and PCQI staff reported positive relations between the two groups and a culture where PCQI is viewed as helpful rather than compliance driven or punitive.

That HOYT's performance measurement team has managed to get data in front of its employees without basic data structures in place is a testament to the creativity and organization of its performance measurement professionals. The use of SurveyMonkey to get data into a usable form is clever. The management of a huge collection of Excel spreadsheets to get data out to program teams is impressive.

HOYT has many mature structures and processes in place that other agencies struggle to create, let alone sustain. The calendaring of CQI events (peer reviews, quality team meetings, surveys, reports) is impressive. People throughout the organization know HOYT's 14 mandated PCQI activities, and performance measurement activities happen as they are supposed to happen. Several of these processes possess strengths. The priority reviews seem to dig deep and offer lessons learned. The peer reviews contain items that measure presence (compliance) but also items that strive to get at quality. The successes of the QITs seem to be in the area of moving improvement issues up the chain of command. The PCQI staff are actively meeting with programs to align measurement with program needs.

While HOYT has invested in its performance measurement team, it has not invested in database structures. The dream of drillable dashboards on the desktops of workers and administrators is a distant one without data structures that capture individual-level data. Individual-level data in agency databases allow data analysts to segment data to answer key questions: how does that trend line look if we just look at clients of a certain race, gender, state, program, team, or payor? Drillable dashboards put that capacity in the hands of other staff. Even if HOYT were to begin these investments now, it would be some years before these systems are online, and more years before anyone at the agency could query data in this way, let alone provide this capacity to workers and administrators across programs on their desktops. The lack of data structure also means that many programs submit data to the performance measurement team on Word documents, drawn from tally sheets not seen or monitored by anyone else. Routinely monitoring and validating these data would come at a very high cost in time and effort, and it is not being done at HOYT.

While it is remarkable that HOYT's QITs meet regularly to review data and issues, the process as described appears flat. The meetings were sometimes used for basic corporate communication, team building, and as an outlet for frustration, in addition to reviewing performance data.
The QIT meetings were not used frequently enough for identifying problems, exploring their root causes, and generating and sorting through potential solutions. Deeper digs into fewer, more important issues, leading to administrator-chartered improvement teams that test and refine solutions, could produce more substantial gains.


Other opportunities for improvement involve tweaking current processes. Meaningful tweaks could include looking at data over longer periods of time to note trends, cross-training on the tasks conducted by the team's data analyst, upgrading peer reviews to focus more on quality, and using the meetings as opportunities to learn from one another. In short, this is an agency with many impressive performance measurement processes in place and one very apparent need: the development of a centralized, coordinating database structure on which to base and build these efforts.


Multiple Christian Family Services

AGENCY DESCRIPTION They refer to it as corporate, as in, "here at corporate, we see things like this." But the sleepy vibe of the main office of Multiple Christian Family Services (MCFS) couldn't be less corporate. When Dr. McMillen signed out of the visitors' log at the end of the first day of interviews at MCFS's single-story, unrenovated, and largely undecorated 1960s suburban headquarters, he noted he had been the day's lone visitor. The building belied the breadth and activity of MCFS's programs and 475 employees. MCFS is a legacy child welfare agency, tracing its roots to the 1870s, when several Christian congregations pooled their resources to start an orphanage. Like many legacy organizations, it has been melded from many mergers and acquisitions. It is independently operated and officially separated from any religious denomination.

PROGRAMS The agency's pseudonym reflects the large number of programs operated by MCFS. How many? When Dr. McMillen tried to pin staff down, answers varied from 20 to 40, depending on how one counted across contracts, grants, and sites. At the least, MCFS operates:

- three residential units for youth with emotional or behavioral problems;
- foster care programs, including traditional, medical, and specialized foster care, in eight counties;
- a short-term shelter program for youth;
- international and domestic adoption programs;
- a day school that serves its residential communities and others;
- outpatient counseling programs at multiple sites;
- community-based (nonclinical) services in six different sites, including food pantries, activity centers, drop-in centers, and art programs;
- four different versions of in-home service programs for families involved with child welfare services, operating in multiple sites;
- a youth camp.


CASE STUDY PROCESSES Dr. McMillen interviewed nine employees over two days at the MCFS headquarters. A one-hour debriefing session was held at the end of the site visit with two members of the performance measurement team. MCFS provided a wide range of program reports and output from its performance measurement team. This included a Performance and Continuous Quality Improvement (PCQI) orientation document, a PCQI annual report, quarterly dashboard reports for many programs, a description of their process for determining performance measures, and forms used in various activities.

EXTERNAL FACTORS Staff identified three primary external drivers of their performance measurement activities: the state child welfare authority, the United Way, and the Council on Accreditation (COA). Primary among these is the contractual relationship with the state child welfare authority, which introduced performance-based contracting for the agency's foster care and group home programs a decade ago. The state carefully tracks a small number of performance indicators for these programs, and MCFS lost at least one program contract based on these metrics. In 2013, the state's interactive dashboard for these indicators went live online, allowing each agency to compare its performance with that of other organizations and to drill down on these metrics to sites, supervisors, workers, and cases within its own data.

Staff also noted that the agency's longstanding relationships with multiple United Way organizations were continually evolving. MCFS used to provide only output numbers to the United Way, with continued funding all but guaranteed; in the most recent push, United Way wants impact measures for all services. This includes programs like food pantries, where MCFS struggles to find outcome indicators that satisfy the United Way.

MCFS was an early adopter of COA accreditation and has worked to update its performance measurement functions to match the requirements of COA. Two of the most impressive documents shared with us (a justification of all agency measures and a description of the performance measurement function at MCFS) were created explicitly for COA reviewers.

DEVELOPMENT OF PERFORMANCE MEASUREMENT CAPACITY The current leader of the performance measurement team, an agency Vice President,2 has been in the job since the early 2000s, and another member of the performance measurement team joined in the mid-1990s. At that time, the agency had a client information system that allowed the team to generate program client numbers and demographics. This system was only fully replaced in the past year. Staff members mentioned only a few additional turning points in the development of MCFS's performance measurement capacities. In the 1990s, an influential board member introduced a form of policy-based governance that involved the board writing grand outcome statements for the agency. This influence remains active to this day: the performance measurement team provides the board with data on one grand outcome at each board meeting.

2 Some unique job titles have been changed to protect the anonymity of the agency.


At some point, the board began receiving financial dashboard reports from the agency's CFO. The board liked these, and the reporting extended to program dashboards. In the late 1990s, a member of the performance measurement team received training in the statistical software package SPSS and began exporting client data to SPSS to conduct data analyses. This increased the team's capacity to develop standardized reports at regular intervals; the agency continues to use SPSS to conduct most analyses today.

Six years ago, an agency administrator ordered the agency to stop collecting and entering time log information for all of its service employees. This had been perceived as a large burden with little payoff. "I'd get reams of paper on everybody telling me what they were doing all day, which isn't necessarily bad, but I'm like, 'Well, I don't care what they're doing all day. I care what they've done at the end of the day.'"

A 2012 acquisition of another agency's foster care program brought with it a performance measurement professional and her forms and procedures. Although this employee did not stay with the agency, MCFS adopted for all of its programs the format of the dashboards this person had used. When the 2013 statewide SACWIS interactive dashboards went live, agency employees began to see the power of data and to use it more actively. This also increased the demand for the MCFS performance measurement team to deliver live dashboard capabilities.

A recent effort involves face-to-face meetings between each program and the performance measurement team to better understand what success looks like in each program and to tailor the quarterly report process to indicators that mean something to clinicians in these programs. These efforts are not yet complete and have been derailed by crises and the regular reporting demands of the performance measurement team.

STRUCTURES AND PROCESSES PERSONNEL. For the purposes of this report, we consider the following staff positions to be part of the MCFS performance management team. The Vice President has two direct reports related to performance measurement: a Director of Data Systems and a Director of Training and Performance. The Director of Data has one direct report on this team, a database manager, and the database manager has two reports, both of whom conduct data entry. The VP, the Director of Data, and the Director of Training and Performance all have responsibilities other than serving the agency's performance measurement enterprise, including computer hardware and training.

The database manager has corporate database experience. "I make half as much as I was making, but I wanted to do this because I wanted to feel like I was giving back… I tease them and say, 'I'm going to get me a real job if you guys don't treat me better.'" The VP, a social worker, is in her fourth decade at MCFS, having worked in many of its programs. The Director of Data was hired out of his undergraduate psychology program 20 years ago. The newest member of the team, the Director of Training and Performance, previously served as a program director. She holds ambitions to one day be an agency CEO and sees this position as a learning opportunity and a transition to senior management.


DATA MANAGEMENT. Agency data resides in a database housed on an off-site server hosted by the database company with which MCFS contracts. The database was not created for the social services; MCFS is the company's only social service client. While hoping to rename the database soon, staff refer to it as Asteroid. Most of the data in Asteroid is manually entered by two data entry staff members at the corporate headquarters. Members of the performance measurement team access Asteroid via VPN. No program staff have access to the database, either to put data in or to get data out. No reports are generated directly through the Asteroid database, nor are database filters used to query data. All data reports are run by importing data from Asteroid into SPSS. SPSS syntax has been created to generate the numbers that populate the agency's quarterly reports.

REPORTS. Each agency program receives a one-page quarterly report from the performance measurement team with 5 to 10 indicators on it. Data on each indicator is reported for the prior quarter and the current quarter, with an indication of the percentage change from the prior quarter. Benchmarks or expectations have been created for each indicator for each program. If the program met or exceeded the benchmark, the indicator is highlighted in green. If the program missed the benchmark substantially, it is highlighted in red. If it came close to the benchmark, it is highlighted in yellow. This way a program can see at a glance whether it is meeting its goals: a green report is good news and one covered in red is bad news. Few indicators are graphed over time; the one exception, graphed over a longer period, was restraints in the residential programs. Programs with multiple sites see indicators charted across sites.
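To make the color-coding rule concrete, the sketch below implements one plausible version of it. It is an illustration, not MCFS's actual logic: the 5 percent "close to the benchmark" tolerance and the assumption that higher values are better are ours, since the reports themselves do not specify either.

```python
# Hypothetical sketch of the red/yellow/green benchmark logic described above.
# The 5% "close" tolerance and the higher-is-better assumption are illustrative only.
def rag_status(value: float, benchmark: float, tolerance: float = 0.05) -> str:
    """Return 'green', 'yellow', or 'red' for one indicator on a quarterly report."""
    if value >= benchmark:
        return "green"                        # met or exceeded the benchmark
    if value >= benchmark * (1 - tolerance):
        return "yellow"                       # missed the benchmark, but came close
    return "red"                              # missed the benchmark substantially

# Example with an invented permanency-style indicator and a 40% benchmark:
print(rag_status(0.43, 0.40))   # green
print(rag_status(0.38, 0.40))   # yellow
print(rag_status(0.25, 0.40))   # red
```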

FUNCTIONS. MCFS compiles a large amount of data, largely because it encompasses so many programs. For its COA reviewers, it created an impressive spreadsheet of indicators, justifications, data sources and collection and analysis responsibilities for the domains of organizational climate, case reviews, client satisfaction, key performance indicators, outcomes, risk management and utilization review. Here we look at indicators that assess compliance, effectiveness or progress.

Standards and compliance. The agency gathers a great deal of compliance data for its most regulated programs, foster care and residential care. The foster care chart peer review form, for example, assesses “evidence of compliance” with over 50 standards. The residential program report goes into detail on use of restraints and unusual incidents.

Treatment effectiveness and progress monitoring. The two programs under performance-based contracts, again foster care and residential care, have explicit definitions of client success monitored internally and externally. For residential care, the primary outcome of interest is sustained favorable discharge (discharge to a less restrictive placement that is maintained). For foster care, it is permanency achieved (return home, adoption, or guardianship with relatives), with a goal of 40% of cases achieving permanence in a year. For the in-home programs, two indicators mark success: the absence of a new report of maltreatment and the absence of placement into substitute care. Outpatient clinical services monitor progress toward identified goals and scores on the Behavioral and Emotional Rating Scales and the Behavior and Symptom Identification Scales. Progress is assessed by
examining pre- and post-treatment scores. It does not appear that clinicians receive any within-treatment scores on these measures.

Programs for which MCFS struggles to find meaningful indicators of success include its shelter program, food programs, arts and activity programs, and a father involvement program. The CEO said he worries about losing funding for the food program. "We have a food distribution program. We've had it since Reagan passed out cheese. What's that? Thirty years ago? We get money from United Way in _. They really want to know the outcome, when people are fed. What are we doing to reduce the number of people who use [the food bank]?"

The interviews surfaced at least one example of the agency using data to inform an informal model of change. In the foster care program, managers monitor indicators of caseworker-child contacts, caseworker-birth parent contacts, and birth parent-child visits, assuming that favorable indications of these processes will influence the outcome of interest, permanency for the child.

Financial Solvency. Each report has a financial indicator, the proportion of revenue over expense. The benchmark for this is .95. This benchmark has become problematic over time as program directors view this as their target. As an administrator said, “If we lose five percent on every government contract, we're in trouble. The [program] leadership only thinks they need to worry about that 95 percent. Because they'll be green [laughs]!”
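A brief worked example may make the administrator's concern concrete. The contract amounts below are invented; only the .95 benchmark comes from the report.

```python
# Invented contract amounts illustrating why a 0.95 revenue-to-expense ratio,
# treated as a target rather than a floor, produces agency-wide losses.
program_expenses = [1_200_000, 800_000, 500_000]    # three hypothetical government contracts
ratio_target = 0.95                                 # the benchmark directors aim for

revenues = [expense * ratio_target for expense in program_expenses]
total_loss = sum(expense - revenue for expense, revenue in zip(program_expenses, revenues))

print(total_loss)   # about 125,000: a 5 percent loss on every contract, summed across programs
```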

Other. Agency personnel complete an annual organizational climate survey administered electronically by the Alliance for Strong Families. Clients and client families complete satisfaction surveys, for which program managers are responsible for distribution and collection. Program managers discounted these data because of low participation rates.

STRENGTHS AND OPPORTUNITIES The performance measurement team at MCFS operates with an abundance of strengths. The staff on this team are dedicated to finding the measurement strategies that are most meaningful to programs, so that their monitoring efforts translate into service upgrades for MCFS clients. The team has deep knowledge of agency processes and appears to have fostered genuine goodwill among employees. The team possesses expertise in database management and in the use of statistical software to analyze agency data. Finally, the team is able to articulate its mission and structure in agency documentation. This is a performance measurement team that has thought about what it should be.

MCFS has a number of stakeholders eager to see and use data. The recent use of the state's online data dashboards for the foster care and residential programs has whetted programs' appetite for data, especially data on their own computers that they can use to drill down to supervisor, worker, and case levels. The agency has a governing board that likes to see data and is in the habit of asking administrators about data. The COO is very interested in being able to look at data on his desktop.

MCFS has generated a culture of regular performance monitoring. Chart reviews are calendared, and programs receive sheets of randomly selected clients for whom chart reviews will be due. The team regularly creates quarterly performance measurement reports for each of the major programs and meets with the programs to present and discuss the data. It has a standardized format for these reports, with color-coded elements to facilitate interpretation.


The performance measurement team has successfully created redundancy. Multiple team members can access the Asteroid database, export data and run analyses using SPSS. The case study process generated a number of ideas on how MCFS could move to its next level in performance measurement.

Greater accent on discovering and celebrating successes. Performance measurement efforts risk focusing too often on places where agency programs are falling short. The color-coded dashboards risk being awash in red, and a chart review process that focuses on the presence of documents is geared toward discovering noncompliance. While the informant interviews surfaced stories of successes identified through performance measurement, they also surfaced language in which "corrective actions" involving HR were often the result of performance measurement reporting. The MCFS performance measurement team should diligently use its data to identify successes and make a fuss over them in ways that are motivating to program staff.

Chart indicators across time. The agency overuses "percentage change since last quarter," which is reported for every indicator on a program's dashboard. When performance indicators were charted over time, they were charted with only five time points, the last five quarters. Such a short timeline does not allow a visual understanding of natural variation in an indicator, and it does not allow the use of run charts or statistical process control analyses that help an agency identify when things have really changed.
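As an illustration of what a longer series makes possible, the sketch below applies a common run-chart convention (a run of six or more consecutive points on one side of the median signals a likely shift) to invented quarterly values. Neither the data nor the rule comes from MCFS's reports.

```python
# Hypothetical run-chart sketch: 14 invented quarterly values for one indicator.
# The six-point run rule is a common run-chart convention, not an MCFS standard.
from statistics import median

rates = [0.41, 0.37, 0.44, 0.40, 0.36, 0.43, 0.39, 0.42,   # a stable stretch
         0.31, 0.29, 0.33, 0.30, 0.28, 0.32]               # a sustained drop

center = median(rates)
run_side, run_length = None, 0
for quarter, value in enumerate(rates, start=1):
    side = "above" if value > center else "below"
    run_length = run_length + 1 if side == run_side else 1
    run_side = side
    if run_length >= 6:
        print(f"Q{quarter}: {run_length} consecutive points {side} the median; likely a real shift")
```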

Chartered quality improvement projects. The performance team members described a dropped-handoff problem: a problem is identified with, and in the presence of, a program director, and the performance measurement team then assumes the program director will work to resolve it. There are two issues here. No one with line authority has assigned the problem to be fixed (so no one will be held accountable for fixing it), and no defined process has been initiated to explore the causes of the problem and resolve it. A more formalized improvement effort is recommended.

Resources for Improvement. Several people at MCFS described the agency as financially stressed and in and out of survival mode. In a more ideal improvement environment, limited resources would accompany improvement teams to assist in fixing identified problems.

Peer review upgrade. Multiple MCFS interviewees identified the chart peer review process as being focused on the presence of documents in the chart and not the quality of service represented in the chart. MCFS should consider, at the least, assessing the quality of the documentation in their charts.

Data on the desktop. Several top administrators and many program directors have no access to data. When they need a number on any aspect of programs, they have to call the performance measurement team and wait for an answer. There is an appetite for the ability to access and interact with performance data.

More common indicators. The MCFS team has tailored its quarterly dashboard reports to each program. However, this does not allow much comparison among programs of the kind that would let the agency identify well-performing and under-performing programs. The agency should consider creating a small number of indicators shared across programs.


Pacer Youth Services

INTRODUCTION The most striking thing about Pacer Youth Services, a small agency located in a small city, was how little technology it used. Pacer has no web site. Agency employees did not have agency email addresses. Computers, when present, did not contain their own word processing, email, or spreadsheet programs. Agency personnel used a VPN to connect to a state agency's servers, where Pacer employees logged in, accessed the state agency's programs, and stored their files. This Spartan use of technology speaks to how frugal small child-serving agencies need to be to keep afloat on state contracts, and to how much of a shift it was for Pacer to begin to embrace performance management.

AGENCY DESCRIPTION AND PROGRAMS Pacer Youth Services started in the 1950s as a youth farm, where the county judge sent delinquent boys to straighten out. These were kids, according to Pacer's current Executive Director, "who looked like Brando, and they've got cigarettes rolled up in the t-shirt sleeves and the whole thing." Over time, the farm began serving youth from the child welfare as well as the juvenile justice system and began to compete for state contracts. In the 1970s the farm transitioned to become a group home at another location. In the 1980s, Pacer began providing foster care services. Today, it serves 16 youth in group care, 10 to 15 young people preparing to age out of foster care, and 80 foster kids, and it has programs serving intact families to prevent foster home placement. In total, Pacer employs 50 people, many of them direct care staff. Pacer has 12 case managers, four supervisors, a Quality Coordinator, a Clinical Director, and an Executive Director. Many of the agency administrators have been there for many years, and all started as Pacer case managers.

CASE STUDY PROCESSES Dr. McMillen spent one day at Pacer and interviewed seven employees, including the Executive Director, Clinical Director, Quality Coordinator, and four supervisors. He debriefed at day’s end with the Clinical Director and Quality Coordinator.

EXTERNAL FACTORS

Several agency actors saw Pacer's performance management activities as driven by state public sector mandates. Pacer hired its first Quality Coordinator three years ago, after the state initiated a performance rating system for some of its programs and began encouraging even small agencies like Pacer to monitor program processes and outcomes. "Some of our [public agency contacts] were encouraging me to be proactive," said Pacer's Clinical Director about that period. "'This is coming,'" they warned. The agency Quality Coordinator saw her agency as directly responding to performance management requests from its largest state contractor. "They do influence," she said. "If a minor request is suggested, it is taken into actuality." After a state monitor wondered whether Pacer could look at unusual incident reports by shift, Pacer started doing so, even though staff had to re-create that data by hand. Pacer is not accredited, and it applies for little outside private funding.

INTERNAL FACTORS Pacer's Board of Directors appears to have no influence on performance management activities at this time. "They want to see financial reports, balance sheets, plus and minus programs for the month," said the Executive Director. "I don't think there is anybody on the board that would really be interested in how the [agency's state-produced performance] dashboard is doing."

DEVELOPMENT OF PERFORMANCE MEASUREMENT CAPACITY For many years, Pacer kept or found data on an ad hoc basis when it was requested by the United Way. Then, about four years prior to the site visit, it became obvious to the Clinical Director that other agencies were able to report on data that Pacer could not. "I was attending those meetings and assessing what other people were doing." Pacer was falling short in its ability to tell its story, to make the claim that it was providing quality service. At the same time, the state was moving toward benchmarks and dashboards for its contracted agencies.

At the urging of Pacer's state partners, the Clinical Director began collecting what data she could from case records. "I started to pull data and compile assessment and took the benchmarks. I literally got all this paper out and I took the benchmarks that we were being assessed on and compiled that data and ran those numbers myself. Case to case. I would spend hours combing through those files and doing the internal file audits. I had lots of spreadsheets that I was trying to keep track of everything and it became incredibly overwhelming."

"Finally," the Executive Director said, "about three years ago, we were like 'You know what? We need a CQI person because this is taking way too much time away from what we do.' Obviously, it is not something that is going away." A job ad was posted for a Quality Coordinator. "I was inundated with resumes for this position. When I was looking through them, everyone had great credentials and impressive degrees," said the Clinical Director. But one agency employee also stood out. "She was a case manager," explained the Executive Director. "She seemed to thrive on that data stuff." And, said the Clinical Director, "She knew what our agency mission was. She knew our atmosphere."

Hiring for a new administrative position in such a small agency "was a big decision," said the Executive Director. "We were talking about it for six months and crunching the numbers and doing the cost benefit analysis and finding the money." He explained how he came up with the money. "Here's what we've got to do. We've got to drum this up here and got to boost these programs up to this, so that little administrative
stuff that comes in goes into the pot, so there will be enough of that that we can create a position." He lamented that the public agencies with whom his agency contracts "don't really pay for this. You've got to find it out of your administrative fees. Of course, the larger the program is, there's five dollars here and five dollars there and it adds up." That makes it tough for smaller programs that have less to add up.

The Quality Coordinator described the situation into which she was hired as "a state of panic." She said she inherited little in terms of data, structure, or processes from which to build out her job. "There was absolutely no knowledge… Nobody was trained to do math or data or anything along those lines. They had no idea." Looking back, the Clinical Director said, "I didn't realize how much we were doing incorrectly."

Starting from little, the Quality Coordinator reported that it took some time for agency employees to grasp the concept of data and what data meant in a social service context. She had to teach them. "That was our biggest battle," she said, and this was the message she wanted them to hear: "Data is what you are doing every day. What you are doing everyday interprets into data. We've got to find a way to bridge that and come together because without that, we don't have jobs because we don't have contracts." The second battle, according to the Quality Coordinator, was getting staff to prioritize the performance management function, which was concretized by attendance at peer review meetings. "I would start out many CQI days [peer reviews] with 12 employees. By noon, we wouldn't have our files complete yet and I would be down to six."

The Quality Coordinator credits a statewide coalition of quality coordinators who work at private child and family service agencies for the growth she has experienced on the job. They taught her the ins and outs of the job and served as crucial cheerleaders. "That group has just encouraged me completely along the way," she said. "I cannot imagine what life would be like if I didn't have that meeting every other month." As an example of what she learned from this collaborative, she talked about how the agency began tracking unusual incidents. "I went to a meeting and an agency presented on their internal unusual incident reporting system, how they track their UIRs, how their case managers were doing it. We spent two hours talking about it. It was just phenomenal." Soon after, she was meeting with state personnel who asked how her agency tracked UIRs. She explained that it did not. She told them, "I just went to this training. I am completely on board. This is something we need to implement." The combination of a push from the state and the information from the collaborative led to a new Pacer initiative to monitor UIRs.

The performance management work at Pacer has matured quickly. "We have come a long way in the last three to four years," said the Clinical Director. "We didn't understand how valuable a CQI position would be and it has been very, very important." Many at Pacer attribute one big win to the improved capacity to manage performance: its foster care program has been rated as a top performer based on its metrics. When asked to what she attributed that, the foster care supervisor said, "The Quality Improvement Coordinator! I think it's been a really big effort." Another program rated by the state has not yet achieved a top rating, but it has shown consistent progress.
“That’s great!” said the supervisor of that program with evident pride. Plus, she added, other people within the agency who have looked at these numbers have congratulated her on the progress.


STRUCTURES AND PROCESSES PERSONNEL. Pacer's Quality Coordinator is the sole employee dedicated to performance management. She sees herself as well suited for the job. "I love paperwork, oddly," she said. She was also one of the few people at the agency who did not shy away from numbers. The Quality Coordinator is also responsible for the agency's special events. Both the Clinical Director and the Executive Director reported interest in providing some additional employee help for performance management work as soon as funds are available.

REPORTS. The residential programs at Pacer use a spreadsheet of unusual incident reports created for them by the Quality Coordinator. This sheet is created from paper forms, with no automation; it takes "about two days, two full days" per month for the Quality Coordinator to create the contents of the report. No reports, scores, or tallies result from peer reviews; they yield a text file of what is missing from a case record. Following the monthly internal audit conducted by the Quality Coordinator, a supervisor receives a Microsoft Word document for each case detailing what is considered missing in the record.

All metrics that the agency uses at this time are mandated by its funders. The agency has no homegrown measures and has not initiated a process to decide how best to measure its programs' performance. Some metrics are more valued than others. For example, in the services to intact families, the metric of the number of family-worker contacts is monitored closely to see if workers are doing their jobs. But the metric of hotline reports on families served by this program is not, because the agency considers it unsafe to emphasize that metric for fear that workers would not report unsafe situations to the proper authorities.

FUNCTIONS. Pacer's performance management efforts focus on three main functions.

1. Peer reviewers monitor case files, randomly picking some from one of three programs each month. Peer reviews are largely, but not entirely, focused on compliance. "I would say 80% is all about compliance," said the Quality Coordinator.

2. The Quality Coordinator conducts Preventative Audits of the programs that are monitored each month through a state-issued dashboard report. This currently covers three contracted programs, with another expected to come online soon. "The first two weeks of every month, I spend on nothing but auditing," said the Quality Coordinator. "I go into the state child welfare database system and comb through each case record there, seeing if everything that the state will be pulling from to create its dashboards is there." She elaborated, "Every case note. Every case record. For the first two weeks of every month. It's intense. The whole point is to be as pro-active as we can."

3. The Quality Coordinator monitors unusual incident reports.


MEETINGS. Peer review meetings occur monthly, after a period when they were quarterly. The meetings are led by the Quality Coordinator. “I make myself available during those peer reviews from start to finish, regardless,” she said. The Quality Coordinator meets with the appropriate supervisor after each unusual incident report is initially recorded to see if it qualifies as a reportable UIR, and to make sure the appropriate documents are completed. The Quality Coordinator also meets weekly with many service teams. For example, she meets with the intact services supervisor and the program’s two supervisors weekly to discuss both client issues and how to improve services in ways that would improve the agency’s monitored metrics. This may be short term, as the agency is working to improve the performance numbers for this program.

DATABASES AND OTHER TECHNOLOGY ISSUES. Pacer maintains no databases. Staff enter data about child welfare cases into the state's SACWIS system. Asked if the agency had any ability to pull demographic numbers on clients served, the Quality Coordinator said, "Nothing. We have nothing. It has not been an investment." The agency director said, "It's probably coming." He added, "I like the idea of physically having the database that we can draw from. I'm not really sure of the logistics of all that. Data entry. Who's going to do that? Who has the time to do that?" Meanwhile, the agency is small enough to be able to hand tally things when needed. The Clinical Director also recognizes the need for an agency database. Our Quality Coordinator "is really walking with the dinosaurs in there. She's looking at stuff and getting out a calculator."

Pacer stores no data regarding peer reviews. As noted, UIRs are tracked in a spreadsheet. The other data kept in spreadsheets are bed days provided by foster parents; the foster care licensing specialist tallies foster parent payments with a calculator based on that spreadsheet. While all programs that have computers log into another agency's server to do any work, some programs lack computers altogether. At the program site for older youth, there are no computers. Everything is handwritten and kept in files. "We need to get with the times," one supervisor said.

Several supervisors reported a general lack of numeracy. "Math. None of that. I don't get it," stated one. Another stated, "The whole mathy part, like statistics and all that. Oh my God!" When a third supervisor was asked why the Quality Coordinator was doing all the monitoring for her program, the supervisor said, "I am not data focused. I am not a number crunching person. I'm more of a direct service type." This attitude may be keeping the performance management function too centered on the Quality Coordinator, rather than being owned by the program teams.


STRENGTHS AND OPPORTUNITIES One major strength of Pacer's performance management efforts is that its initial forays into this area are viewed as successful by staff and administrators. The Quality Coordinator is well regarded, and program staff use her to problem solve how to do better. There are a number of things that Pacer can do that other agencies have learned how to do: share performance data with its Board, graph metrics over time, and talk with staff about the metrics they think are most important to understand about their programs. This will move the agency more toward quality and make it less reliant on compliance data to gauge its performance.

Pacer will soon likely have to find new efficiencies in its performance management efforts. The Quality Coordinator's most valued work, the internal audits, is time intensive and draining. Some of this work might be done by a less costly staff member or taken on by program supervisors. Pacer also needs some automation of its performance management efforts; inexpensive tools like online survey programs may help. To go much further, Pacer will need to build out more information technology infrastructure. First entries into database systems can be expensive and disappointing, but the agency administrators and Quality Coordinator need to begin building experience in how databases operate.


Pine Street Children's Services

AGENCY DESCRIPTION The administrative headquarters of Pine Street Children's Services is housed on the first floor of a modern steel and glass office building. The reception area gives way to a large expanse of cubicles housing staff from various administrative departments. Employees encountered are cordial and appear to be busily working at their desktops and attending meetings, some hurrying between rooms. Despite the activity, the center is quiet, with little noise from voices or movement disturbing the hush. Ringing the cubicles are executive offices, several with their doors open. The overall impression is that these same activities could represent the workings of an administrative hub for nearly any industry.

CURRENT CONTEXT Per the CEO, Pine Street Children's Services (hereafter referred to as Pine Street) is the largest single contractor of children's mental health services for one of the most populous counties in the U.S. The organization has had substantial continuity in leadership; the current CEO, though relatively new to the position, has worked at Pine Street for 29 years. The organization went through a merger just over a decade ago, joining two other organizations to become Pine Street. Each of the merging organizations had nearly a one-hundred-year history of operation and deep roots in the history of child welfare and therapeutic services for children and families in its state.

PARTICIPANTS BY ROLE Interviews were conducted with the Chief Executive Officer, Assistant Vice President of Clinical Training and Evidence Based Practice, Director of Clinical Training, Director of Evaluation Research, Vice President of Quality Management, Clinical Director, and a Clinician. All interviews were conducted in-person, over two days. A feedback session was provided on the second day.

EXTERNAL DRIVERS OF PERFORMANCE MANAGEMENT Pine Street is accredited by its state as a provider, as well as by the Joint Commission. Though these accreditations are seen as useful for meeting regulatory requirements, they were not mentioned by any participants as primary drivers of quality improvement efforts. County and State quality and compliance regimes, however, were identified as drivers of behavior at all levels of the organization.

The county, using funding from state legislation, has aggressively promoted training on and implementation of evidence-based practices using the Managing and Adapting Practice (MAP) program promulgated by Dr. Bruce Chorpita at the University of California at Los Angeles (UCLA) and his training and consultation firm, Practice Wise, LLC. This set of intervention practices is designed to equip practitioners with a core set of intervention skills that can be applied flexibly across commonly encountered clinical problems of children and youth. The approach also uses ongoing charting of the practices used to attain goals, and the progress made on those goals, to identify how well treatments are leading to expected outcomes. Substantial resources have been devoted to building staff competency in using this approach to treat clients' identified behavioral health needs. In fact, the CEO stated, "Our strategy was to hire his staff… They both were out of [local University] PhDs, MAP trainers for Bruce for years. Now, they are our primary trainers in our clinical department." The trainers stated that nearly two-thirds of all clinicians for whom MAP training would be appropriate have received such training. Several supervisors have also been certified as MAP supervisors. Training proceeds based on both the fiscal allocations available and treating staff members' need for specific expertise matched to the identified clinical needs of treatment populations.

Specific state- or county-funded programs require the collection, reporting, and review of standardized outcome data. The Full Service Partnership (FSP) program, funded by the Mental Health Services Act (MHSA) and administered statewide, is an example mentioned by several stakeholders. This program is designed to improve residential, vocational, and psychosocial outcomes for youth and young adults. Pine Street staff indicated that they review FSP data provided by the county regarding the progress and outcomes of FSP clients. However, clinical staff interviewed indicated that these outcome data are not routinely provided and available to treating staff. Staff also complete their own battery of measures for these clients, as part of a series of measures provided to all children and youth receiving behavioral health services at the agency. These measures include the Youth Outcomes Questionnaire, to track symptom change, and the Child and Adolescent Functional Assessment Scale (CAFAS), to assess change in functional outcomes. Thus the institutional commitment to quality monitoring and data use serves to fill the gap between the program's intent of routine outcome data use and the lack of a direct feedback loop between the county and direct treatment providers regarding such data.

Throughout interviews with informants at all levels of the system, a tension was also identified between state and county contracting requirements and the desire to provide population-based, high-quality care. Cost and reimbursement models were identified as drags on quality processes. One leader stated, "Even if you haven't hit your overall contract amount, if in these different categories of funding you see more consumers, or you see the right number of consumers that they've contracted you with, but you spend more, they don't reimburse you, so you lose money." Constant monitoring of these outputs was seen as a major impediment to spending time on quality-related activities.
In addition, significant unrealized revenues related to county-identified deficiencies in the charting of service provision have led to a renewed focus on Medicaid-compliant charting of medically necessary mental health services. Pine Street has recently hired former county chart auditors, experienced with local chart audits, to provide internal procedures, training, and feedback to staff on successful charting. There have been limited attempts to connect the quality and compliance functions, resulting in ongoing tensions generated by competing messages about the relative importance of service quality versus service documentation.

INTERNAL DRIVERS OF PERFORMANCE MANAGEMENT The CEO of the organization indicated that an internal focus on outcomes-based care has long been a hallmark of work at Pine Street. She stated, "As an organization, we've always been really focused on practice. We developed a core practice model, probably in 1999, 2000, which is now the core practice model for the state and for the county of [redacted]. We went through a whole process." Leadership responsible for training indicated that this practice model is both behaviorally specific and widely used across the
agency: "The core practice model training that we have… was meant to be better behaviorally defined and more intensive than the ones that were being offered by the county and state. We've trained almost all of the existing staff on the core practice model, but there's always attrition… Those principles are then pulled forth into all the other training."

Training personnel also developed a method for tracking staff fidelity to the practice model: "The original proposal for collecting data about how well people were adhering to the core practice model was to use directive supervision as a way to do that, because we craftily designed all of the behaviors composed in the core practice model to be the same behaviors that were evaluated on the directive supervision. Make it all aligned and then we could take the directive supervision data and say, 'These are ratings on the core practice model behaviors.' That was really well and good until more recently we decided to stop using directive supervision…" Several program staff, when asked, appeared hesitant to provide a reason why directive supervision was no longer in use. The cause of the change notwithstanding, it appears that this was a significant shift, and that a replacement process for tracking the use of core practices across clinicians and programs had yet to emerge at the time of these interviews.

Other internal processes to monitor service quality and operations were also mentioned. A Quality Management administrator indicated that they are beginning to promote a "Quality Cup," a strengths-based approach to identify good practices by region across measures of consumer improvement. This administrator indicated that at this time, "…because we didn't have enough outcome measures, because our compliance was low, we actually went to look at outcome measure compliance. Hopefully, next year we can look at outcome measure, at the actual scores of improvement." The CEO also described an agency-wide performance management process, stating, "This year… we meet with the vice presidents every quarter and we look at every element of the program. We look at finances, staffing, customer satisfaction, consumer satisfaction, if we have the data. We call them Ops reviews… The finances really drive everything that we do, unfortunately." This was contextualized by another administrator, who stated:

Last fiscal year we had some pretty major financial challenges. The quality care plan, those goals were set aside. We used to have a project management office that is no longer there. They were supposed to actually measure all the efficiencies across all the sites. We have to ask other staff to do double duty to that. When we went into the remediation mode, all trainings were held off, were put on hold.

DATA SYSTEMS DESIGN Stakeholders described a series of data collection and reporting platforms designed to help inform decisions for different functions of the agency. There appears to be a profusion of systems used to capture and report on important processes and outcomes at the agency. One stakeholder listed seven different electronic systems for data entry and reporting (an EHR system, a claiming system, a system providing child welfare data, a medication management system, an outcomes measurement system, two systems for incident reports, and a payroll and time sheet system). There does not appear to be a clear process for managing and integrating data across these systems and their functions. However, staff did indicate a conscious effort to integrate as many outcome measures as allowable by copyright into their electronic health record application. One administrator stated, "That allows us to warehouse that data rather than have it unusable for the larger purpose, and stuck into paper files. For me, I think that has been a big difference between us and other large agencies, even in LA County that we have that integration into the electronic health record." Additionally, the measures are built into the system with clinical output in mind, in order to enhance clinical decision-making: "There are reports that have been built in to enhance the application of the findings from those measures. The reports for those that we have a
copyright available, it will score it for them and allow them to see the results." Staff also had a number of suggestions about the types of information technology (IT) supports that could facilitate greater clinical use of data from these measures: "…if outcomes were on tablets and that the kids could just punch in their answers and it would score right there and then they could talk about it, there would be an enormous up-tick in, number one, the compliancy of the measures and the clinical conversation around the measures." They also noted the difference between well-funded demonstration projects and resources for ongoing data capture and reporting: "…all of those dissemination, implementation issues worked really well in this when we had nine million dollar MacArthur grant to pay for a nice Web interface. But when you have it with the Excel Dashboards, there's just a lot more practical challenges to getting people to even open up their Excel document, to make those updates." Taken together, these comments indicate that staff continue to search for solutions to the often overwhelming challenges of integrating data across different platforms and data types, amid shifts in the IT resources available to bridge the gap between data entry for compliance and data use for improved clinical decision-making.

CULTURE-BUILDING AND DECISION SUPPORTS: There appeared to be two competing priorities at the time of the visit, which were not clearly reconciled: a focus on compliance to reach revenue targets or cap revenue losses, and a focus on quality, which has defined the agency's approach in the past. This appeared to be an ongoing source of discussion, without a clear resolution for persons at each level of the system. The use of a "Quality Cup" to motivate a focus on clinical outcomes, as well as multiple respondents' descriptions of how data have been tailored for use by specific audiences, provide evidence of ongoing efforts to build a quality culture even in the face of resource constraints. One administrator involved in clinical training efforts stated: "…[D]ata needs to be managed. It needs to be loved and massaged and prettied up and sold. It needs to be sold." She went on to describe an instance in which providing the appropriate graphic helped motivate providers to sign up for new training in an area of practice needing improvement:

We were able to go to them and say, "We need people to consume this data, but we need it to look like something they can find on Facebook." We need an infographic with little hearts and people, and being like "50 percent." [laughs] They did that, because we weren't familiar with that software and they were, and they created these lovely... infographics. I pitched to the director's leadership and she pitched to manager's leadership. She had the harder job, because it was like a 150 people in that audience, but when people saw that infographic, they're like "Oooh." It's all about if it's reaching them in their language. Then people were really on board with the idea, like "Everyone is affected by this, we really need to do something about this." Then we saw an up-tick in interest in the trainings, signing up for the trainings.

There has also been attention to the need to translate data into meaningful supports for practice: We have been… [providing] a lot of training and support and so we have a meaningful outcomes training that is required for all staff when they first come into the agency that goes through both, the pragmatics of use of that data, like how do you even manage it, where do you find it, what buttons do you click. But then [it] also goes through the clinical applicability of the measures. How do you have a conversation with the family about consent, how do you have a conversation with the family about the scores, how do you describe what these scores mean to a family, and then how do you use them in your treatment planning.

These efforts bespeak an ongoing and deep commitment to ensuring the usefulness of data for practice. These initiatives are occurring in an environment of seemingly competing pressures toward a focus on compliance or on quality.

A front-line staff member reported that there are now multiple reviews of documentation for compliance, and that these initiatives may be diverting time and attention from the clinical work at hand: The idea is, so that there are lots of eyes on it to make sure nothing's missing and everything's done that needs to be done. The reality of it is, everybody has a different understanding of what needs to be done and how it's supposed to be done. I'm told by three or four different people that either, "This wasn't done right," or "This is missing," or, "That person told you wrong," and I need to redo things. …it's maybe frustrating to me because I have to listen to somebody else tell me what to do with my paperwork, when I already have two other people telling me. For me it's like, OK, I have a feeling like I know what I'm doing, and I'm getting it done, but I have so many people giving me other task[s] to do, not just my supervisor.

This lack of clarity regarding the process, and the absence of clearly replicable standards across reviewers for what constitutes appropriate documentation, have led to unnecessary frustration on the part of front-line staff. Ultimately, compliance and clinical quality improvement efforts need not be in tension with one another. As one administrator noted, "I have shared …that if we change practice for the better, then we won't have to worry about the financial piece, all of these things will fall into place. We have these problems because we have bad practice." Identifying appropriate, targeted training to improve practice depends on the ability to frame and communicate clinical outcomes data in a way that makes clinical sense to practitioners and their supervisors. One clinical training administrator provided an example of how clinical outcomes data at one center indicated a need to shift training efforts to be more in line with clients' presenting problems at that site. Presenting those data to staff at that site confirmed their clinical experience and led to a shift toward training in a treatment protocol more closely aligned with clients' presenting problems. Such win-win situations, in which data are used to solve meaningful clinical problems, promote a culture of data use and continuous quality improvement.

SUMMARY Pine Street has an extensive history of focusing on quality as the route to organizational and community success. There is ample evidence that the resources invested in quality management and improvement efforts at Pine Street add substantial value to their service delivery efforts. Clinical training staff at Pine Street have exceptional expertise in training personnel on the flexible use of evidence-based practice elements. These efforts are also supported by a unique funding stream provided through the Mental Health Services Act. In this instance, organizational quality improvement efforts and funders' resource streams and mandates dovetail in a manner that clearly supports the use of increasingly effective treatment practices. Pine Street also has a tremendous asset in the form of research and evaluation staff dedicated to making practice and outcome information accessible to persons at all levels of the organization. The extensive use of infographics represents one way in which these staff work to bridge the gap between aggregate numbers and figures and representations of those data that are intuitive to clinical supervisors and staff. Numerous examples of efforts to bring data into day-to-day clinical decision-making speak to these individuals' ongoing commitment to making data meaningful for, and impactful in, the lives of children and families. The agency also has made a considerable investment in staff who can bridge the gulf between agency culture and the requirements of county and state funding bodies. These staff, though focused largely on compliance, have also indicated in their responses an understanding of the primacy of quality in framing any compliance issues.

OPPORTUNITIES. The primary concern voiced throughout interviews across stakeholders is that efforts at regulatory compliance will overwhelm efforts to individualize and provide high-quality behavioral health care to children and families. Creating greater alignment between quality improvement and documentation compliance efforts would strengthen the rationale for, and may streamline the processes involved in, both. There appears to be little communication or effort to integrate practice improvement and documentation trainings. Such integration might help frame documentation improvement efforts in clinical terms understandable by front-line staff, and might require documentation staff to translate their requirements into ways of documenting that serve both regulatory and clinical quality improvement purposes. There also appears to be a substantial opportunity for streamlining and integrating data management platforms and data sources. Creating an inventory of all systems used, the data provided by each system, and the uses of such data at each level of the organization may be a useful starting place in identifying where cross-platform communication is most critical and where data duplication may be eliminated. Efforts to create cross-platform Key Performance Indicators may also be a useful way to organize this work. The generation of such performance indicators can focus attention on the key data elements that must be entered, communicated, and acted on regularly and at all levels of the system. Generating targeted performance improvement initiatives using the indicators can reinforce that data are not simply for review, but are primarily generated for use in decision-making. Success in such an effort may build wider support for the use of data-based decision making and quality improvement cycles when the agency faces important clinical and strategic dilemmas.
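As a concrete starting point, the sketch below illustrates the kind of system inventory and duplication check described above. The platform names and data elements are hypothetical placeholders, not the agency's actual systems.

from collections import defaultdict

# Hypothetical inventory: platform -> data elements it captures.
inventory = {
    "EHR": {"client demographics", "diagnoses", "progress notes", "outcome scores"},
    "Claiming system": {"client demographics", "service codes", "units billed"},
    "Outcomes system": {"client demographics", "outcome scores"},
    "Incident reporting": {"client demographics", "incident type", "incident date"},
}

def duplicated_elements(systems):
    """Return each data element that is captured in more than one platform."""
    element_to_systems = defaultdict(set)
    for system, elements in systems.items():
        for element in elements:
            element_to_systems[element].add(system)
    return {e: sorted(s) for e, s in element_to_systems.items() if len(s) > 1}

for element, systems in sorted(duplicated_elements(inventory).items()):
    print(f"{element}: captured in {', '.join(systems)}")

Run against a real inventory, a check of this kind would surface the data elements entered in multiple places, which are natural candidates either for cross-platform Key Performance Indicators or for elimination.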


Owanee Children's Services

AGENCY DESCRIPTION Owanee Children's Services' administrative headquarters perches on a hillside, shaded by trees, above a residential neighborhood. Inside, the center hums with activity: trainings, open co-working spaces filled by staff, and cubicles throughout which people work, talk, and move. The impression of the center is one of an ever-reconfiguring pattern of activity. The agency began in 1985 as a single residential treatment home, and has grown organically by pursuing new contracts and developing new services, as well as via mergers with other agencies. Owanee has become one of the largest providers of children's behavioral health and child-welfare-related services in a large state; it has also begun providing school-related services in an additional state. The agency provides a wide variety of services including family finding and adoption services, as well as intensive behavioral health services. Behavioral health services are provided in clinic (office) settings, in the community, and in schools. The agency has an explicit 'no-eject, no-reject' treatment policy, and consistent with this policy, it has been seen as a leader in the development of Wraparound and other intensive community-based services in its state.

PARTICIPANTS Interviews were conducted over the course of two days in the winter of 2015. Participants included: the Chief Executive Officer, Chief Operating Officer, Chief Financial Officer, Director of Information Technology, Information Technology Support Manager, the Executive Director of Strategic Development and Evaluation, Executive Directors of two County-based programs, and a Lead Clinician.

EXTERNAL DRIVERS OF PERFORMANCE MANAGEMENT Owanee Children's Services (hereafter referred to as Owanee) operates in twelve different counties. Within this county-based system, counties have substantial leeway in how they operationalize and administer state Medicaid funds for behavioral health services. Owanee also often braids funding from multiple child-serving sectors, frequently including the child protective service sector, to pay for various aspects of high-intensity services. The resulting complex set of documentation and performance requirements has strongly influenced administrators' development of performance management capacities.

For instance, Owanee has an in-house Information Technology (IT) system designed to "connect different types of programs and their data needs." This has led to a profusion of reports. The IT Director indicated that there are "hundreds of canned reports" which can be accessed to meet data reporting requirements. However, many of these are subject to the changing information desires of counties, such that now "most of these [reports] are not relevant." This changing landscape of documentation requirements and reporting needs has led to ongoing efforts to adapt the reporting capabilities of the current system so that it is more dynamic. In addition to the requirements of external funders, Owanee's focus on serving children with very intensive needs led it to seek accreditation from the Joint Commission on the Accreditation of Healthcare Organizations (JCAHO). JCAHO has increasingly focused on using measurement to implement ongoing quality improvement efforts with clearly defined impact. The Director of Training indicated that JCAHO now "wants a 'tracer' of how training leads to model-fidelity" in treatment. The Director indicated that there is ongoing interest in determining how staff-level data impact staff members' ability to develop treatment competencies. She indicated that some staff currently see such data collection as 'Big Brother' and that the data are perceived as being used punitively rather than as helping develop staff strengths and competencies. There are some structural, external supports for staff engagement and development. State legislation has provided funding for training a diverse workforce. Owanee has used these dollars to develop employee resource groups for self-identified LGBTQ, Black, Asian-American and bilingual staff. These efforts have led to the identification of, and increased awareness of, staff micro-aggressions directed towards underrepresented groups. They are now working to use these insights to generate a curriculum which includes "tools that we can replicate" across contexts and programs.

INTERNAL DRIVERS OF PERFORMANCE MANAGEMENT Owanee has developed and proceduralized a specific model of practice, based on a particular understanding of attachment theories and informed by principles of behaviorally based intervention. This model, referred to as 'Unconditional Care', is based in the philosophy of 'no-eject, no-reject' and states that any child can be served through Owanee's programs, and that no child will be refused care because of their clinical presentation. All employees new to Owanee are trained on the model of care. Over the past few years there has been a move to update the treatment model to incorporate a dual focus: using attachment-based interventions designed to disconfirm maladaptive beliefs about relationships, and using behavioral interventions to understand triggers of problematic behaviors and to strengthen and increase the client's use of positive behaviors. Efforts to further define supervisory supports for the practice are underway but have not yet resulted in codified supports for supervisors to routinely use. Similarly, efforts to define and measure critical elements of this practice have not yet resulted in a series of clinical practice or outcome indicators which could evaluate the efficacy of elements of the intervention approach. In the absence of these practice-based indicators of quality, the quality management culture at Owanee has focused on the use of widely available metrics to ensure fiscal stability and the delivery of services to clients, and to comply with shifting quality standards across jurisdictions. These efforts continue to form the core of agency performance management. Program directors continue to use productivity metrics, overtime data, and compliance and training plan documents to manage performance. Fiscal reports are used to track contract spending. One program administrator, in describing how performance is managed, stated that the use of such metrics "used to feel arbitrary."


However, when these data were provided to program staff with rationales for their use, new policies around their use, and team incentives for maintaining high rates of billable hours, and when staff who achieved the expected rate of billable hours were highlighted, the result was 'much higher' rates of billable hours and a change in staff attitudes towards the use of the metric. The takeaway for the program administrator was that it was important that the metrics were 'tangible for folks' and that the process "doesn't feel so micro-manage-y." This orientation to quality, measured using compliance and output indicators, has led to a profusion of efforts to ensure appropriate documentation is in place to meet service and billing audit standards. Given the number of different jurisdictions in which Owanee delivers services, the magnitude of these efforts created tension between administrative and program delivery staff. One staff member noted that "We have to be careful with our urgency" in addressing perceived problems in adherence to documentation standards. Some staff members see the use of compliance and billing data as intrusive and part of a punitive rather than a helpful approach to achieving quality. A recent effort to reduce the scale and scope of documentation requirements was noted as successful by several central office administrators, but a program director indicated that "We'll believe [requirements have been reduced] when we see it," and that it did not "feel like it [has] happened" yet in a meaningful way. Though there has been both an historical and a current emphasis on using fiscal and compliance indicators as barometers of program success, Owanee staff also describe a series of experiences over the past six or seven years which have heightened awareness of the importance of developing an internal performance management system and culture. Much of this has come from staff members' concern about their ability to identify which aspects of programs are necessary and effective in improving children's functioning, and which aspects of treatment programs are unnecessary or ineffective. The director of Evaluation indicated that several years ago she came to a point in her work where she was not sure which Owanee interventions were effective, or why. Then she was trained on Multi-Systemic Therapy, and she discovered that its framework for understanding what is working and what is not, and how to address it when it is not working, was "…exactly what we need to be doing." In the years which have followed, Owanee programs have engaged in multiple data collection and reporting efforts to better understand the processes and outcomes of treatment. Though many of the reports available on the agency-wide intranet reflect a primary focus on collecting rather than acting on data, new efforts are underway to provide feedback tailored to identifying which care practices are effective. In contrast to the state of affairs described by the IT Director, who stated that the current IT system was "designed to be a big black hole" while data feedback "for clinical purposes is still tiny," these efforts reflect an intentional, ongoing effort to collect and feed back data specifically for improving the quality and outcomes of intervention. Three such efforts were described by Owanee staff. The first, described by the IT director, is the largest in scale, but also the least developed effort.
The IT director indicated that they are working to reduce the data collected to a 'minimum data set' needed to improve the quality of services and meet compliance requirements. Right now, he stated, it "feels like we've barely started" with this winnowing task. One barrier to progress reported by the IT staff is that they are "Just dealing with fires," evidenced by a to-do list which reportedly runs more than 500 items long. A second effort, which has been underway now for several years, is the use of the Partners for Change Outcome Management System (PCOMS) scales. These scales, which tap session-by-session variation in the therapist-client alliance, are entered online and graphed over time for each participating client.

These data help keep the therapist's practice centered on efforts the client perceives as consistent with their readiness for change and on the interventions which help them change. Owanee pays for training on the use of the measures, and has supported interested staff members in becoming trainers on the approach. Owanee also pays for access to the PCOMS website where results are entered and client engagement in treatment is graphed over time. The use of these measures is voluntarily undertaken by clinicians and is not examined in relation to any other treatment outcomes, or used to assess for differential engagement across groups of clients with different clinical, contextual, or demographic characteristics. The third ongoing effort is being undertaken by the director of Evaluation and one of her staff. Clinical outcome data obtained from the Child and Adolescent Needs and Strengths (CANS) tool are now rolled up into quarterly, biannual, and annual reports and provided to programs. These data include clinical, functional, strengths, and family data. Per the director, staff are just starting to use the CANS data for performance management, and there is "not [the] structure or process [yet] in place for communication [of the CANS data] with [all] staff." The effort was designed to bridge the gap between the processes and outcomes of performance management, given that, "If they're not seeing the outcome…my guess is that they're not invested in the process" of quality management. The director reviews these reports, often in person, with each program. As another program director stated, if it's "more collaborative, staff feel included," and you "have to partner to make it happen." The feedback process is used to iterate the reports so that they are maximally useful in understanding how practices lead to client outcomes. The staff member evolving these reports is using Microsoft's Business Intelligence (B.I.) application to rapidly create reports which can be automatically run and disseminated. This effort is exemplary for its evolutionary approach to report design, focus on outcomes, and explicitly participatory process.
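For illustration, the sketch below shows one way a quarterly roll-up of CANS data of the kind described above might be computed, assuming hypothetical column names and using the pandas library; the agency's actual reports are built in Microsoft's business intelligence tooling rather than in Python.

import pandas as pd

def quarterly_cans_summary(df):
    """Average CANS domain scores by program and calendar quarter."""
    df = df.copy()
    df["quarter"] = df["assessment_date"].dt.to_period("Q")
    return (df.groupby(["program", "quarter"])[["needs_score", "strengths_score"]]
              .mean()
              .round(2))

data = pd.DataFrame({
    "program": ["Wraparound", "Wraparound", "Foster Care", "Foster Care"],
    "assessment_date": pd.to_datetime(["2015-01-15", "2015-02-20", "2015-01-30", "2015-04-10"]),
    "needs_score": [12, 10, 8, 7],
    "strengths_score": [5, 6, 9, 10],
})
print(quarterly_cans_summary(data))   # one row per program-quarter

Once a roll-up of this kind is automated, the in-person review meetings described above can focus on interpretation and on iterating the report design rather than on assembling the numbers.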

DATA SYSTEMS DESIGN Owanee has designed an intranet which is intended to create common threads of data across programs, geographies, and regulatory requirements. Owanee's breadth of programs implemented across two states and a multitude of counties, each with their own service sector and locale-based requirements, would strain nearly any effort at standardization. Yet this system has continued to evolve over time and to provide key program compliance, output, and fiscal status metrics. Some clinical data have also been integrated into this system. The human resources and training departments rely on other data systems, which do not transmit data outside of their own database structure. Thus human resource and training data are not centrally integrated or used to answer cross-cutting questions, such as, "What skills are well-represented in the individuals we hire?" or "Are we recruiting and retaining staff whose cultural backgrounds and linguistic capabilities match well with the clients we serve in a particular locale?" The training director indicated that analyses of recruitment and training efforts have "…no integration with outcome data." The human resources director indicated that many critical processes, such as staff evaluations, are still paper-based. As another administrator stated, it would be "nice if we didn't need to pull information from so many places" in order to manage performance.


SITE VISIT SUMMARY Owanee Children's Services encompasses a large, diverse set of programs developed out of a commitment to serve any child, no matter the type or intensity of the behavioral and emotional challenges they face. The development of performance management capabilities across Owanee's programs is undergoing a shift from primary reliance on fiscal, compliance, and output-based indicators of performance to process- and outcome-based indicators of performance. Support for this shift is most deeply rooted in practitioners' and managers' need to understand which interventions work and to continually focus clinical interventions on the practices most likely to benefit a given child and family and their often complex needs.

STRENGTHS. Personnel at Owanee understand that they work there because of a shared desire to help children and families with often very complex needs. Staff have internalized this core value and vision. Furthermore, Owanee's 'Unconditional Care' approach has been codified in a book which serves as a practice guide for all treatment staff to utilize. There is a brief, intensive training provided to all entering staff on not only how to use this attachment-based approach but also how to pair it with effective behavioral interventions. This attention to equipping staff with a core set of practice skills is exceptional. Recent efforts to identify barriers to the advancement of diverse staff are also an example of Owanee's quality-focused commitment to its core mission. Engaging in a likely difficult but important ongoing effort to identify and act on behaviors which facilitate (or undermine) inclusion and advancement is an example of how a need identified in one department (Human Resources) has implications for practices across all departments and likely ultimately impacts child and family outcomes. Furthermore, Owanee staff have been empowered by administrators to utilize practice-based process and outcome measures, including the PCOMS and CANS tools, to better understand the outcomes of care. The use of the PCOMS tool has been entirely voluntary and led internally by interested employees. This ground-up approach to selecting and implementing measures has likely reduced commonly encountered resistance to the imposition of quality management metrics, and allowed adoption of the tool to spread based on its perceived usefulness. The use of the CANS tool is of note in that the methods being used to make sense of the outcome data it provides are also participatory. The use of repeated in-person meetings by administrators, report developers, and program personnel to understand results generated by these ratings offers a route to meaningful use of the tool for performance benchmarking and improvement. Additionally, the skill of the individual developing these reports by utilizing newly available analytic and visualization software is a clear asset to the effort. The use of rapid prototyping and deployment of results based on end-users' specifications likely greatly improves their prospects for uptake and meaningful use, and is consistent with an empowerment-oriented approach to performance management.

OPPORTUNITIES. Owanee's administrators and program directors have generated a cultural and data-driven response to the challenges of performance management in a vast agency. The agency culture focuses on the inclusion of all children and the need to provide responsive services to these children and youth. Data-based decision-making is clearly evident in administrators' and managers' use of fiscal and output data.


There is an opportunity to extend these advances in meaningful ways that help agency staff better realize the potential of their values and approach, improving the morale and self-efficacy of employees and improving the outcomes of children and families. The focus on collaboration and participatory care modeled with children has yet to be replicated across functions and data silos in the agency. Administrators and staff are able to easily identify areas in which a more collaborative, integrated use of data can reap large rewards in terms of reducing duplicate effort and achieving positive impact. Incentivizing the leadership team to use data collaboratively and across functions would have clear and likely immediate impacts on program management and efficiency. One such application could be the use of outcome data to inform future training efforts. Identifying clinical sub-populations who are served particularly effectively or ineffectively may help show where training efforts are currently effective, and where they could be bolstered. Finally, there is not yet a clearly identified role or voice for families and youth in evaluating or directing Owanee's quality management efforts. Defining meaningful roles for families and youth in such efforts, as well as providing training and coaching resources for families and youth to meaningfully take on these roles, may help transform the discussion about quality from one focused on outputs to one focused on achieving important functional outcomes for children and youth. Though the current administrative and quality structure reflects persons who have a commitment to achieving meaningful outcomes for children and youth, it does not yet actually give families and youth a voice at the administrative level for defining and ensuring how that will happen. Such roles may help counterbalance the idea that quality efforts are an imposition on the work with the idea that efforts to achieve high-quality services are the work.


Sloan Creek Youth Services

AGENCY DESCRIPTION Sloan Creek CEO Ron Kastle may balk at the notion that Sloan Creek is run like a family. "I don't pay my Dad to be the maintenance guy who shows up once a week and pay him $40,000," Kastle said, referring to the kind of practice he thinks gives small agencies like his a bad name. "We don't operate that way." But, for this social service outfit, close-knit is an insufficient descriptor. "We've all been together for 15 years," said Kastle, referring to his senior managers. "We have a really dynamite team." The agency social worker with the least longevity has been at Sloan Creek five years. Clothing and workplace banter are, not surprisingly, casual. The Sloan Creek headquarters where we conducted our interviews is located in a low-slung 1970s professional building in a residential district of a western, drought-stricken, medium-sized city. The foster family program was next door; the agency's two group homes were located elsewhere, one in town. Offices and common areas were spacious, dark, and quiet. While most of the interviewing time was spent in the board room or staff offices, we did not encounter a single client, family member, or staff member other than those we interviewed in our day at Sloan Creek.

PROGRAMS Sloan Creek's three programs – group home care, foster care, and adoption – are largely unconnected programmatically, with minimal client movement from foster care to group homes and vice versa, but with substantial movement from foster care to adoption programs. The agency's original effort was foster care. A group home, housing 6 younger male teenagers, was added in the 1990s by a local human services pioneer. A second group home facility, built more recently, houses 12 slightly older males, most with histories of substance use. All are dependents of the public child welfare system. While other group homes in the area have been closing, Sloan's waiting list serves as a source of agency pride. The foster care program, supervised by the agency director, encompassed four case workers, all master's-level professionals, and served roughly 45 foster homes and 60 foster children at a time. Each of the foster families was receiving the traditional foster home rate; there were no specialized or therapeutic homes. The adoption unit involved one administrator and two social workers. They process 40 to 50 home studies annually.

Sloan Creek and all names in this case study are pseudonyms.


CASE STUDY PROCESSES Drs. Israel and McMillen jointly interviewed six staff members in a single day, toured the agency headquarters and foster family offices, reviewed Sloan Creek's 2014 annual report, the Performance and Quality Improvement section of their Council on Accreditation (COA) self-study, reports from a benchmarking effort, and de-identified printouts from several agency databases. Five of the interview participants were chosen by the researchers, based on job titles found on the agency web site. The CEO chose the other staff member for participation. All participants were informed of the voluntary nature of their participation and signed informed consent forms. The staff members were forthright, some eager to show and explain their work and the rationale behind their performance measurement efforts.

EXTERNAL FACTORS The agency reported few accountability pressures. They operated on public system contracts that were not performance-based, and the public agency did not require outcomes or process measures to be reported. Sloan Creek staff members were not actively involved in writing large grants or pursuing new contracts that required information on past performance. Staff members wrote mostly small grants to sponsor recreational outings and other events. To demonstrate their quality in these applications, they relied on indicators of staff longevity. There was no staff grant writer or development officer. Sloan Creek's CEO reported tension with the agency's group home licensing authority, but the tension was over how the authority classified specific unusual incidents rather than over the overall quality of care or outcomes. Repeatedly, the agency expressed confidence that they provided excellent care, perhaps the best in the state. "[The two Vice Presidents] and I, we want to be the best at everything," said the CEO. "I cheer for the Steelers, the best team in football history. I cheer for Jack Nicklaus and Tiger Woods, the two greatest golfers. We just want to be the best at everything." With such beliefs, they need little outside motivation to use additional or new performance measurement strategies. This agency did not appear to be one that was attempting to influence state or county policy. The CEO complained of not being able to get face time with a county department director. Instead, they reported that they waited for guidelines to change and reacted accordingly.

DEVELOPMENT OF PERFORMANCE MEASUREMENT CAPACITY Much of Sloan Creek’s performance measurement and quality improvement structure has been in place for over a decade and has changed little in that time. The move from another accreditor to COA this past year brought some additional procedures: the initiation of a formal PQI team and meetings and the introduction of client and employee satisfaction surveys. It appeared to spur a small amount of additional use of current data. The major innovation was voluntary participation in a new benchmarking effort with other agencies in their state. The early returns on this effort showed Sloan performing better than many of their peers on several measures, solidifying the staff’s view that they are providing excellent care.


STRUCTURE OF PERFORMANCE MEASUREMENT EFFORTS

PERSONNEL. There are no employees whose job descriptions are dedicated to performance measurement. Their COA self-study names the CEO as the person primarily responsible for performance and quality improvement. The Human Resources Director maintains several agency databases. The Group Home Director maintains the home-grown data solutions created for his program.

DATA MANAGEMENT. Data management systems at Sloan Creek span a wide variety of storage and access configurations based in part on the function they fulfill. In terms of access, the systems range from being accessible only to persons in possession of a specific desktop-based Excel file to Internet-based, web-accessible applications available to all persons with appropriate permissions, at any time. In terms of storage, they range from being hosted on the internet via commercial servers to being hosted on individual desktop computers, unconnected to an intranet. In addition, staff at Sloan Creek mentioned that several currently desktop-based databases would shortly be moving to the agency's internal intranet, increasing access to these databases. The breadth of configurations is represented below in Figure 1.

Figure 1. Data at Sloan Creek.

Desktop-based: Benchmarking Initiative; Individual Barrier Behaviors and Goals; Individual Virtual Accounts
Internet-based: E-Adopt
Transitioning from desktop to intranet: Staff Track; Foster Track; Rez Track

The diverse data capture and reporting systems are created and managed by multiple entities. Individual tracking worksheets are developed internally by the organization. The Staff Track, Foster Track, and Rez Track databases have been built out over time by a private database developer who sells a database with core functions and then builds out specific functions over time based on the agency's needs. E-Adopt is a standardized, commercially available, web-hosted platform for tracking adoption-related compliance and monitoring requirements (such as home studies). The ability of the database developers to adapt the databases to new tracking or reporting requirements was perceived as adequate to good, though concern was voiced about the ability of older databases to adapt to internet-based entry and reporting capabilities.


Data for the state benchmarking initiative are submitted to the benchmarking organization via an online form and then converted into summary worksheets and narrative for quarterly reports.

TYPES OF DATA. The types of data included in these worksheets, spreadsheets, and databases are diverse, reflecting the program mix at the Sloan Creek Youth Homes. Each of the four core databases (E-Adopt, Foster Track, Rez Track, Staff Track) is designed to serve specific functions for a specific population. "I don't want to measure things just to measure things to keep COA happy," said the CEO. "I want to measure things that are important." The four databases largely provide compliance and output tracking functions. The specific data they store and track reflect the different standards required by the county and state to serve these populations, and to ensure that providers (including both agency staff and foster families) have the training and certifications needed to continue to supervise and work with these children and youth. The worksheets and spreadsheets provided for the Benchmarking Initiative and used to track individual child and youth behaviors at the youth homes focus more directly on outcomes. The data provided to the state Benchmarking Initiative allow for comparisons, on a series of performance dimensions, to like providers also voluntarily participating in the Initiative. The Individual Barrier Behaviors and Goals spreadsheet includes daily behavior tracking data for each child. The Individual Virtual Accounts track the running total of dollars that a child or youth can earn while in the youth home.

FUNCTIONS

STANDARDS AND COMPLIANCE. Multiple databases provided data to ensure that compliance standards were met. Some of the databases used, including the E-Adopt, Staff Track, and Foster Track databases, appeared to be geared almost solely to compliance functions. These databases can generate reports on demand to identify whether or not specific events have occurred or standards have been met (such as a foster parent completing their required annual trainings). Others, such as the Benchmarking Initiative, provided a mix of data, including data on regulatory compliance with standards of care as well as outcomes data. Reports from the Benchmarking Initiative are available quarterly, and are provided by the manager of the initiative.
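The sketch below illustrates the kind of on-demand compliance check these databases support, such as flagging foster parents whose required annual training has lapsed. The field names and the 365-day rule are illustrative assumptions, not the logic of Sloan Creek's actual databases.

from datetime import date, timedelta

def training_lapsed(records, as_of, window_days=365):
    """Return foster parents whose most recent training is older than the window."""
    flagged = []
    for parent, last_training in records.items():
        if last_training is None or (as_of - last_training) > timedelta(days=window_days):
            flagged.append(parent)
    return flagged

records = {
    "Foster Parent A": date(2015, 3, 1),
    "Foster Parent B": date(2013, 11, 15),   # training has lapsed
    "Foster Parent C": None,                 # no training on record
}
print(training_lapsed(records, as_of=date(2015, 6, 1)))   # ['Foster Parent B', 'Foster Parent C']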

TREATMENT EFFECTIVENESS AND PROGRESS MONITORING. Other data sources were solely devoted to tracking treatment effectiveness. The Individual Barrier Behaviors and Goals worksheets were designed to track hourly and daily performance of desired behaviors, and to manage the specific undesirable behaviors that the staff think will serve as the largest barriers to the youth being served in a family setting. The Group Home's Program Director describes the barriers data:


What we do is put all three goals onto the barrier behavior sheet, so if he doesn't do that for the day, I'm going to give him a check if he doesn't do it. If he does it, he won't get the check. What we do is put those numbers onto a computer system for barrier behaviors. That kicks out a percentage each month. It tells us, "Alex keeping hands to himself, 70 percent. Alex didn't cuss, 100 percent." That means he's doing well. …We try to put the barrier behaviors, we track these because these are behaviors that would preclude them or exclude them from going into a foster home. Similarly, the Individual Virtual Accounts were designed to provide a running tally of dollars earned for positive behaviors that youth can access upon a positive discharge. It started as an incentive system. It came about because we had guys who were refusing to go to school, didn't want to do anything, couch potatoes. We said, 'We have to figure out something to get these guys motivated. Why don't we mandate that they get into these activities?' But not just mandate it, reward them for it. Kids respond to money. –Group Home Program Director Over time, the staff realized that these dollar values represented a cumulative indicator of overall client progress. The accounts and the barrier behaviors tracking system were helpful tools to monitor daily behaviors and to motivate children and youth to engage in positive behaviors in the home and community. Individual Barrier Behavior and Virtual Account spreadsheets are updated daily and provide data on a child or youth's current functioning in the home and program. These data are used by staff dynamically throughout the course of a day and week to adjust the focus and intensity of their efforts at behavior change with a particular child.
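The arithmetic behind these two tools is simple, as the sketch below illustrates. The data layout is an assumption; the agency actually keeps this information in spreadsheets, and the example values are chosen only to reproduce the percentages quoted above.

def monthly_behavior_percentages(daily_checks):
    """daily_checks maps a barrier behavior to a list of daily booleans
    (True = a check, i.e., the barrier behavior occurred). Returns the
    percent of days on which the behavior did NOT occur."""
    return {behavior: round(100 * checks.count(False) / len(checks))
            for behavior, checks in daily_checks.items()}

def virtual_account_balance(transactions):
    """Running total of dollars earned (positive) or deducted (negative)."""
    return sum(transactions)

checks = {
    "keeping hands to self": [False] * 21 + [True] * 9,   # 30 days, 9 checks
    "no cussing": [False] * 30,                           # no checks all month
}
print(monthly_behavior_percentages(checks))   # {'keeping hands to self': 70, 'no cussing': 100}
print(virtual_account_balance([5, 10, 5, -3, 8]))   # 25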

PERIODIC AND PROGRAMMATIC REVIEW OF PERFORMANCE. In addition to daily review of treatment data, performance and treatment data are reviewed regularly at meetings at multiple levels of the organization. This focus on systematic review of data, as well as its use in clearly defined program improvement activities, was catalyzed by the agency embarking on the COA accreditation process. As part of that process, leadership identified the need to formalize data review and performance improvement activities as part of their regular operations. In keeping with this focus on reviewing data and identifying areas of success and need, treatment data are reviewed at program-level meetings with program directors and their staff. Performance and treatment data are also reviewed at weekly Leadership Meetings with the Executive Director and all Program Directors. Staff relayed that these meetings both address day-to-day concerns and serve as forums for identifying longer-standing treatment and performance issues which may benefit from new interventions. The agency's Board also reviewed data at each meeting, but focused almost entirely on unusual events such as restraints and AWOLs. Staff prepared these data from individual unusual incident reports and tallied them for the board's perusal. Staff were asked to justify the numbers, especially if they had gone up.


STRENGTHS AND OPPORTUNITIES There are several strengths of the current performance management infrastructure at Sloan Creek. Over time, the leadership has flexibly developed and used a variety of data sources to continuously track and act on important program activities and outcomes, mostly focused on compliance and client progress. This is especially true in the group home programs, where home-grown data solutions allow the clinical team to quantify and track client progress. The ability to grow solutions on their own and use data as it is acquired is likely facilitated by longstanding relationships among the staff and leadership. The desire to compare performance not only to internal standards, but also to external benchmarks, is laudable and a sign of a culture interested in continuing to learn and grow in its performance management capability. There are also opportunities to grow current capabilities and prepare Sloan Creek staff and clients for changing future conditions. We offer a set of recommendations below designed to take Sloan Creek to the next level in their use of data to inform and improve their programs.

Cross-train at least one other employee on each database. The modest size of Sloan Creek has meant that staff take on tasks as needed, allowing growth in new areas of competence. Once such competence develops, however, the small agency comes to rely on a single person to perform important agency-wide functions (such as performing all data entry and managing all data output for a given set of programs or databases). This knowledge appears to remain siloed within a single person. This puts the agency's capacity to use data at risk if any of these key individuals were to be kept from the job, due to illness or other circumstances, for any length of time, or were to leave the agency. This is particularly important in light of the fact that Sloan's databases include critical data regarding both compliance and outcomes. Equipping multiple persons to interact with key spreadsheets and databases could both reduce the institutional risk associated with a single person holding this knowledge and provide growth opportunities for other staff.

Shop around for new database tools. Increased uptake and use of data systems throughout the organization may be facilitated by a move towards web-based data hosting and reporting. Such a move could allow greater access to such platforms, and allow off-site staff to complete data-related tasks efficiently. As funders develop integrated care funding and regulatory frameworks, agencies such as Sloan Creek may increase their scope of services and become more interconnected with organizations providing services spanning behavioral health, child welfare, and physical health. Transparency in operations and outcomes is likely to become even more important in such an environment.

Choose systems that have unusual incident forms included in them. Agency staff currently cull these reports monthly to create reports for board members. This is a process that should and could be automated.
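A minimal sketch of what such automation might look like appears below; the incident record fields are hypothetical, and the real work would be pulling the same fields from whichever system stores the unusual incident forms.

from collections import Counter
from datetime import date

incidents = [
    {"date": date(2015, 4, 3), "type": "restraint"},
    {"date": date(2015, 4, 17), "type": "AWOL"},
    {"date": date(2015, 5, 2), "type": "restraint"},
]

def monthly_tally(records):
    """Count incidents by (year, month, type) for a board-ready summary."""
    return Counter((r["date"].year, r["date"].month, r["type"]) for r in records)

for (year, month, kind), count in sorted(monthly_tally(incidents).items()):
    print(f"{year}-{month:02d}  {kind}: {count}")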


Plot key indicators over time. The new benchmark effort provides comparisons with other programs. Plotting indicators over time, however, allows comparison with how the agency has done in the past and indicates whether things are getting better or worse and whether improvement efforts are achieving desired results. For an agency that is currently pleased with its level of performance, trend lines will tantalizingly suggest that things could be even better.
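The sketch below shows how little is required to produce such a trend line once an indicator is captured consistently; the indicator name and quarterly values are invented for illustration.

import matplotlib.pyplot as plt

quarters = ["2014 Q1", "2014 Q2", "2014 Q3", "2014 Q4", "2015 Q1", "2015 Q2"]
positive_discharge_rate = [62, 65, 61, 70, 72, 75]   # hypothetical percentages

plt.plot(quarters, positive_discharge_rate, marker="o")
plt.ylabel("Positive discharges (%)")
plt.title("Group home positive discharge rate by quarter")
plt.ylim(0, 100)
plt.tight_layout()
plt.savefig("positive_discharge_trend.png")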

Broaden the data focus beyond risk management to include other types of information. Just as the organization has moved to embrace positive measures of child and youth functioning and progress, the information the board sees could be shifted to reflect this positive, growth-based approach. Currently, the board regularly grapples with and tries to make sense of unusual incidents. This likely underutilizes its ability to identify and help build on the successes of the agency and the youth it serves. The agency could, with only minimal effort, move toward graphing and discussing key indicators of youth progress, at least in the group home programs.

Track and use information in the Group Home Virtual Account database. Sloan's home-grown Virtual Account database is a treasure trove of process information that aligns with Sloan's priorities and culture and that can help set Sloan Creek apart from other providers in their region, especially in the area of positive youth development. Sloan could easily report that information out to board members and other stakeholders, including indicators such as the following: the percentage of clients who leave with cash earned and the average amounts of those allowances; the percentage of youth who engage in out-of-town community recreational outings each month; the percentage of youth who have had a date each month; and the percentage of youth engaged in a community service activity each month.

Schedule data collection. Regular collection and reviews of data are a feature of healthy learning organizations. The agency can build on the regular data reviews it now conducts by developing a calendar of chart reviews and client, family and employee surveys. Indicators derived from these efforts are most useful when examined consistently, at regularly scheduled intervals, over time. This way the agency can better anticipate and prepare for internal changes (such as staff turnover) and external changes (such as increased demands for collaboration or accountability).

Sloan Creek has built a tight-knit, well-sustained organization delivering measurable positive change in the lives of local children and youth. With modest, proactive efforts, Sloan Creek can increase its capacity for staff learning and growth and help ensure a continuing and expanding positive impact in the lives of the children, youth, and families it serves.

i Another pseudonym.


Vineyard Child and Family Services

AGENCY DESCRIPTION Meetings at Vineyard Child and Family Services, Inc. took place at two different sites. The first was a Victorian home, transformed into an administration building. The second was a modern office building, housing both medical services and Vineyard Child and Family Services' behavioral health and child-welfare services. The modern office building included both spaces where treatment services are provided and meeting spaces for staff. Travelling between offices highlighted the challenges of working across locations, as traffic was difficult and time-consuming to navigate. The staff member coordinating the visit indicated that the geographic scope of services and increasing traffic congestion on the roads made it increasingly difficult to balance responsibilities and coordinate work across sites. Vineyard Child and Family Services' growth, like the two counties' growth, has added complexities to managing performance across the organization.

PARTICIPANTS Interviews were conducted over the course of two days in the fall of 2015. Participants included the Chief Executive Officer, Director of Counselling Services, Director of Social Services, Director of Support Living Services, Electronic Health Record Administrator and Outcomes Manager, Program Coordinator of an Early Psychosis Detection and Treatment Program, and a Behavioral Support Counselor.

EXTERNAL DRIVERS OF PERFORMANCE MANAGEMENT Vineyard Child and Family Services recently pursued and achieved accreditation through the Council on Accreditation (COA). During this process, senior leadership determined that the organization's mission statement had not been meaningfully translated into program-specific, meaningful, and achievable goals. Over the past year, through a collaborative, iterative process, meaningful and measurable goals have been developed by the staff from each program area. This process has been described as being more difficult for persons in some program areas than others. In the program serving older adults, the concept of dynamic, meaningful goals was seen as particularly difficult to comprehend and develop.

Staff members' longstanding perception of stability, rather than the development of new skills or competencies, as the goal for these older clients was seen as an important mindset-related barrier to overcome in shifting towards more meaningful and achievable goals. Senior leaders also indicated that funders were a driver of performance management efforts. In one instance, outcomes reporting in a program designed to detect and treat early psychosis is funded by Proposition 63, the Mental Health Services Act. The program has a series of performance measures that must be reported to the county, which then provides the data to the state to draw down funds for the service. Leadership in the two counties in which the agency provides services differ on what they desire to fund. Interviewees specifically mentioned that leaders in one county were always looking to fund programs which sounded or looked new or innovative. This was seen as a significant challenge, as the majority of costs incurred by the agency are ongoing operational costs. It was also seen as a performance management challenge, as new programs required the adoption of new performance management systems or the generation of new performance management capabilities. Administrators also described working to anticipate the types of performance measures that external funders, public and private, would understand and appreciate. This was most clearly seen in a new focus on trying to identify and develop relevant 'social return on investment' indicators for programs.

INTERNAL DRIVERS OF PERFORMANCE MANAGEMENT

SHIFT IN LEADERSHIP. A series of personnel changes have resulted in ongoing attention to the development of quality management capabilities. The retirement of the previous CEO led to the succession of the current CEO from within the organization. The new CEO was described by one manager as "visionary and goal-focused." This potentiated a shift in focus towards quality management, and a series of personnel changes to support this new focus. One key change was in the Continuous Quality Improvement (CQI) Director. A staff member mentioned that the previous CQI director "didn't focus on data at all." The staff member stated that the implementation of a new outcomes tracking measure, the Child and Adolescent Needs and Strengths (CANS), "was a big push" and indicative of the beginning of a culture change within the organization.

PROGRAM MANAGERS' INITIATIVES. Program staff described the adoption of a specific practice model, called '3-5-7', and a specific measurement tool (the CANS tool) as important advances in moving towards an outcomes-based system. Two senior managers reported that attending a training for 3-5-7 led them to believe that this practice, with its focus on explicitly framing and addressing attachment and loss issues, could serve as a general practice model for their child-serving programs. The practice organizes intervention around the completion of three tasks, the asking of five questions, and the use of seven skills. These managers have since worked with the developer to adapt and integrate the practice into their typical flow of treatment. Managers also attended a training on the CANS and identified it as a measurement approach that could be used throughout their behavioral health programs. The CANS tool is designed to be used collaboratively with clients, and versions exist which explicitly address traumatic events and symptoms. The CANS tool is also free to use and utilizes a readily understandable scoring system.

Results obtained from the CANS can be aggregated across multiple levels of an organization or system. Because of these features, it was identified as a good fit for the organization. Administrators purchased a data entry and reporting system, and have been working with the system developers to generate increasingly meaningful feedback from the CANS data entered. They have also explicitly cross-walked treatment elements in their treatment for transitional-age youth with indicators on the CANS, in order to make explicit how the CANS is to be used for treatment planning and progress monitoring. Vineyard Child and Family Services has implemented several other measures for use in continuous quality improvement efforts. These include the adult version of the CANS, the Adult Needs and Strengths Assessment (ANSA). In addition, the Patient Health Questionnaire-9 (PHQ-9) and satisfaction measures have been incorporated into continuous quality improvement efforts. Vineyard Child and Family Services stakeholders indicated that new programs and treatment models are also being adopted (such as a new collaborative venture with a primary care provider) which will require careful adaptation of existing measures or adoption of new measures, as well as ways to track progress on those measures.

TECHNICAL SUPPORTS FOR QUALITY MANAGEMENT AND IMPROVEMENT Vineyard Child and Family Services administrators have made investments in several data capture and management systems. These include, but are not limited to, a billing and case management system by Anasazi, a CANS data entry and reporting system by Advanced Metrics, desktop and mobile telephone applications for the treatment program (SOAR) designed to detect and address prodromal schizophrenia, and internally developed spreadsheets for managing referrals and wait times. These information systems differ in their ability to provide meaningful feedback to persons at all levels of the organization. One key staffer indicated that "electronic records will help manage information, get quicker outcomes back to staff quicker." Additionally, an administrator stated that the Anasazi EHR is "great from the management perspective" in that it provides data to administrators about the number of clients served, revenue generated, caseload sizes, and productivity. There are also monthly CQI reports generated from the data system tracking SOAR clients. These reports provide feedback on indicators such as which goals the program is helping clients achieve and how outreach is being targeted; an internal spreadsheet tracks contacts with agencies. The spreadsheet is also used to send reminders to practitioners to complete their client assessments. Though these capabilities are useful, gaps remain. Reporting is specified as the responsibility of each program's Director. However, their ability to get and usefully integrate these reports into their operations is hampered by system constraints. For instance, one CQI staff member indicated that current reports are "not aesthetically pleasing" and are "hard to digest." Even worse, in two years of using the CANS, they have generated "no CANS reports." Quality Improvement staff indicated that they are continuing to develop such reports, but did not have a firm date as to when multi-level feedback would be provided. This means that any CANS data feedback must be generated internally by Vineyard Child and Family Services staff. In the meantime, CQI staff have utilized compliance staffers to enter CANS data into the database provided by Advanced Metrics. They have also confirmed the workflow required to "get therapists the data they need" and have incorporated CANS data into the 2015-2016 fiscal year program goals.

These workarounds indicate Vineyard Child and Family Services staffers' persistence in making meaningful use of the data at hand, even when it requires substantial additional effort to work around current IT system infrastructure bottlenecks.
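To illustrate the reminder logic behind the internally developed spreadsheet mentioned above, the sketch below flags practitioners whose client reassessments are coming due. The field names and the 90-day reassessment interval are assumptions, not Vineyard Child and Family Services' actual business rules.

from datetime import date, timedelta

def assessments_due(clients, as_of, interval_days=90, warn_days=14):
    """Return (client, practitioner, due date) for reassessments due within warn_days."""
    due = []
    for c in clients:
        due_date = c["last_assessment"] + timedelta(days=interval_days)
        if due_date - as_of <= timedelta(days=warn_days):
            due.append((c["client"], c["practitioner"], due_date))
    return due

clients = [
    {"client": "Client 1", "practitioner": "Clinician A", "last_assessment": date(2015, 7, 1)},
    {"client": "Client 2", "practitioner": "Clinician B", "last_assessment": date(2015, 9, 1)},
]
for client, practitioner, due_date in assessments_due(clients, as_of=date(2015, 9, 20)):
    print(f"Reminder to {practitioner}: {client} reassessment due {due_date}")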

DEVELOPMENT OF A QUALITY-FOCUSED CULTURE There have been ongoing efforts across multiple years to build a culture and infrastructure of quality management at Vineyard Child and Family Services. Examples of plans for improving quality include the 2008 Strategic Imperative, the 2011 Continuous Quality Improvement Plan, and the Agency Goals for 2014-15. These plans have become successively more concrete and connected to authority to act, measurable goals to act on, and measures to benchmark and track progress over time. This shift towards quality management has also led to personnel changes to support these new actions. Across multiple levels of the organization, moving towards using numerically based quality indicators has required change. One program manager indicated that the previous CQI director at Vineyard Child and Family Services was 'staff-focused' and "didn't focus on data at all." This culture existed at multiple levels of the system. Staff indicated that previously, administrators "told us what to do" regarding quality improvement, resulting in feelings of 'powerlessness'; now, staff and clients "came up with the solutions we're trying." Now, when there is a problem, staff at some programs feel empowered to "feed ideas back up" to managers to implement. This culture of more open problem-solving was on display in weekly leadership meetings and staff meetings; information is also communicated through a weekly newsletter to staff. However, significant culture-building activities are still underway at Vineyard Child and Family Services. One administrator indicated that 'not all program managers are bought in' to current processes to thread measures through program workflow. Front-line staff also have questions about using measures to track their competency in helping meet client goals. The program director of the Adult program stated that the "battle is mostly won" in helping people get "all on the same page" in using the ANSA to track clients' progress in meeting their goals. One of the most significant themes to emerge was the idea that outcome indicators have been integrated into Vineyard Child and Family Services' core practice model (3-5-7). Across multiple programs, the activities in the program have been explicitly cross-walked to and tracked with CANS-based indicators of progress. For instance, in the program serving Transition-Age Youth the treatment manual includes a series of activities and competencies designed to help a youth successfully and independently integrate into civic society. CANS items are used to track progress towards this integration. Treatment planning in the adult treatment and treatment foster care programs utilizes CANS-based indicators to delineate broad treatment goals and discrete sub-goals to track incremental progress. This drill-down to practice has helped ensure the relevance of quality management to work with clients and families. Additionally, this has begun to allow for a new kind of feedback loop: feedback to practice development. A treatment program director reported that their quality indicators showed that youth were not engaging with the program, and that a disproportionate number of youth were being discharged unsuccessfully within the first sixty days of treatment. These data led them to return to the treatment developer and work with her to enhance the engagement process utilized early in treatment.
This 'virtuous cycle' offers the promise of a true quality improvement process in which data from treatment then inform the development of the treatment itself.
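The indicator that prompted this change is straightforward to compute once discharge records include length of stay and discharge status, as the sketch below shows; the record fields and example values are hypothetical.

def early_unsuccessful_rate(discharges, window_days=60):
    """Percent of all discharges that were unsuccessful within the first window_days."""
    if not discharges:
        return 0.0
    early = sum(1 for d in discharges
                if not d["successful"] and d["days_in_treatment"] <= window_days)
    return round(100 * early / len(discharges), 1)

discharges = [
    {"days_in_treatment": 30, "successful": False},
    {"days_in_treatment": 45, "successful": False},
    {"days_in_treatment": 200, "successful": True},
    {"days_in_treatment": 120, "successful": True},
]
print(early_unsuccessful_rate(discharges))   # 50.0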

STRENGTHS AND OPPORTUNITIES

STRENGTHS.
1. Vineyard Child and Family Services leadership has invested fiscal and social capital in building the buy-in and understanding necessary to create meaningful and measurable goals across programs throughout the organization.
2. Vineyard Child and Family Services' leadership, including the Board and senior administrators, have made a series of personnel decisions which highlight the importance of agency-wide quality improvement efforts. These include the hiring and promotion, at multiple levels of the agency, of persons who support and engage in collaborative, quality improvement-focused efforts.
3. Program directors and staff have woven program outcome indicators throughout their core treatment model and other practices, connecting quality improvement data and practice.

OPPORTUNITIES.
1. Routinely provide accessible and immediate performance feedback for persons at multiple levels of the agency. Expertise is developed based on rapid performance feedback and coaching for better performance. At the time of the site visit, staff and supervisors did not have ready access to standardized, on-demand performance feedback, which may have slowed efforts to systematically improve outcomes at the practice level.
2. Identify shared, measurable outcomes which integrate goals across programs and provide a sense of the collective impact of Vineyard Child and Family Services on the communities it serves. Vineyard Child and Family Services is expanding the scope and
3. Build on the success of existing programs which have worked closely with practice developers to integrate outcome indicators into the workflow of a clearly defined treatment program. Spread the expectation of a manualized, trainable core practice model across all populations served by Vineyard Child and Family Services staff, along with relevant outcome indicators built into practice. Replicating this successful strategy gives persons a clear task which bridges daily practice and outcomes management, as well as ownership of their practice and its quality indicators.
