OFFICE OF CONTINUOUS IMPROVEMENT
PROGRESS REPORT 2004

Albert Murray, Commissioner
Shirley L. Turner, Director, Office of Continuous Improvement

Department of Juvenile Justice
3408 Covington Highway
Decatur, Georgia 30032

Table of Contents

Executive Summary
The Office of Continuous Improvement
The Standards of Excellence
    Methodology
    Rating the Standards
OCI Evaluation Reports
The Final Part of the Evaluation Process
OCI's Performance Assessment Questions
The Confidential Survey: "What our youth say"
Evaluation Outcomes
OCI: A Glance Into the Future
APPENDIX I
APPENDIX II

Executive Summary

We would like to congratulate the Department of Juvenile Justice on creating an exceptional quality assurance system.
--US Department of Justice Staff, Special Litigation Section, Civil Rights Division

The Primary Objective. The Office of Continuous Improvement's (OCI) primary objective is to determine the level of performance and the quality of services provided in juvenile facilities using DJJ's Standards of Excellence (SOE). In quality improvement terminology, the SOE are the "benchmarks" against which facility practices are assessed. Progress Report 2004 provides an overview of OCI's major activities and evaluation outcomes from January 2004 through January 2005.

The Evaluation Process. Since DJJ's facilities are distributed across the state, OCI's evaluation process utilizes two teams of professional evaluators. The teams are charged with closely examining the policies, procedures and institutional practices of facilities in their respective geographic areas. Each facility evaluation determines, over the course of several days, the level of performance and the quality of services provided to youth in the facility's care (see map insert below). The evaluators then rate the 196 standards contained in the ten "Service Areas" comprising the Standards of Excellence, such as Medical; Education; Behavioral Health Services; Food Services; and Safety, Security and Facility Structure.
A comprehensive report is prepared for each facility evaluation. The report provides a synopsis of the results of the evaluation, drawing on multiple data sources such as staff interviews; youth interviews; document review; and observation of the facility's operations, activities and physical conditions.

The Evaluation Outcomes. The process of benchmarking the four Youth Development Campuses (YDCs) and nine Regional Youth Detention Centers (RYDCs) reviewed in this report began in April 2004 with the evaluation of Macon YDC and was completed in January 2005 with the evaluation of Savannah River Challenge YDC. The majority of facilities OCI has evaluated to date met or exceeded 79% of the SOE requirements. In addition, seven of these facilities achieved 100% strength in one or more service areas.

The Office of Continuous Improvement

Background

In July 2002, the Georgia Department of Juvenile Justice (DJJ) Quality Assurance Evaluations Unit was reorganized under the name of the Office of Continuous Improvement (OCI). The role of the office was to develop a peer review system responsible for conducting performance evaluations in DJJ facilities and programs. The OCI staff, in conjunction with the DJJ Office of Training, developed and conducted training for participants consisting of facility administrators, District Directors, Regional Administrators, program specialists, and central office administrators. The OCI staff members served as facilitators and consultants of the on-site evaluation process; however, the assessments of service areas were conducted by a team of peer reviewers.

April 2004 brought reorganization for OCI. A dedicated multi-disciplinary team of specialists was recognized as the better quality assurance model for measuring performance levels within DJJ facilities and programs.
As at its inception in 1998, the reorganized quality assurance system includes a dedicated team of specialists within OCI. The specialty team, rather than peer reviewers, evaluates all quality service areas during a program evaluation and determines the level of performance and quality of services provided. This multi-disciplinary team includes specialists in areas such as education, health care, due process, classification, safety and security, and food service. The team of specialists is responsible for rating all components of a DJJ facility's operations. Performance ratings are determined through team consensus after the following occurs: examination of policies, procedures and other documents; on-site observations; interviews with staff, youth, parents and volunteers; and interviews with regional and district staff. Ratings assigned by these specialists indicate the level of quality of each particular standard and of the overall service area. The facility administrator uses this information to improve service delivery. Best practices identified in the field are used by DJJ to improve or influence policies, procedures and practices. In addition to conducting comprehensive program evaluations, the OCI staff is responsible for technical assistance and consultations, unannounced monitoring visits and other quality assurance activities.

The Department's Standards of Excellence

The constant benchmark for measuring performance levels and the quality of services has been the DJJ Standards of Excellence (SOE). The SOE were originally developed in 1999 with input from juvenile justice and corrections practitioners and from medical, mental health and education professionals. Resources used in the development of the standards included: DJJ policies, procedures and preferred practices; state statutes; practices in other states; and the Memorandum of Agreement between the United States Department of Justice and the Georgia Department of Juvenile Justice.
Also, national standards relevant to juvenile justice programs and other regulatory agencies were adapted. Since 1999, the SOE have been revised periodically to reflect new and revised DJJ policies and procedures, input from DJJ staff and identified best practices.

Breakdown of the Standards of Excellence

There are two types of SOE: performance and compliance. The performance standards focus on outcome measures, while the compliance standards require that all elements of the standard are met. During an evaluation, individual standards are rated according to their type. Specifically, performance standards are rated on the application of policies; whether the procedures and practices are aligned with department policy and the mission statement; and the degree of customer focus and service. Compliance standards are rated on the application of policy and verbatim adherence to the standard; the facility either does what the standard requires or it does not.

Evaluation Methodology

All evaluations begin with an entrance conference with the facility administrator and key staff. During this conference, the team supervisor explains the purpose of the visit and other information, introductions are made, and facility staff have the opportunity to ask questions and share information. The entrance conference is followed by a comprehensive tour of the facility, during which further staff introductions occur.

A multi-disciplinary team approach is utilized to provide a broader and more balanced perspective for program evaluations. The evaluation process also incorporates multiple data sources to evaluate the quality of services and compliance. More specifically, evaluators review program records and files to document the facility's adherence to DJJ policy, conduct interviews to document its procedures, and observe the facility's operations to document its practices.
In this way, each evaluator's findings are corroborated by the program's own records and processes, and standards rating decisions are based on these substantiated conclusions. Since professional judgment plays an important role in determining a standard's rating, evaluators must be prepared to explain the basis of their decisions during the team's consensus meeting. During the consensus meeting, the rating for each standard is open for discussion. The team may accept an evaluator's findings, request additional information, or change the rating as a result of its discussions. The consensus process allows each team member to share their findings and any other relevant information. The discussion of overlapping records, files, policies, procedures and facility practices that transpires during the consensus meeting ensures that related information gathered by team members evaluating other service areas can be incorporated into the final assessment. This process requires the participation of the entire team, generating dialogue that improves the quality of decisions. It also strengthens the validity of the final performance rating for each standard.

At the end of each evaluation, an exit conference is held with the facility administrator and key staff. The results and findings of the evaluation and other related information are discussed with facility staff at this time. Questions are answered, recommendations are made, and corrective actions are discussed during the exit conference.

Rating the Standards

Rating decisions are based on substantiated conclusions validated by each evaluator's assessment and through consensus decision making. A performance standard's rating is based on the quality level of the required outcomes and is rated Unsatisfactory, Satisfactory, Commendable, or Excellent. The rating for a compliance standard is determined by a facility's verbatim adherence.
The following diagram contains the definitions of the ratings used to measure the level of performance or compliance of a standard. These category terms are used in presenting findings during exit conferences at facilities and appear in the computer-generated reports of findings. Listed below are examples of report findings for standards that received ratings of Excellent, Commendable, Satisfactory, Unsatisfactory, Compliance and Non-Compliance.

Example of an "Excellent" Rating
Standard 9.01 Health Department and DJJ Nutrition Inspections
The DJJ Nutritionist and the Georgia Department of Human Resources (DHR) inspect the food service area at least annually. The DHR report was posted in the dining area and contained a score of 100%. The overall condition of the food service area was immaculate.

Example of a "Commendable" Rating
Standard 8.30 Staffing
The RYDC administrative staff is proactive in filling vacancies. Interviews with the Facility Administrator, Administrative Operations Manager, Training Officer and Lieutenants, along with the documentation reviewed, confirmed that vacancies are filled expeditiously. The facility administrator has an ongoing process for advertising and uses the Juvenile Correctional Officer (JCO) recruiter to fill vacancies. Most impressive was the turnaround time for employee packages, particularly for those who needed POST certification.

Example of a "Satisfactory" Rating
Standard 1.01 Mission Statement
The DJJ and Regional Youth Detention Center (RYDC) mission and vision statements are posted throughout the facility and are highly visible. Statements are posted in all living units and work areas in a manner that makes them accessible to all staff, youth and visitors. Interviewed staff members understand the mission of the department and the facility and are familiar with both the vision and mission statements. Observed interactions confirm compliance with the mission and vision statements.
Example of an "Unsatisfactory" Rating
Standard 2.30 Job Specific Training
According to training records, most health care professionals are current in CPR training. However, there was no completed documentation validating orientation for health care staff. All full-time medical staff should complete a formal orientation program.

Example of a "Compliance" Rating
Standard 3.18 State Mandated Tests
The state assessment program is conducted according to schedules and procedures established by the DJJOE and DOE. State-mandated tests are administered according to the guidelines prescribed by the DJJOE and DOE. After interviewing the lead teacher, it was determined that most youth were not eligible for the upper-level high school tests. Sample files included state-mandated test results. Only a few youth took the Stanford 9, Criterion-Referenced Competency Tests (CRCT) and the Eighth Grade Writing Test.

Example of a "Non-Compliance" Rating
Standard 3.25 Educational Personnel Staffing
A continuum of services is not provided to meet the needs of students. The facility currently does not have three full-time special education teachers as required and has not employed a full-time certified special education teacher for at least three months. Regular education teachers are allocated at a ratio of one teacher to 15 students.

Evaluators are also trained to be aware of situations in programs that may or may not be part of the evaluation. Evaluators are instructed to contact the team supervisor immediately when serious situations are suspected. The team supervisor then notifies the facility administrator so that immediate corrective action is initiated. These situations are known as critical deficiencies and red flag issues.
Definition of Critical Deficiencies and Red Flag Issues

Critical Deficiencies: Substandard performance or outcomes regarding mandatory, priority or essential standards that require immediate corrective action.

Red Flag Issues: Practices or actions that a reviewer believes to be illegal, unethical, a threat to the safety of youth and/or staff, or a threat to the security and order of the facility/program.

OCI Evaluation Reports

A report is developed after each comprehensive evaluation. The components of the report are: 1) Report Summary; 2) Summary of Performance and Compliance Ratings; 3) Performance and Compliance Standards Findings; 4) Detailed Performance Ratings; 5) Percentage of Performance and Compliance Ratings; 6) Standards Rated Excellent and Commendable; 7) Standards Rated Unsatisfactory and Non-Compliance; and 8) Methodology. Each report component is unique, containing specific sections of information from the evaluation process (refer to Appendix II). The report is a detailed document and serves as a tool to assist staff in making better management decisions that lead to continuous improvement in programs and services. A description of each section follows.

The Report Summary is a brief synopsis of the results of the comprehensive evaluation. It serves as the executive summary of the detailed report by providing a brief overview of each program service area. It also captures basic information such as the review date(s), members of the review team, facility name, and other identifying information.

The Summary of Performance and Compliance Ratings is a snapshot of the overall ratings of the performance standards and the results of the compliance standards in each service area. This page shows the calculated average performance rating for each service area and the number of compliance standards rated compliance or non-compliance.
The Performance and Compliance Standards Findings section presents factual information on events observed during the evaluation and summarizes the program's performance against each standard.

The Detailed Performance Ratings section is a breakdown of all of the performance standards by service area, with each standard rated excellent, commendable, satisfactory or unsatisfactory.

The Percentage of Performance and Compliance Ratings section reports a percentage score for each service area, calculated by dividing the number of standards that met the required criteria by the total number of standards in that service area. For example, if there were two compliance standards in behavioral health and both of them met the criteria for full compliance, that service area would be rated at one hundred percent.

The Standards Rated Excellent, Commendable, Unsatisfactory and Non-Compliance sections list the specific performance and compliance standards that received one of these ratings. The service area of each listed standard is also identified.

The Methodology section captures basic review information. It contains primary interview information, the documentation reviewed, activities observed, who attended the entrance and exit conferences, and any other additional information important to the reader. Specifically, the documentation review covers all documentation examined during the evaluation, such as files, logbooks, checklists, and policies and procedures.
The information listed in this section includes: 1) the entrance conference, the formal entry discussion; 2) the exit conference, at which evaluators give a preliminary overview of the report and share this information with the administrator and other designated staff; 3) activities observed, the monitoring of day-to-day operations in the facility; and 4) the primary interview section, an overview of all youth and staff interviewed during the evaluation.

The Final Part of the Evaluation Process

The OCI evaluation team developed a Performance Assessment survey intended to capture staff concerns, feedback and experience with the comprehensive evaluation process and the evaluators. Since its development, this performance assessment tool has been supplied to facilities for a sample of staff to complete after each evaluation. The final part of the evaluation process involves completion of the confidential Performance Assessment survey by a sample of the facility staff who were actively involved in the evaluation process.

The Performance Assessment survey asks staff about key issues related to customer service and their overall perception of the various components and activities of the evaluation process. The results of the survey play no part in a facility's evaluation or the final report. However, they do play a key role in OCI's internal continuous improvement efforts. The information obtained from the surveys is maintained in a database for research purposes and for developing quality improvement initiatives within the agency.

The survey questions provide customers the opportunity to rate and comment on the OCI evaluation team's performance and the customer service of individual team members, and to provide perceptions and comments regarding the overall evaluation process.
Additionally, this tool allows staff to identify key improvement opportunities for quality assurance activities and to provide recommendations for process improvement. This information is provided directly to OCI through completion of the Performance Assessment. OCI's database of Performance Assessment surveys provides a unique opportunity to constantly monitor the quality and the delivery of the service provided to all DJJ facilities during the evaluation process. It also provides for periodically reviewing information that may be used in updating the DJJ Standards of Excellence. Since the reorganization in April 2004, the survey has been completed and returned by at least 125 facility staff members.

OCI Evaluation Team: Sample of Performance Assessment Data, 2004-2005

Survey Question                                                                Agree   Disagree
The evaluator spent time with both staff and youth                              90%      10%
The team was responsive to questions during the evaluation                      86%      14%
Each evaluator was clear and concise about their findings
  during the exit conference                                                    82%      18%
Each evaluator was knowledgeable about policies and procedures                  82%      18%
Each evaluator sought out and requested additional documentation as needed      82%      18%
The evaluator provided helpful information                                      82%      18%

A sample of staff who returned surveys were polled to determine what aspects of the OCI evaluation process they found most helpful.
Some of those responses to the survey are listed below:

The most valuable aspect of the evaluation process is that it gives us the opportunity to find ways to improve our day-to-day operations and how to provide the best support services for our youth. --YDC Director

The evaluation process showed that many facilities need to have periodic evaluations to ensure adherence to policy. It was good that the evaluators actually talked to the general staff rather than just the administration. --RYDC Director

The evaluation helped us to look at some of our weak points and what we need to do to address them. --Captain

The evaluation process was able to measure our progress, and positive improvements were noted and appropriately communicated to the staff. --JCO II

This process gave us appropriate corrective criticism and showed us what was needed to improve our performance. --Secretary

The most valuable aspect of the evaluation is that it allows you to learn what you need to correct in order to become a better facility and a better administrator. --JCO

The Confidential Survey: "What our youth say"

OCI used a questionnaire to capture youths' opinions about their treatment in a facility, as well as how the facility implemented a number of standards covered in DJJ's Standards of Excellence. "What our youth say" provides facility staff a perspective on their practices.

I think that ....YDC is a wonderful place. The Director is the best director. She treats us like her own kids. She will discipline us when we need it and she treats us with a lot of respect and kindness. --Lewis, YDC Resident

Since their development, the interview questions have been administered to a random sample of youth during selected facility evaluations. Since their initial use in 1999, the surveys have been administered to over 600 youth in DJJ facilities. Included below are several statements from youth housed in RYDCs and YDCs.
Overall, most youth reported that they were pleased with the services they received. Youth commented that staff members go "out of their way" to help them with problems and needs and that the "facilities are clean." Additionally, Performance-based Standards (PbS) data indicate that most youth are satisfied with their overall experience at DJJ facilities. Youth also reported that DJJ facilities have good education and recreation programs, and stated that counselors, teachers and medical staff are very helpful. Staff and youth consistently reported that safety and security procedures are followed.

I think that this is a nice facility, with great staff. The school here is cool and nice and so are the teachers! --Kevin, RYDC Resident

Staff really care about us! They talk to us everyday. They warn us when we do something wrong. I know this is the best place for me. --Tony, RYDC Resident

The staff here is very helpful. They teach us how to take care of our things. I appreciate everything they do for me. --Tracy, RYDC Resident

I love the staff. They treat us real good. We get our hair done and we have to go to school everyday. I didn't do that at home. But, since I have been here, I want to change the way I do things. Thanks. --Kinyata, YDC Resident

This is the best facility. We have to follow a schedule and the people are really nice. They care for us. I want to thank all of the people in charge. --Samantha, YDC Resident

Evaluation Outcomes

The evaluation period for this Progress Report runs from April 2004 to mid-January 2005, covering five YDCs and eight RYDCs. Since the reorganization of OCI and the transition from a peer review system, the closest thing we have to a true "baseline" or initial measure of the overall strength of DJJ facilities in the 10 service areas can be obtained from the comprehensive evaluations conducted from April 12, 2004, to January 14, 2005.
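The baseline "overall strength" measure described above is an average across the ten SOE service areas. A minimal sketch of that arithmetic, using the service-area percentages shown in the Evaluation Outcomes figure (values read off the chart, so treat them as approximate):

```python
from statistics import mean

# Service-area strength percentages as read off the report's
# bar chart (approximate values taken from the figure):
service_area_strength = [64, 71, 73, 74, 77, 79, 85, 86, 87, 93]

overall = mean(service_area_strength)
print(round(overall))  # 79 -- the report's overall strength figure
```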
DJJ facilities' overall strength in the SOE service areas is 79%. In other words, on average, DJJ's facilities met or exceeded the vast majority of the SOE requirements. The figure below shows the overall strength by SOE service area.

Overall Strength by SOE Service Area
Safety, Security and Facility Structure    64%
Behavior Management                        71%
Student Rights & Services                  73%
Behavioral Health                          74%
Leadership & Program Management            77%
Training                                   79%
Admission & Release                        85%
Medical                                    86%
Food Services                              87%
Education                                  93%
Average                                    79%

Most of DJJ's facilities either met or exceeded the SOE requirements; that is, most of the 13 DJJ facilities evaluated met or exceeded each service area's requirements at 73% or above. In addition, seven of DJJ's facilities achieved 100% strength in one or more service areas. In numerical terms, there are four service areas where the overall strength differed the most across these 13 facilities: Safety, Security and Facility Structure; Behavior Management; Medical; and Leadership & Program Management.

A sample of facility administrators were polled from facilities that achieved 100% overall strength in a service area. The purpose of the poll was to determine what factors they felt contributed to achieving high overall program strengths. Their comments touched on leadership style, communication and "customer"-focused care.

"Planning and working together as a team; people taking ownership and responsibility; and being open for dialogue is what led to achieving high overall program strengths. Having program leaders with ethics and integrity contributes to the success. No program is an island. All are interdependent of each other." ---RYDC Facility Administrator

"Collaboration of the department heads on a daily basis allows us to identify deficiencies and to develop corrective actions to address those deficiencies.
It has been proven that when people have a chance to provide input on a daily basis, it tends to eliminate a number of problems that you would otherwise have in the facility." ---RYDC Facility Administrator

"As a leader, I make my expectations clear. I encourage all of my staff to participate in and carry out the mission and vision statements set by the department. I expect nothing less than the best." ---YDC Facility Administrator

"What has helped us make improvements during the past year was networking with OCI and other RYDC Directors. I have also empowered my staff to be responsible for their area of control." ---YDC Facility Administrator

"When you hire quality staff, provide good training and voice clear expectations---you get fewer customer complaints. I treat others as I want to be treated." ---YDC Facility Administrator

A Glance into the Future

Where we are going...

- Coordinate and provide education, presentations and information on strategic approaches to continuous quality improvement.
- Create an OCI newsletter to promote continuous quality improvement initiatives and educate staff on quality assurance issues.
- Conduct "Best Practices" workshops.
- Publish a "Best Practices" document.
- Partner with Southern Polytechnic State University to train OCI evaluators and other DJJ staff as instructors in Problem Solving Tools.
- Resume evaluations and monitoring of contract providers in residential and day programs.

Appendix I
Monitoring Regions and Facilities

The Office of Continuous Improvement's monitoring responsibilities are divided into two regions. The Northern Region comprises 15 facilities within a land area of 79 counties, and the Southern Region comprises 15 facilities within an 80-county area. OCI staff members frequently cross regional lines to ensure that each facility receives effective quality assurance services.
Appendix II

Office of Continuous Improvement
Comprehensive Evaluation Report

GLENVILLE REGIONAL YOUTH DETENTION CENTER
Evaluation Date: 9/14/04 - 9/17/04
Facility Administrator: Kim Lee
Rated Capacity: 30
Team Leader: Joe Lee (555-555-7813)
Reviewers: Leonard Flounoy, Alicia Mitchell, William Screws, Theresa Jones, Rafael Rosado-Ortiz, Sydney White

Report Summary

A comprehensive evaluation of the Regional Youth Detention Center was conducted on 9/14/04 - 9/17/04, utilizing the Department of Juvenile Justice's Standards of Excellence. The following is a brief synopsis of the results of that evaluation, based on multiple data sources such as staff interviews; youth interviews; document review; and observations of facility operations, activities and conditions.

Leadership and Program Management
Twenty-three percent of the standards pertaining to Leadership and Program Management are rated commendable and seventy-seven percent are rated satisfactory. The policies and procedures manuals are maintained in a neat and orderly manner and are accessible to all staff. The facility operational plan contains specific goals that are aligned with both the mission and vision statements of the Department of Juvenile Justice. The facility administration fosters a work environment that contributes to the well-being of employees.

Education
Seven percent of the standards pertaining to Education are rated excellent, fourteen percent are rated commendable, and seventy-eight percent are rated satisfactory. Students in disciplinary detention or isolation are receiving educational services according to DJJ policy and procedures, and the Alternative Placement Education Model (APEM) program is fully operational on a consistent basis. Educational records are in exceptional order.
Students are receiving 330 minutes of instruction per day. The facility has an effective educational internal quality assurance program. The education department is functioning at an acceptable level.

Student Rights and Services

Of the 17 standards in the Student Rights and Services area, the RYDC received two excellent ratings, two commendable ratings, and 13 satisfactory ratings. The performance areas rated excellent are Correspondence and Bed Linens, Washcloths and Towels. The areas rated commendable are the Grievance Process and Reading Materials. The areas rated unsatisfactory are Disciplinary Hearings, Disciplinary Confinement and Other Sanctions, and Disciplinary Pre-Hearing Confinement Monitoring. The success of the facility's grievance process suggests it has the foundation and staff in place to correct the problems identified in the area of due process. It is recommended that the Grievance Officer and the Disciplinary Hearing Officer share process ideas to develop a more effective disciplinary hearing process.

Note: Similar sections for the remaining seven service areas would appear at this point.
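The percentages cited in the service-area summaries above, and in the rating tables later in this appendix, are simple proportions: the number of standards at each rating level divided by the total number of standards rated in that service area, rounded to whole percents. A minimal sketch in Python (illustrative only; the function name and dictionary layout are our own, not part of any DJJ system):

```python
def rating_percentages(counts):
    """Convert per-level rating counts to whole-number percentages.

    counts: dict mapping a rating level to the number of standards
    rated at that level within one service area.
    """
    total = sum(counts.values())
    return {level: round(100 * n / total) for level, n in counts.items()}

# Leadership and Program Management: 3 commendable and 10 satisfactory
# ratings out of 13 standards, matching the 23% / 77% reported above.
leadership = rating_percentages(
    {"excellent": 0, "commendable": 3, "satisfactory": 10, "unsatisfactory": 0}
)
# → {"excellent": 0, "commendable": 23, "satisfactory": 77, "unsatisfactory": 0}
```

Because each level is rounded independently, the percentages for a service area may not sum to exactly 100 (here they do: 23 + 77).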
Office of Continuous Improvement
Summary of Performance and Compliance Ratings
GLENVILLE REGIONAL YOUTH DETENTION CENTER

Evaluation Date: 9/14/04 - 9/17/04
Facility Administrator: Kim Lee
Rated Capacity: 30
Current Population: 28
Team Leader: Joe Lee
Team Leader Phone Number: 555-982-7813
Reviewers: Leonard Flounoy, Theresa Jones, Alicia Mitchell, Rafael Rosado-Ortiz, William Screws, Sydney White

Service Area                               Average Performance Rating   Standards in Compliance   Standards in Non-Compliance
Leadership and Program Management          Satisfactory                 4                         0
Medical                                    Satisfactory                 8                         2
Education                                  Satisfactory                 14                        0
Behavioral Health Services                 Satisfactory                 2                         0
Student Rights and Services                Satisfactory                 0                         0
Behavior Management System                 Satisfactory                 3                         1
Admission and Release                      Satisfactory                 0                         0
Safety, Security and Facility Structure    Satisfactory                 2                         0
Food Services                              Satisfactory                 3                         0
Training                                   Commendable                  6                         0

(Performance rating scale: Excellent, Commendable, Satisfactory, Unsatisfactory.)

GLENVILLE REGIONAL YOUTH DETENTION CENTER
Performance and Compliance Standards Findings

Leadership and Program Management

1.01 Mission Statement: Satisfactory
The mission statement was posted in several areas accessible to staff, youth and visitors. DJJ staff expressed a general knowledge of the mission; they demonstrated effective interactions with youth and gave positive and encouraging responses and directives. There was also documentation of several activities and programs, from outside as well as inside sources, that addressed the special needs of the youth in a manner consistent with the mission of the Department.

1.02 Operational Goals, Objectives and Plan: Satisfactory
The operational plan addresses ways to meet facility-specific goals that are consistent with the mission statement. The operational plan is reviewed and updated annually.
The plan addresses the needs of youth, staff, volunteers and the community, and indicates a trail of progress. The facility administrator reported meeting with department heads on a weekly basis to discuss issues consistent with the plan. Interviews with staff and the facility administrator show consistency in the explanation of goals and objectives.

1.03 Accountability and Authority: Satisfactory
A review of the organizational charts on display in the lobbies of the different program units found some to be inaccurate in their structure of authority. Interviews with staff indicated that they all knew their immediate supervisor and were aware of the chain of command beyond their unit. They also knew where the organizational charts were located. As discussed with the facility administrator, the charts should be reworked with respect to the structure of authority and the number of positions, and should also show the relationship to the DJJ administrative structure.

1.04 Statistical Management and Information Analysis: Satisfactory
An interview with the facility administrator and a review of several documents used for planning and process improvement indicated that objectives from the operational plan and QA reviews are also used. Observations, along with information arising from the discussion of issues derived from reports and other data, are also used in planning and process improvement.

1.05 Fiscal Management: Satisfactory
The facility administrator demonstrated an understanding of the budget and how it is managed; the Administrative Operations Coordinator (AOC II) and other staff corroborated their awareness of, and input into, the process. The facility administrator and the AOC II hold regular meetings regarding the budget. A financial audit completed in March 2003 was reviewed. Department heads and other staff understood the process for requisitioning supplies.
1.06 Operations Manual: Commendable
The policies and procedures are maintained in individual notebooks separated by policy chapters. In each notebook, the local procedures are easily identified because they are written on green paper. Each notebook is neat and organized, beginning with a table of contents for the policy chapter it contains. The front of each policy notebook is clearly labeled with the name and number of the policy and a colorful picture. Signs are posted to encourage staff input into the development of policies, and policy meetings are held monthly.

Office of Continuous Improvement
Detailed Performance Ratings
GLENVILLE REGIONAL YOUTH DETENTION CENTER
Evaluation Date: 9/14/04 - 9/17/04

Performance Standards (# of Standards Rated)
Service Area                               Excellent   Commendable   Satisfactory   Unsatisfactory
Leadership and Program Management          0           3             10             0
Medical                                    0           0             19             2
Education                                  0           1             12             1
Behavioral Health Services                 0           0             12             3
Student Rights and Services                0           2             14             1
Behavior Management System                 0           1             3              0
Admission and Release                      0           0             10             0
Safety, Security and Facility Structure    0           1             22             5
Food Services                              1           3             4              1
Training                                   4           2             7              2

Office of Continuous Improvement
Percentage of Performance and Compliance Ratings
GLENVILLE REGIONAL YOUTH DETENTION CENTER
Evaluation Date: 9/14/04 - 9/17/04

                                           Performance Ratings (%)                                  Compliance Ratings (%)
Service Area                               Excellent   Commendable   Satisfactory   Unsatisfactory   Compliance   Non-Compliance
Leadership and Program Management          0           23            77             0                100          0
Medical                                    0           0             90             10               80           20
Education                                  0           7             86             7                100          0
Behavioral Health Services                 0           0             80             20               100          0
Student Rights and Services                0           12            82             6                0            0
Behavior Management System                 0           25            75             0                75           25
Admission and Release                      0           0             100            0                0            0
Safety, Security and Facility Structure    0           4             79             18               100          0
Food Services                              11          33            44             11               100          0
Training                                   27          13            47             13               100          0

Office of Continuous Improvement
Standards Rated Excellent and Commendable
GLENVILLE REGIONAL YOUTH DETENTION CENTER
Evaluation Date: 9/14/04 - 9/17/04

Excellent
Food Services                              9.01    Health Department and DJJ Nutritionist Inspections
Training                                   10.04   Basic Juvenile Correctional Officer (JCO) Training
Training                                   10.06   Basic Level II
Training                                   10.07   Basic Level III
Training                                   10.16   Part-Time Staff Orientation

Commendable
Leadership and Program Management          1.06    Operations Manual
Leadership and Program Management          1.08    Work Environment
Leadership and Program Management          1.15    Public Information and Education
Education                                  3.14    Library Services
Student Rights and Services                5.02    Grievance Process
Student Rights and Services                5.15    Bed Linens, Wash Cloths and Towels
Behavior Management System                 6.01    Behavior Modification Program
Safety, Security and Facility Structure    8.25    Facility Appearance
Food Services                              9.03    Planned Menus
Food Services                              9.04    Special Diets
Food Services                              9.10    Food Service and Dining Areas
Training                                   10.03   Pre-Service Training
Training                                   10.17   Volunteers, Contract Personnel, Interns

Office of Continuous Improvement
Standards Rated Unsatisfactory and Non-Compliance
GLENVILLE REGIONAL YOUTH DETENTION CENTER
Evaluation Date: 9/14/04 - 9/17/04

Unsatisfactory
Medical                                    2.06    Routine Screenings and Evaluations
Medical                                    2.30    Job Specific Training
Education                                  3.29    504 Committee
Behavioral Health Services                 4.10    Behavior and Risk Management
Behavioral Health Services                 4.12    Psychotropic Medication Management
Student Rights and Services                5.05    Disciplinary Pre-Hearing Confinement Monitoring
Safety, Security and Facility Structure    8.07    Key Control
Safety, Security and Facility Structure    8.08    Recording, Storage and Inventories of Keys
Safety, Security and Facility Structure    8.13    Room Checks During the Sleep Period
Safety, Security and Facility Structure    8.21    Evacuation Egress Plans
Safety, Security and Facility Structure    8.28    Chemical Control
Food Services                              9.09    Security of Eating Utensils
Training                                   10.02   Orientation Training
Training                                   10.05   Basic Level I

Non-Compliance
Medical                                    2.01    Designated Health Authority
Medical                                    2.17    Emergency Plans and Drills
Behavior Management System                 6.08    "Cooling Off" Period

Office of Continuous Improvement
Comprehensive Evaluation Report
GLENVILLE REGIONAL YOUTH DETENTION CENTER
Evaluation Date: 9/14/04 - 9/17/04

Methodology

A program evaluation was conducted at the Glenville Regional Youth Detention Center on 9/14/04 - 9/17/04. Prior to the site visit, the evaluation team members discussed and reviewed the internal program reviews conducted by staff from the DJJ Offices of Behavioral Health, Medical Services and Education. In addition, telephone conversations were held with the Regional Administrators and District Directors to discuss the strengths, weaknesses and any concerns regarding the facility.

I. Entrance Conference

The Team Leader, Joe Herndon, initially met with Kim Lee to check in and to arrange the formal entry discussion. The entrance conference was held in the Director's Office beginning at 8:30 a.m., where the team leader provided an overview of the evaluation process. The following staff were in attendance:

Kim Lee, Facility Administrator
Wallis Noon, Asst. Facility Administrator
Bill Davis, Regional Principal
Linda Jacks, AOC II
Art Joe, Captain
Laura Scott, SSP II
Leigh Chamber, Secretary II
Karen Babb, Juvenile Detention Counselor
Robert Gina, Education Consultant
Hosea Rivera, Lead Nurse

II. Primary Interviews

A random sample of 10 youth who had been housed at the facility for at least 30 days was identified by the Office of Technology and Information Systems. Of these youth, 2 were on the mental health caseload, 6 were special education students, and 2 had been involved in at least one special incident in the past six months.
These 10 youths were interviewed by members of the evaluation team, covering service areas pertaining to the DJJ Standards of Excellence. They were also interviewed by Investigations staff assigned to the team to assess the adequacy of procedures to protect youth from harm. In addition, the education, medical, behavioral health, and confinement files of each of these youths were reviewed by the appropriate team members.

Approximately 26 staff interviews were conducted by the evaluation team members. Staff interviewed included the following positions:

JCO
JCO Lieutenant
Registered Nurse
Facility Administrator
Business Manager
Assistant Director
Captain
Education Clerk
Nurse
Food Service Employee II

III. Documentation Review

Documents presented by facility staff in support of facility operations and program services were reviewed. The documents included information contained in files (institutional, education, medical, behavioral health and personnel); logbooks; policies and procedures; appropriate reports and checklists; program schedules; service requests and access forms; staff and youth surveys; statistical data; etc.

IV. Activities Observed

Wake-up and Morning Preparations
Movement to and from Program and Service Areas
Meals
Classroom Activities
Alternative Education Placement Model (AEPM)
Intake Process
Release Process
Sick Call
Medication Distribution
Outdoor Recreation
Visitation
Shift Change
Due Process Hearings

V. Exit Conference

An exit conference was held with staff during the last two days of the evaluation. A verbal overview of this report was shared with the facility administrator/designee and other designated staff during each exit conference. Pertinent standards and their ratings were discussed, and staff had the opportunity to make comments and ask questions. The following staff were in attendance:

Kim Lee, Facility Administrator
Wallis Noon, Asst. Facility Administrator
Neil Wills, District Director
Linda Jacks, AOC II
Art Joe, Captain
Laura Scott, SSP II
Leigh Chamber, Secretary II
Michael Day, Lead Teacher
Hosea Rivera, Lead Nurse

VI. Additional Information