Office of Student Achievement's Third Annual Report of Agency Activities
December 2002 through December 2003
February 4, 2004
Third Annual Report of Agency Activities
Introduction
The Governor's Office of Student Achievement (OSA) is pleased to present its third annual report of activities, covering December 2002 through December 2003. In August 2003, Governor Sonny Perdue changed the name of the agency to the Office of Student Achievement to focus the work of the agency more clearly on the improvement of student achievement as it relates to accountability in Georgia's public school districts. The OSA was established to address two major goals that affect all students: improving student achievement and improving school completion. In addition, the office was physically relocated from 101 Marietta Street to the 19th floor of the Sloppy Floyd Building, which also houses the Georgia Department of Education (GDOE). Being in close proximity to GDOE has strengthened the collaborative working relationship between the two agencies.
While OSA's direct affiliation remains with the Governor's office, the agency is dedicated to working closely with the GDOE to fulfill the accountability expectations of the federal No Child Left Behind legislation as well as Georgia's A Plus Education Act (House Bill 1187). OSA is committed to partnering with GDOE in its mission to "lead the nation in improving student achievement." Both the No Child Left Behind Act and Georgia's A Plus Education Act are built upon the principles of accountability and results, local control and flexibility, parental choice, quality teachers in every classroom, and teaching methods based on solid research.
In recent months, OSA has worked with the GDOE to determine Adequate Yearly Progress (AYP) results and to clarify AYP expectations as required under the federal legislation, in order to assist districts in meeting these goals. OSA is currently finalizing the development of Georgia's Single Statewide Accountability Plan, which will merge federal and state requirements. OSA will continue to publish the state's Report Cards, which are required under both federal and state law.
As specified in Georgia law, the agency will also continue working with the University System of Georgia (USG), the Department of Technical and Adult Education (DTAE), the Office of School Readiness (OSR), and the Georgia Professional Standards Commission (GAPSC) to bring the citizens of Georgia accessible and informative reports that communicate the growing success of our public education system. OSA continues to serve as a partner in P-16 initiatives with these agencies.
Office of Student Achievement Accountability Reporting & Research
2003 Annual Report of Agency Activities
During OSA's third year of operation, the following major activities were accomplished and/or initiated:
1. Development of Georgia's Adequate Yearly Progress (AYP) Plan
2. Execution of 2003 AYP Plan and Production of School-level Results
3. Development and Execution of the 2003 AYP Appeals Process
4. Production of 2002-2003 Annual P-16 Education Accountability Reports (i.e., Report Cards)
5. Development of a Single Statewide Accountability System (SSAS)
6. Involvement with P-16 Initiatives
Development of Georgia's 2003 Adequate Yearly Progress Plan
From early fall 2002 through June 2003, at the request of the State Board of Education, OSA staff took the lead on the development of Georgia's Adequate Yearly Progress (AYP) Plan. OSA staff convened a task force of members from various divisions of the Georgia Department of Education as well as representatives from local school systems. OSA drafted several versions of the AYP Consolidated Plan in response to feedback from the United States Department of Education (USED).
OSA held several meetings of its 150+ member Standards and Grading Committee in order to present ideas and seek advice from local educators about Georgia's AYP Plan. OSA established the Standards and Grading Committee in an advisory capacity for the purpose of developing a fair and equitable accountability system for the state of Georgia. While the committee had been reviewing various options and issues relating to the assignment of grades and ratings for Georgia's public schools based on state law, its focus in 2003 was modified to provide feedback on the various drafts of the AYP plan and begin drafting ideas on how the state and federal legal requirements could be integrated into a single statewide accountability system. Members serving on the committee provide a vast regional representation across Georgia. Similar to the Report Card Working Group, members of the Committee include teachers, administrators, counselors, board members, business leaders, and parents. The following table provides details of the purposes of the various meetings of the Standards and Grading Committee.
Meeting Details Related to AYP Plan Development

Dec. 10, 2002 - Standards & Grading Committee
- Integrating state accountability requirements with federal NCLB requirements
- AYP decision worksheet: which grades, which subjects, etc. will contribute to a school's AYP determination
- Small group discussion of issues/recommendations for state grades
- Identification of refinements or changes in state law to make the integration of NCLB and HB 1187 possible

Feb. 26, 2003 - Standards & Grading Committee
- Finalizing Georgia's AYP definition for submission to USED by May 1, 2003
- Using the USED-formatted document for submission
- Soliciting feedback from the committee
- Preparing for peer review of Georgia's plan

June 17, 2003 - Standards & Grading Committee
- Overview of the final approved plan
- Because the menu of second indicators had been approved, setting standards for these indicators
- Because the first- and second-grade CRCTs were not given in spring 2003, recommending an alternate way to determine AYP for K-2 schools in the interim

Aug. 28, 2003 - Standards & Grading Committee
- Update on AYP from OSA and GDOE
- Feedback from the field
- Introducing the Standard & Poor's report card concept
- Next steps needed to build a single statewide accountability system
- Seeking individuals to lead task forces for these steps
To ensure that Georgia's plan was in compliance with all aspects of the No Child Left Behind Act as well as other federal laws, such as the Individuals with Disabilities Education Act, the Americans with Disabilities Act, and equal opportunity laws, OSA engaged the consulting services of Nixon Peabody, LLP, a Washington-based law firm that deals with education law and policy.
In addition to the formal meetings listed above, OSA staff gave many presentations from March through June to various groups, such as Regional Education Service Agencies (RESAs) and Georgia Choice Schools, to communicate the new requirements of the No Child Left Behind Act and Georgia's AYP plan. After the release of AYP results, from August through November, OSA staff presented the AYP results and the appeals process along with staff from the GDOE Title I division.
USED held Georgia's peer review in March 2003 in Atlanta. Based on peer reviewers' comments, modifications and elaborations were made to the draft. In May 2003, Georgia's AYP Plan was approved by the USED. The product of all this work was the Consolidated State Application Accountability Workbook. The AYP Plan is a 62-page document detailing how schools will be measured to determine if they have made adequate yearly progress. The workbook is accessible from the OSA website at www.gaosa.org.
Execution of Georgia's AYP Plan
After moving into new facilities in mid-June, OSA staff collaborated with the Technology and Information Service division of GDOE to develop a use case and identify all data needs for embarking on the school and district AYP determinations under the new federal law. Web-deliverable reports were also created. In addition, OSA and the Policy division of GDOE developed a process for appeals to AYP and school improvement status.
A use case was created that detailed the programming requirements for making AYP determinations for schools. A use case is a document that identifies all the requisite data elements and datasets, the steps in the process, specific calculations, the types of reports required, the type of data tables to be built, types of error checking to be performed, time estimates, and the parties involved in the entire process. Essentially the use case translates the ideas formulated in the Consolidated State Application Accountability Workbook into
an actual work plan for data analyses. After the school-level use case was completed, modifications were made to create a use case for district- and state-level AYP determinations.
Statewide Initial School-level Results

The final test data became available on July 19, 2003, and the actual AYP determination calculations began immediately. On August 5, 2003, GDOE released the AYP determinations for Title I schools. The press release results for Title I schools showed:

- 672, or 60%, of 1,128 Title I schools made AYP;
- 456 schools did not make AYP;
- 298 schools did not make AYP because they did not meet the 95% test participation rate;
- 197 schools did not achieve AYP in only 1 or 2 cells.
On August 14, 2003, GDOE released the AYP determinations for non-Title I schools. Some modifications had been made to the initial AYP results for Title I schools. The press release results for all public schools in Georgia on that date were:

- 1,152, or 58%, of 2,003 schools made AYP;
- 846, or 42%, of schools did not make AYP;
- 536 schools (211 Title I, 325 non-Title I) did not make AYP because they did not meet the 95% test participation rate;
- 395 schools (222 Title I, 173 non-Title I) did not achieve AYP in only 1 or 2 cells.
Web Reports & Other Deliverables

To present the data to schools and the public, OSA and GDOE designed a series of reports that were delivered on the GDOE website. The release of results was clearly depicted as a joint venture by placing both GDOE and OSA logos on a section of the GDOE website dedicated to Adequate Yearly Progress. The intent of the reports was to detail for each school the particular subjects and subgroups that applied to it and the component in which it did not meet the criteria for AYP. The first report, entitled "Summary of 2003 Adequate Yearly Progress," gave an overview of the school's performance. A user could then drill down for details to the "95% Participation," "Academic Annual Measurable Objective," and "Second Indicator" reports. To support the reports, a calculation guide was created that described in detail the datasets used, the specific calculations for each component, and the decision rules that supported the AYP determinations. In addition, a section was dedicated to defining terms related to AYP and answering frequently asked questions. The website also had a listing of schools by district with their final AYP determinations, a list of Title I schools and their school improvement status, a contact list, and a data verification packet that schools used to either verify the results or file an appeal. Every attempt was made to make this site user-friendly and to inform the various stakeholders of the AYP process and the school-specific results.
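The decision rules these reports document can be illustrated with a brief sketch. This is a simplified illustration only: the subgroup data, AMO value, and function names below are hypothetical, and the actual workbook rules contain additional conditions (minimum group sizes, etc.) not shown here.

```python
# Illustrative sketch of AYP decision rules: a school makes AYP only if every
# applicable subgroup meets the 95% participation rate and the academic annual
# measurable objective (AMO), and the school meets its second indicator.
# All thresholds and subgroup counts here are invented, not Georgia's 2003 rules.

def makes_ayp(subgroups, amo, second_indicator_met, min_participation=0.95):
    """subgroups: list of dicts with 'enrolled', 'tested', 'proficient' counts."""
    for g in subgroups:
        if g["tested"] / g["enrolled"] < min_participation:
            return False                      # fails 95% participation component
        if g["proficient"] / g["tested"] < amo:
            return False                      # fails the academic AMO component
    return second_indicator_met               # must also meet the second indicator

school = [
    {"name": "All Students", "enrolled": 400, "tested": 392, "proficient": 260},
    {"name": "Econ. Disadvantaged", "enrolled": 120, "tested": 118, "proficient": 72},
]
print(makes_ayp(school, amo=0.60, second_indicator_met=True))  # prints True
```

With a higher hypothetical AMO of 0.65, the same school would fail on its economically disadvantaged subgroup (72/118, about 61% proficient), which mirrors how a school can miss AYP "in only 1 or 2 cells."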
Development and Execution of the 2003 AYP Appeals Process
Schools were given the opportunity to appeal their AYP determinations. A form posted on the AYP website was used to submit verification of data or to appeal the data and/or the AYP determination. Appeals to Title I status were to be sent to the Title I office in the GDOE, and all other appeals were to be sent to OSA. OSA tracked all appeals electronically in a database. An appeals committee was formed from members of various departments of GDOE (testing, technology, school improvement, Title I, and policy) as well as key OSA staff. Many appeals required a lengthy process that began with communicating with the schools to obtain clarification of their request, appropriate documentation, and requisite data. The appeals committee then determined the course of action necessary to act on the appeal. The committee reviewed 430 appeals, and 120 of these resulted in a change to the original 2003 AYP status. It should be noted that not all 120 of these schools went from not making AYP to making AYP. Some schools appealed the 95% participation component; as a result of these schools aiding OSA and GDOE in matching additional student records, the schools then met the participation criteria. However, the inclusion of additional students who were also full-academic-year students lowered the percent of students proficient, so those schools then did not meet the criteria of the annual measurable objectives. The table below shows the AYP results following the appeals process as posted on the GDOE website on November 21, 2003.
Statewide Post-Appeal AYP Numbers

School Group           Met AYP   Did Not Meet AYP   Total Schools   Percent Met AYP   Percent Not Meeting AYP
All Schools               1269                730            1999             63.5%                     36.5%
Title I Schools            758                357            1115             68.0%                     32.0%
Non-Title I Schools        511                373             884             57.8%                     42.2%
High Schools               106                255             361             29.4%                     70.6%
Middle Schools              78                328             406             19.2%                     80.8%
Elementary Schools        1083                141            1224             88.5%                     11.5%
K-12 Schools                 2                  6               8             25.0%                     75.0%
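The participation-appeal pattern described above, in which matching additional tested students satisfies the 95% requirement but lowers the proficiency rate, can be made concrete with a small numeric sketch. All counts here are invented for illustration.

```python
# Hypothetical illustration of the participation-appeal effect: matching more
# tested full-academic-year student records raises the participation rate past
# the 95% bar, but if few of those students scored proficient, the proficiency
# rate falls and can drop below the annual measurable objective (AMO).

def rates(enrolled, tested, proficient):
    """Return (participation rate, proficiency rate)."""
    return tested / enrolled, proficient / tested

# Before the appeal: 188 of 200 students tested (94%, fails participation).
before = rates(enrolled=200, tested=188, proficient=125)
# After matching 10 more tested records, only 2 of them proficient:
after = rates(enrolled=200, tested=198, proficient=127)

print(f"before: participation {before[0]:.1%}, proficiency {before[1]:.1%}")
print(f"after:  participation {after[0]:.1%}, proficiency {after[1]:.1%}")
```

With a hypothetical AMO of 65%, this school would move from failing the participation component (94%) to passing it (99%) while its proficiency rate slips from roughly 66.5% to 64.1%, so it would now fail the academic objective instead.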
Production of the 2002-03 Annual P-16 Education Accountability Reports
Part of the uniqueness of Georgia's accountability system is that it encompasses additional elements beyond the K-12 public school sector to provide a seamless, unified educational system. On December 1, 2003, OSA began releasing its third annual education accountability report on K-12 schools and its second comprehensive education reports on the University System of Georgia (USG), the Department of Technical and Adult Education (DTAE), the Georgia Professional Standards Commission (GAPSC), and the Office of School Readiness (OSR). Each entity submits data to OSA according to specific data analysis requirements set forth by OSA. This year, reports were released to the public as the data became available. The following pages briefly detail the contents of each entity's report.
Georgia Department of Education
The 2002-2003 K-12 Report Card contains test results as well as other information relevant to schools and their performance toward the goals of student achievement and school completion. OSA developed a phased-in timeline for the assignment of school grades and ratings that would have issued grades beginning in 2003. This timeline has been put on hold for Georgia because the passage of the federal No Child Left Behind (NCLB) legislation mandated that each state build a single accountability system that incorporates the federal concept of Adequate Yearly Progress (AYP). OSA has worked as a partner with the Georgia Department of Education to produce the 2002-03 AYP Reports. OSA, with a large committee of education stakeholders from around the state, continues to forge ahead with establishing a K-12 accountability system for Georgia that integrates the federal law with the state law.
In compliance with NCLB and state law, Report Cards must show results on state assessments for all students tested. In contrast, AYP academic achievement reflects only students who meet the definition of full academic year; in other words, a school's AYP determination with respect to academic performance is based on those students whose performance the school had the most opportunity to impact.
The K-12 report has four major sections:
1. Student performance results from Georgia tests
2. School performance indicators
3. School demographic information
4. National test results
OSA's 2002-2003 K-12 Report Card includes results from the following state assessments: the Georgia Kindergarten Assessment Program-Revised (GKAP-R), the Criterion-Referenced Competency Tests (CRCT), the Middle Grades Writing Assessment (MGWA), the Georgia High School Writing Test (GHSWT), and the Georgia High School Graduation Tests (GHSGT) in all subjects assessed in 2003 and the two previous years. Multi-year comparisons are now available at the school, system, and state levels. In addition, while viewing a school, the user can opt to compare the school data to system and state data. Assessment and other data are disaggregated by different student groupings based on race/ethnicity, gender, disability, and English proficiency status, as required by the A Plus Education Reform Act of 2000 and the federal No Child Left Behind Act of 2001.
For the past several years, the public has been confused by the presence of several report cards, as two sets were produced by two state agencies. OSA was mandated by state law to produce the official report cards. In previous years, OSA had consciously acted in a cost-effective manner and not duplicated information contained in the GDOE report card, but had reported test and other information with full disclosure by disaggregating the data as required by state and federal laws. This past year has resulted in a much closer collaboration between OSA and GDOE, and as a result of this collaboration, GDOE decided not to issue its own separate report card. With this decision, OSA is revamping its web report card to include those data elements from the GDOE report cards that school districts find valuable and will be augmenting the current web report over the next few months. While OSA's Report Card will be the official report card for the state, OSA continues to partner with other entities that are stakeholders in Georgia's education system. For this reason, OSA is working with and providing the report card data tables to the School Council Institute, the Public Policy Foundation, and the national data initiative of Standard & Poor's School Evaluation Services.
The school performance indicator section includes graduation rates for 2002 and 2003, 9-12 and 7-12 dropout rates for 2002 and 2003, attendance for 2002 and 2003, and participation in the Georgia Alternate Assessment for 2003. The data analyses supporting graduation rates and dropout rates required some refinements in order to be compliant with NCLB; primarily, the calculations now depend entirely on the Student Record collection in order to meet federal reporting timelines. NCLB also defined graduates as students who obtain a regular diploma within the standard number of years (i.e., four years from 9th grade to 12th grade). This transition means that students receiving special education diplomas and certificates of attendance are not counted as graduates, nor are students who graduate after summer school following their senior year. For these reasons, the graduation rates are lower than the completion rates that had previously been reported in the state.
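The narrowed graduate definition can be sketched as a simple classification rule. The record fields and cohort below are hypothetical; they only illustrate why rates computed this way fall below the previously reported completion rates.

```python
# Sketch of the NCLB graduate definition described above: only students who earn
# a regular diploma within four years, and not via the summer after the senior
# year, count as graduates. Field names and records are hypothetical.

def is_nclb_graduate(record):
    return (record["credential"] == "regular diploma"
            and record["years_from_9th_grade"] <= 4
            and not record["summer_after_senior_year"])

cohort = [
    {"credential": "regular diploma", "years_from_9th_grade": 4,
     "summer_after_senior_year": False},           # counts as a graduate
    {"credential": "special education diploma", "years_from_9th_grade": 4,
     "summer_after_senior_year": False},           # excluded: not a regular diploma
    {"credential": "certificate of attendance", "years_from_9th_grade": 4,
     "summer_after_senior_year": False},           # excluded: not a regular diploma
    {"credential": "regular diploma", "years_from_9th_grade": 4,
     "summer_after_senior_year": True},            # excluded: summer completion
]
grad_rate = sum(map(is_nclb_graduate, cohort)) / len(cohort)
print(f"{grad_rate:.0%}")   # prints 25%
```

Under the older completion-rate approach, all four of these students might have counted as completers, which is why the NCLB graduation rates run lower.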
The school demographic section includes (1) fall and spring enrollment by grade, (2) enrollment by race, (3) percent of economically disadvantaged students (i.e., eligible for free and reduced lunch), (4) percent of students classified as migrant, (5) percent of students receiving special education services, and (6) percent of students with limited English proficiency.
In the national test section are results of the 2003 ACT, 2003 SAT, and the most recent results from the National Assessment of Educational Progress (NAEP). SAT and ACT information is available at the school, the system, and the state levels. NAEP results are only available at the state level since the NAEP is sampled at the state level.
In addition, AYP status will soon be added to the report cards, with links to the full AYP reports on the GDOE website. Since this is the first year that GDOE has not published its own report card, it has recently been decided that OSA will include additional elements from GDOE's report card that school systems find useful. These data will be added as they become available.
The report card is presented in colorful graphs so that it is easily interpreted and transcends any English proficiency barrier. There is also a question-and-answer section entitled "About the 2002-2003 Report Card." This section provides answers to frequently asked questions concerning the 2002-2003 Report Card, including the content of the report card, sources of data, definitions and rules for reporting, how data were disaggregated, etc. If a visitor has a question that is not answered there, the user can contact OSA and staff will provide a response.
Office of School Readiness
OSA's second annual report on the Office of School Readiness (OSR) is now available to the public on the OSA website. OSR administers Georgia's Pre-K Program. The purpose of Georgia's Pre-K Program is "to provide children with quality preschool experiences necessary for future school success." This report card on OSR includes two indicators that reflect the extent to which prekindergarten services are delivered to Georgia's four-year-olds and the quality of those services. Data are reported as provided to OSA by OSR.
During the 2002-2003 school year, some sites participating in Georgia's Pre-K Program closed or moved at some point during the year while others opened mid-year. Some sites may have changed ownership and operated under two different names during the year. For these reasons, some sites will appear to have incomplete information. OSA's report on OSR reflects the sites for which there was complete data.
The first indicator is accessibility of prekindergarten services to four-year-olds and focuses on Pre-K enrollment and the proportion of economically disadvantaged, or at-risk, students served during the 2002-2003 academic year. Participation in Georgia's Pre-K Program is voluntary for program providers and for families. Information is available for each of the 1,660 lottery-funded prekindergarten programs, by each of the 159 counties, and at the state level.
For each Pre-K program, OSA reports:
- Number of four-year-olds enrolled
- Number of four-year-olds identified as at-risk
- Percent of Pre-K enrollment comprised of at-risk students

For each county, OSA reports:
- Number of four-year-olds enrolled in a Pre-K program
- Percent of the county's estimated four-year-old population served by a Pre-K program in the county
- Number of four-year-olds identified as at-risk enrolled in a Pre-K program within the county
- Percent of total Pre-K enrollment comprised of at-risk students

At the state level, OSA reports:
- Number of four-year-olds enrolled in a Pre-K program in the state
- Percent of the state's estimated four-year-old population served by Pre-K programs
- Number of four-year-olds identified as at-risk
- Percent of total Pre-K enrollment comprised of at-risk students
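As a rough illustration of how these program-level counts roll up to the county level (the same arithmetic extends to the state level), consider the following sketch; the program data and population estimate are invented.

```python
# Sketch of the Pre-K indicator aggregation: program-level enrollment and
# at-risk counts sum to county totals, and the at-risk percentage and the
# percent-of-population-served figure are recomputed at the county level.
# All program data and the four-year-old population estimate are invented.

programs = [
    {"county": "Fulton", "enrolled": 40, "at_risk": 22},
    {"county": "Fulton", "enrolled": 60, "at_risk": 18},
    {"county": "Bibb",   "enrolled": 20, "at_risk": 15},
]

def county_report(county, programs, est_four_year_olds):
    rows = [p for p in programs if p["county"] == county]
    enrolled = sum(p["enrolled"] for p in rows)
    at_risk = sum(p["at_risk"] for p in rows)
    return {
        "enrolled": enrolled,
        "pct_population_served": enrolled / est_four_year_olds,
        "at_risk": at_risk,
        "pct_at_risk": at_risk / enrolled,
    }

print(county_report("Fulton", programs, est_four_year_olds=250))
```

Here the hypothetical Fulton figures would be 100 four-year-olds enrolled (40% of an estimated population of 250), 40 of them at-risk (40% of Pre-K enrollment).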
OSR provided OSA with all the data included in this report. Enrollment figures for the 2002-2003 school year are based on OSR's February 2003 roster count collected for each Pre-K program. The estimates of the four-year-old population are based on the latest census data, which were analyzed by the Applied Research Center, Andrew Young School of Policy Studies at Georgia State University.
The second indicator focuses on the quality of services provided by Georgia's Pre-K programs. Beginning with the 2001-2002 fiscal year, OSR implemented a formal data collection process using its newly developed Program Quality Assessment (PQA). In the PQA, a representative from OSR visits each prekindergarten facility and verifies the certifications of instructors as well as evaluates staffing levels, the physical facilities of the location, and instruction and curriculum. This information is collected on an annual basis by OSR staff. OSR provided the PQA database to OSA. OSA's report on OSR includes the PQA ratings for each program. At the state level, the report shows the percent of programs receiving each rating on the following key elements that define a quality Pre-K program:
- Lead teachers are certified, degreed, and meeting all other program requirements.
- Facility has implemented the approved curriculum appropriately.
- The classroom is arranged into clearly defined learning areas that enhance children's growth and development.
- The Language and Literacy area is adequately equipped to provide many opportunities for children to explore, manipulate, investigate, and discover.
- The Math/Manipulative area is adequately equipped to provide many opportunities for children to explore, manipulate, investigate, and discover.
- The Dramatic Play area is adequately equipped to provide many opportunities for children to explore, manipulate, investigate, and discover.
- The Art area is adequately equipped to provide many opportunities for children to explore, manipulate, investigate, and discover.
- The Block area is adequately equipped to provide many opportunities for children to explore, manipulate, investigate, and discover.
- The Science area is adequately equipped to provide many opportunities for children to explore, manipulate, investigate, and discover.
- Music and movement materials are provided for children's use.
- Facility meets child/staff ratios.
- Facility has an appropriate daily routine.
- Facility provides an environment and instruction that promotes language development.
OSR evaluates the above key elements for each program using the following ratings:

- Meets OSR Standards: meets the required level of classroom and instructional quality as defined in the Office of School Readiness Pre-K Operating Guidelines, Standards of Quality, and The Georgia Prekindergarten Quality Assessment documents. These documents are available on the OSR website: http://www.osr.state.ga.us/
- Exceeds OSR Standards: represents a higher level of classroom and instructional quality.
- Does Not Meet OSR Standards: indicates specific areas in need of technical assistance.
- Partially Meets OSR Standards: indicates potential areas of concern and technical assistance opportunities.
It should be noted that since OSR has the responsibility to ensure the quality of prekindergarten services to Georgia's four-year-old population, the agency has the authority to discontinue providing lottery funds to any program that does not meet a minimum standard on a sufficient number of the key elements in the PQA survey.
Department of Technical and Adult Education

OSA's second annual report on the Georgia Department of Technical and Adult Education (DTAE) is now available to the public on the OSA website. DTAE provides a unified system of technical education, customized business and industry training, and adult education.
OSA's second annual report card on DTAE focuses on three indicators:
- Retention rates of first-time, full-time award-seeking students,
- Graduation rates, and
- Pass rates on licensure/certification exams.
Data are reported as provided to OSA by DTAE. The data reflect the 2001-2002 academic year. The above indicators are reported for the DTAE system and each of the 33 technical institutions. When data are available, OSA reports these indicators for all students as well as by race/ethnicity, gender, and socioeconomic status. The race/ethnicity categories included for DTAE are Asian, Black, Hispanic, Native American, White, and Multiracial. Results for male and female students are also presented. Socioeconomic status is based on whether the student has applied for and been deemed eligible for a Pell Grant. If a student is deemed eligible for a Pell Grant, then that student is counted as an economically disadvantaged student.
Retention for an institution of higher education is an indicator of the school's success in keeping students enrolled from their first year to their second year. Retention rates reflect the percentage of fall quarter first-time, full-time students that continue enrollment the following fall quarter. This report presents both institution-specific retention rates and system retention rates. Institution-specific retention rates show the percentage of fall quarter first-time students that remain enrolled at the same school the next fall quarter. System retention rates show the percentage of fall quarter first-time students that are not in the same institution the next fall but are enrolled in some other DTAE institution. Retention rates for the system as a whole and for each of the 33 DTAE institutions are presented in OSA's web report. Results based on disaggregation by race/ethnicity and gender are also reported.
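As a sketch of the two rates just defined (with invented student records), note that a student counts toward the institution-specific rate only by re-enrolling at the same school, and toward the DTAE system rate only by re-enrolling at a different DTAE institution:

```python
# Sketch of the DTAE retention definitions described above. Institution names
# and the cohort are invented for illustration.

def retention_rates(cohort):
    """cohort: list of (first_fall_school, next_fall_school_or_None) pairs."""
    n = len(cohort)
    same = sum(1 for first, nxt in cohort if nxt == first)
    other = sum(1 for first, nxt in cohort if nxt is not None and nxt != first)
    return same / n, other / n   # (institution-specific rate, system rate)

cohort = [
    ("Tech College A", "Tech College A"),   # retained at the same institution
    ("Tech College A", "Tech College B"),   # retained elsewhere in the system
    ("Tech College A", None),               # not retained anywhere
    ("Tech College A", "Tech College A"),   # retained at the same institution
]
inst_rate, system_rate = retention_rates(cohort)
print(inst_rate, system_rate)   # prints 0.5 0.25
```

Note that under the DTAE definitions quoted above, the system rate excludes students who stayed at the same institution; the two rates are complementary rather than nested.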
DTAE institutions offer a variety of degrees, diplomas, and certificates. OSA's web report on DTAE presents information on (1) graduation rates for associate degree and diploma programs and (2) rates for students who earn a professional certificate. The data provided by DTAE show graduation rates assessed at 1.5 times the nominal program duration (i.e., "time-and-a-half" graduation rates). The nominal length of an associate degree or diploma program typically ranges from one to two years. This report presents both institution-specific graduation rates and systemwide graduation rates. Institution-specific graduation rates show the percentage of students who entered an institution as freshmen and graduated from that same institution. Systemwide graduation rates reflect those students who entered an institution as freshmen and graduated from another institution within the DTAE system. The graduation rates provided by DTAE show the percent of first-time, full-time, degree-seeking students that earn an associate's degree or diploma within 3 years. The nominal duration for a professional certificate typically ranges from 6 months to 1 year. The "time-and-a-half" rates provided by DTAE show the percentage of students completing a certificate program within 1.5 years. The graduation rates are also shown disaggregated by race/ethnicity and by gender.
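The "time-and-a-half" convention can be sketched as follows; the program length and student outcomes are invented for illustration.

```python
# Sketch of the "time-and-a-half" graduation window described above: the
# reporting window is 1.5x the nominal program length, so a 2-year associate
# program is assessed at 3 years and a 1-year certificate at 1.5 years.
# Student completion times below are invented.

def graduated_on_time(nominal_years, years_to_credential):
    window = 1.5 * nominal_years          # the "time-and-a-half" window
    return years_to_credential is not None and years_to_credential <= window

# A hypothetical cohort in a 2-year associate program (window: 3 years).
# Each entry is the years a student took to graduate; None means no award.
students = [2.0, 2.5, 3.0, 4.0, None]
rate = sum(graduated_on_time(2, y) for y in students) / len(students)
print(f"{rate:.0%}")   # prints 60%
```

The same function with a nominal length of 1 year reproduces the certificate case: completions within 1.5 years count, later completions do not.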
The final indicator for DTAE reflects how well students from the technical colleges do on licensure or certification exams. The 33 technical colleges offer different combinations of degrees and programs. Therefore, the list of licensure exams for which OSA reports a passing rate varies for each institution. As with all other data, OSA reports for the DTAE system as a whole as well as by each college.
University System of Georgia
OSA's second annual report on the University System of Georgia (USG) is now available to the public on the OSA website. USG's Board of Regents was created in 1931 to unify public higher education under a single governing body. The Board governs 34 institutions organized into five sectors: 4 research universities, 2 regional universities, 13 state universities, 2 state colleges, and 13 two-year colleges. The 2002-2003 annual report card on USG focuses on three indicators:
- Retention rates of first-time, full-time award-seeking students,
- Graduation rates, and
- Pass rates on the Regents' exams.
Data are reported as provided to OSA by USG and reflect the 2001-2002 academic year. The above indicators are reported for the university system as a whole, for each sector, and for each of the 34 institutions. When data are available, OSA reports these indicators for all students as well as by race/ethnicity, gender, and socioeconomic status. The race/ethnicity categories included for USG are Asian, Black, Hispanic, Native American, White, and Multiracial. Results for male and female students are also presented. Socioeconomic status is based on whether the student has applied for and been deemed eligible for a Pell Grant; if so, that student is counted as an economically disadvantaged student.
Retention for an institution of higher education indicates the institution's success in keeping students enrolled from their first year to their second year. Retention rates reflect the percentage of fall semester first-time, full-time freshmen who continue enrollment the following fall semester. This report presents both institution-specific retention rates and system retention rates. Institution-specific retention rates show the percentage of fall semester freshmen who remain enrolled at the same college or university the next fall semester. System retention rates show the percentage of fall semester freshmen who are enrolled at the same institution the next fall plus those who are enrolled in some other USG institution. Retention rates for the system as a whole, for each sector, and for each institution are presented in OSA's web reports. Results disaggregated by race/ethnicity and gender are also reported.
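A minimal sketch of these two retention calculations, using hypothetical fall-enrollment snapshots keyed by student ID (the field names and data layout are assumptions for illustration, not USG's actual records):

```python
from collections import defaultdict
from typing import Dict, Tuple

def retention_rates(
    freshmen: Dict[str, str],    # student_id -> entry institution (fall cohort)
    next_fall: Dict[str, str],   # student_id -> institution enrolled the following fall
) -> Dict[str, Tuple[float, float]]:
    """Return institution -> (institution-specific rate, system rate) as percentages."""
    entrants = defaultdict(list)
    for sid, inst in freshmen.items():
        entrants[inst].append(sid)
    rates = {}
    for inst, sids in entrants.items():
        # Institution-specific: enrolled at the same institution the next fall.
        same = sum(1 for sid in sids if next_fall.get(sid) == inst)
        # System: enrolled at any institution in the system the next fall.
        anywhere = sum(1 for sid in sids if sid in next_fall)
        n = len(sids)
        rates[inst] = (100 * same / n, 100 * anywhere / n)
    return rates
```

Because every student retained at the same institution is also retained in the system, the system rate is always at least as high as the institution-specific rate.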
USG institutions offer a variety of degrees, programs, and certificates. OSA's web reports present information on (1) six-year graduation rates for baccalaureate (bachelor's) degree programs, (2) three-year associate degree graduation rates plus transfer rates, and (3) two-year completion rates for one-year certificates. This report presents both institution-specific graduation rates and systemwide graduation rates. Institution-specific graduation rates show the percentage of students who entered an institution as freshmen and graduated from that institution. Systemwide graduation rates reflect those students who entered one institution as freshmen and graduated from another institution within the university system. The data provided by USG show graduation rates assessed at 1.5 times the nominal program duration (i.e., "time-and-a-half" graduation rates). The nominal length of a bachelor's degree program is typically four years (although there are a few five-year bachelor's degree programs at some USG institutions). Consequently, the bachelor's degree graduation rates provided by USG show the percentage of first-time, full-time, degree-seeking students who earn a bachelor's degree within six years. The nominal duration for associate degrees is two years; certificates range from one to two years. The "time-and-a-half" rates provided by USG show the percentage of students completing an associate degree, or transferring to a four-year institution, within a three-year period. The certificate program completion rates provided by USG show the percentage of students completing a certificate program within two years. The graduation rates are also shown disaggregated by race/ethnicity and by gender.
Beginning in 1972, the USG Board of Regents implemented the Regents' Testing Program as one means by which each institution in the University System can ensure that students receiving degrees possess certain minimum reading and writing skills. The Regents' Test has two parts: a Reading Test and an Essay Test. Students' scores on the tests are used to determine whether they have the minimum levels of reading and writing skills required for graduation. Regents' policy requires students to take the test in the semester after they have completed 30 semester credit hours if they have not taken it previously. Students who have earned 45 semester credit hours and have not passed both parts of the test must enroll in remedial courses until they pass both parts. The information provided by USG shows, for each institution, the percentage of students who pass the test before earning 45 credit hours. This information is disaggregated by race/ethnicity, gender, and socioeconomic status.
Georgia Professional Standards Commission
OSA's second annual report on the Georgia Professional Standards Commission (GAPSC) is now available to the public on the OSA website. The GAPSC has the full responsibility for the preparation, certification, and conduct of the certified, licensed, or permitted personnel employed in the public schools of the state of Georgia. Its mission is "to provide a qualified teacher in every classroom by setting and applying high standards for the preparation, certification, and continued licensing of Georgia public educators." To reflect this mission, the OSA report card on GAPSC focuses on two indicators: the pass rates on the Praxis I and the Praxis II exams. These exams are used to ensure that individuals who are certified as educators in Georgia are qualified by showing mastery of basic skills and specific content for their teaching field. Data are reported as provided to OSA by GAPSC. OSA displays the data on its web-delivered report in colorful bar graphs that are easy to read.
The GAPSC uses the Praxis I tests of mathematics, reading, and writing to assess the basic skills of individuals seeking certification. As of March 1, 1999, candidates seeking teacher certification in Georgia must pass all three tests. Praxis I is considered a pre-professional skills test, and the GAPSC sets its passing scores. OSA reports the 2002-2003 Praxis I pass rates for the public and private post-secondary institutions that examinees identified themselves as attending. The pass rates are presented for all examinees as well as disaggregated by race/ethnicity and by gender.
GAPSC also uses the Praxis II exams to ensure that Georgia educators are well versed in their teaching field(s). To be recommended for licensure, a person must earn satisfactory scores on the Praxis II Subject Assessments in the appropriate subject area(s) for the certification sought. Currently there are 58 Praxis II subject area tests. For each public and private post-secondary institution offering a teacher preparation program, OSA reports an overall Praxis II pass rate. OSA also includes, for each individual institution, the pass rates on the specific content exams taken this past year, and offers the user the ability to compare institutional pass rates on a specific content exam. Pass rate data are presented for all examinees as well as disaggregated by race/ethnicity and gender. It should be noted that the listing of exams varies by institution, depending upon whether the institution offers teacher preparation in that area of certification and whether it had any students ready to exit the program and seek certification.
Development of a Single Statewide Accountability System
With the passage of the No Child Left Behind (NCLB) Act that reauthorized federal education funding, states are now required to have a single statewide accountability system (SSAS) for all public schools. For this reason, OSA has spent much of the year spearheading the development of Georgia's AYP plan as the core of the state's accountability system. As previously mentioned under the section on the development of Georgia's AYP plan, the Standards and Grading Committee had been established for the purpose of building the state's accountability system based on state law. With the approval of Georgia's AYP plan, this committee's meeting on August 28, 2003, focused on folding AYP into a single statewide accountability plan that would merge state and federal requirements. OSA formed two task forces from the membership of the Standards and Grading Committee with the missions of designing Georgia's grading system and establishing a rewards and consequences system compliant with both state and federal laws. The task forces met independently at first but then decided that joint meetings would be beneficial since many issues overlapped. The table below describes the task forces' meetings.
Meeting Details Related to SSAS Development

Date: Oct. 15, 2003, 10:30-12:30
Committee: Grading System Subcommittee
Purpose:
- Overview of requirements and context relative to both state and federal laws
- Historical review of associated accomplishments
- Discussion of key issues related to merging the federal guidelines for AYP determination with state requirements
- Member recommendations for guiding principles
- Projected timeline

Date: Oct. 15, 2003, 2:30-4:30
Committee: Rewards & Consequences System Subcommittee
Purpose:
- Overview of requirements and context relative to both state and federal laws
- Historical review of associated accomplishments
- Discussion of key issues related to the federal rewards and sanctions system for Title I schools
- Member recommendations for guiding principles
- Projected timeline

Date: Nov. 4, 2003
Committee: Grading System and Rewards & Consequences System Subcommittees
Purpose:
- State criteria for accountability
- Integration of federal and state accountability systems
- Overview of what other states are doing
- Guiding principles for the tasks at hand (list was a result of the discussions and recommendations from the Oct. 15 meetings)
- Four key questions for designing the SSAS model
- Rewards and consequences

Date: Dec. 10, 2003
Committee: Grading System and Rewards & Consequences System Subcommittees
Purpose:
- Working through a draft of a blueprint for building a single statewide accountability system (SSAS)
- Discussion gave rise to several models of how to combine AYP and the state grading system
Meetings continued in January 2004 to put the final touches on the models to be recommended to the GDOE and the Governor's office. Some of these recommendations may require modifications to the state laws governing the state's accountability system.
Involvement in P-16 Initiatives
OSA continues its focus on seamlessness among the various education agencies. This emphasis promotes the coordination and articulation of high standards and educational initiatives that support students and their learning as they progress through Georgia's public schools. OSA provides assistance to any education agency whenever possible. The Executive Director of OSA serves on the State School Superintendent's Cabinet. Various OSA staff members serve on committees such as the Student Information System, the Dropout Prevention Taskforce (Education, Go Get It), the NCLB Implementation Task Force, and the GDOE Policy Committee.