PROGRAM EVALUATION
Office of Planning and Budget and the Department of Education
DEPARTMENT OF EDUCATION'S RESULTS-BASED BUDGETING DATA
August 2003
Russell W. Hinton, State Auditor
Performance Audit Operations Division
Department of Audits and Accounts
254 Washington St., S.W. Atlanta, GA 30334
Table of Contents
Background ................................. p. 1-8
Recommendations and Agency Responses ....... p. 9-24
Appendices:
  Appendix A -- DOE's RBB .................. p. 1-20
  Appendix B -- Descriptions of Tests ...... p. 1-2
Definitions of RBB Terms
Outcome measure: A quantitative or qualitative measure by which the performance of a program can be assessed against adopted goals and objectives.
Program: A systematic set of activities designed to accomplish a mission with specific goals and desired results. Activities are undertaken to effect a change in the program's customers or in the conditions the program is to address. Progress must be measurable.
Goal: Statement of what things will look like if the programs work as intended.
Desired Result: The portion of the goal that will be accomplished in one year.
Sources: O.C.G.A. 45-12-71 (11) and OPB memo
Results-Based Budgeting in Georgia
In 1993 the General Assembly passed the Budgetary Accountability and Planning Act, which amended the Office of Planning and Budget's (OPB) legislation to require that OPB develop and implement an outcome based budgeting system that relates funding to achievement of established goals and objectives, measures agency performance against attainment of planned outcomes, and provides for program evaluations for policy and funding determinations. In response to this requirement, OPB developed the Results-Based Budgeting (RBB) system currently in use. RBB information for state agencies was first included as part of the annual Governor's Budget Report in 1998.
It should be noted that passage of Georgia's Budgetary Accountability and Planning Act of 1993 was part of a national movement interested in greater accountability in government, also known as "governing-for-results." In general, "governing-for-results" at both the federal and state level is focused on outcomes and producing the best possible results, given the available resources. According to the National Conference of State Legislatures, most states are in some phase of applying performance measures to state agencies and programs.
Implementation of RBB
In the early years of Georgia's implementation process, each agency identified programs and subprograms that were further
defined by goals and desired results. The desired results are quantified with a number or percentage target the agency wants to achieve in one year, and actual results provide the data to determine how close the agency came to meeting its target. For example, one of the Department of Education's goals for its Regular Education Program is that students will be adequately prepared for further education and the workplace. One of the desired results associated with this goal is an average Scholastic Aptitude Test (SAT) score of 1,000.
RBB information is submitted along with each agency's annual budget request. Each year, state agencies complete the RBB by submitting actual results and identifying the desired results for the future year. In addition, agencies must submit a disclosure statement for each actual and desired result explaining how the data will be (or was) collected and analyzed, any limitations of the data, and any other explanation necessary to understand the information provided. For example, if the data is obtained from surveys, the agency would need to explain how many customers were surveyed, how these customers were selected, and, for the actual results, how many customers responded. According to OPB, the disclosures are necessary for the OPB analyst to understand the degree to which he or she can rely on the information. For example, for a survey, if the respondents were not randomly selected or if only a small percentage returned the survey, the analyst would consider that fact when reviewing the results. Important disclosures may be included as footnotes in the Governor's Budget Report. In its 2001 memorandum to agency heads, OPB specified that the desired results should be as clear and simple as possible to facilitate understanding by the layperson.
OPB provided direction and assistance to the agencies in the development of these components during the first few years of RBB. According to staff, they continue to offer agencies assistance as requested. For the past two years, OPB's focus has been on reporting actual results and agencies have been directed to report on these results, but not to make changes to the programs, goals, or desired results without prior approval from OPB. OPB has noted that ensuring the RBB is complete and the data is reported accurately is the responsibility of agency personnel.
According to OPB staff, the RBB is used primarily as a tool to enable the Governor to manage statewide priorities. The Results-Based Budgeting Overview presented in the Governor's Budget Report states that RBB information will assist the Governor and the General Assembly in allocating scarce resources to the best benefit of the citizens of Georgia by: ...Enabling programs to be evaluated and funded according to their actual benefit to program customers and taxpayers. At present OPB does not provide training to legislators regarding RBB data. Additionally, the RBB measures are not directly linked to the agencies' budget requests, so it is not always possible to align the amount spent on an individual program with the progress made toward achieving a specified desired result.
According to a June 2003 memorandum, OPB's 2005 budget review process will shift from a continuation budget methodology to a prioritized program-based budget. Under the program-based budget process, agencies will submit information regarding funding history, performance measures, program results, and priorities to be used in making funding decisions. Agencies have been directed that programs should address key policy and service areas. Additionally, the intent is to provide decision-makers with sufficient information to be able to link budget requests, funding, and expenditures to individual programs. It was also noted that agencies should work with OPB to review the programs as defined previously for the Results-Based Budget to determine if changes are necessary or if the current program designations are appropriate. The new program-based budget will be similar to RBB insofar as agencies will be required to devise goals and performance measures for each of the programs identified.
Department of Education's Results-Based Budget Process
In fiscal year 2004, the Department of Education's RBB document identified three programs: Regular Education, Exceptional Students, and Education Support. These three programs were further divided into 15 subprograms defined by 33 goals and 70 desired results. (See Appendix A for a copy of the Department's fiscal year 2004 RBB). The desired results were measured by national tests such as the SAT, and state-developed tests such as the Criterion-Referenced Competency Test (CRCT) and the Georgia High School Graduation Test (GHSGT). In addition, surveys of school systems and surveys of employers were also used to measure desired results. (See Appendix B for a listing and definitions of the tests administered in Georgia.)
OPB begins the RBB process each summer by sending agency heads and fiscal officers a memo specifying what the agency is to submit. Exhibit 1 on the following page demonstrates the process used to develop and submit the fiscal year 2004 RBB. After sending the memo, OPB staff provides the Department's budget office with a copy of the prior year's RBB document broken down by program and subprogram. The budget office forwards each RBB goal section to the appropriate program person, who then compiles the data necessary to fill in the blanks on the RBB document. At least 13 program managers were involved in providing data for the fiscal year 2004 RBB.1 According to program staff, they use various methods for obtaining data, including requesting reports from the Department's Administrative Technology Unit, requesting data from the Department's Testing Division, pulling information from the Department's web site, pulling data from a contractor or testing company's web site, etc. Questions go back and forth between OPB and Department program staff or between OPB and the budget staff. Once the data is compiled and the disclosure statements are completed, each program manager either emails the document directly to OPB or to the budget office, which forwards it on to OPB.
1For purposes of this report, the term "program manager" refers to individuals responsible for programs such as English for Speakers of Other Languages (ESOL), which RBB identifies as sub-programs.
Exhibit 1 [flowchart illustrating the process used to develop and submit the fiscal year 2004 RBB; graphic not reproduced]
OPB staff reviews the information and has the latitude to delete goals and desired results if deemed necessary. They may also require that new goals and desired results be created. OPB staff does not have the latitude to alter the data provided. Once OPB approves the RBB submission, it is included in the Governor's Budget Report and sent to the General Assembly.
Education Reform
Two recent educational reforms have had a significant impact on increasing education accountability and encouraging a focus on results in Georgia: the A Plus Education Reform Act and the No Child Left Behind Act. Both reforms also serve as the measures by which schools, school systems, and the state's educational system are deemed successful.
In 2000, Georgia passed the A Plus Education Reform Act, which was modeled on Texas legislation and called for increased accountability for teachers, schools, school systems, and the state Board of Education. While the Act encompasses a variety of areas, including educational funding, maximum class sizes, performance reviews and continuing education of teachers, and the creation of the Office of Educational Accountability (OEA), it also has student achievement provisions that are outcome- or results-oriented. The primary student achievement goals of the Act include improving SAT scores, decreasing the percentage of students who fail the Georgia High School Graduation Test, and decreasing the number of students who drop out of school. Another requirement of the Act is that school performance be monitored and rated on a scale (A, B, C, D, and F) based on absolute student achievement standards and on progress in improving student achievement. The OEA is responsible for establishing achievement standards and grading the schools.
In 2002, the federal No Child Left Behind Act (NCLB) was passed by the U.S. Congress. In Georgia, the framework for implementing NCLB had already been put into place under the A Plus Education Reform Act. NCLB requires schools, school systems, and each state to demonstrate adequate yearly progress (AYP) as defined by each state and approved by the U.S. Department of Education. Specifically designed to address the achievement gap among student groups, NCLB calls for the disaggregation of data based on students' economic status, race and ethnicity, disability status, and English proficiency level, among other characteristics. Students' progress toward meeting state-established standards in math, reading, and science is to be measured by tests that are aligned with each state's standards.
Schools that continually fail to demonstrate AYP are subject to a range of sanctions depending on the number of years that the school has failed to meet the standards. For example, if a school fails to make AYP for two consecutive years it must receive technical assistance and develop
and implement an improvement plan. A school's failure to meet adequate yearly progress for a fifth year requires the school system to restructure the school by replacing staff or turning the school over to the state. NCLB also gives parents options if their child is enrolled in a school that is chronically identified as being in need of improvement. For example, parents have the option of transferring their child to a better-performing school or receiving supplemental education services, such as tutoring, after-school services, or summer school programs.
Exhibit 2
Overlap of State Accountability Reports with RBB

[The original exhibit is a grid comparing three reporting tools (the OEA K-12 Report Card, the DOE Report Card, and the DOE RBB) cell by cell; the grid's individual yes/no entries did not survive reproduction, and its recoverable content is summarized below.]

Level of detail reported:
- OEA K-12 Report Card and DOE Report Card: school reports, school system reports, and a state report
- DOE RBB: state-level reporting only

Assessment tools compared: GKAP-R*, CRCT*, 5th grade Writing Test, 8th grade Writing Test, GHSGT*, GHSWT, NAEP*, SAT*, ACT, and AP*

Other data compared: graduation information, High School Completion Rate, Graduates Entering GA Public Colleges, and Graduates Requiring Learning Support

Disaggregation of results:
- OEA K-12 Report Card: reported by the following student subgroups: Race/Ethnicity, Gender, Disability, and English proficiency status
- DOE Report Card: reported cumulatively, not by student subgroup
- DOE RBB: reported according to students' participation in particular programs such as special education, gifted, or ESOL programs

*See Appendix B for descriptions of these tests.
For more information about each assessment tool, see: www.accountability.doe.k12.ga.us and www.ga-oea.org
Reporting by the Office of Educational Accountability
As noted earlier, OEA is charged with establishing standards and grading schools. In addition, it was designated as the reporting agency for the NCLB information. OEA obtains information from data housed by two separate sections within the Department of Education: the Administrative Technology Unit and the Testing Division. As part of its responsibilities, OEA produces the official report card on the state's education situation and attempts to disaggregate the data to show results for required groups of students. The Department also issues a report card on the state's education situation, and as shown in Exhibit 2 above, there are several areas in which the information presented in OEA's report card, the Department's report card, and the RBB document overlap. For example, each provides state-level information and reports results of the Georgia High School Graduation Test (GHSGT). However, unlike the RBB, the two report cards also supply information for each school system and school in Georgia. It should be noted that each reporting tool also provides additional information not found in the others. For example, the Department's report card provides revenues and expenditures, demographic information, and community information. OEA's report card provides enrollment data by grade and by race/ethnicity, gender, and special needs. Fewer than half of the RBB's 70 desired results are represented in the exhibit; the majority of the remaining desired results reflect analyses and surveys performed by individual programs.
Funding Information
The Department of Education's fiscal year 2004 budget of $5.93 billion in state funds represents 37% of the state's total budget of $16.2 billion. It should be noted that $5.4 billion (91%) of the Department's funding is determined by the Quality Basic Education Act (QBE) formula.
While the Department estimates its expenditures and funding requests according to the three RBB programs (Regular Education, Exceptional Students and Education Support) to meet RBB reporting requirements, appropriations cannot be linked to these programs. Appropriations are made according to cost categories such as personal services and regular operating expenses.
Evaluation Scope and Methodology
This program evaluation was conducted at the request of the Budgetary Responsibility Oversight Committee (BROC). The evaluation was conducted in compliance with O.C.G.A. 45-12-178 and in accordance with generally accepted government auditing standards for performance audits.
Our review of the Department of Education's Results-Based Budget data addressed the following questions:
- Are the goals and desired results valid in that they provide information on progress toward achieving the agency's mission?
- Are the goals and desired results good measures of the agency's effectiveness and efficiency?
- Are the data reported reliable and accurate? (This includes a review of the systems through which data is reported.)
- Can the RBB data be used, and is it used, internally by the agency to assess its performance?
- Can the RBB data be used, and is it used, to inform external parties (including the public and the legislature) regarding the Department's performance?
Our review focused on fiscal year 2004 RBB information with reviews of previous years' data as necessary. With the assistance of the Department of Audits and Accounts' Information Systems Audit and Assurance Services Division, we reviewed the reliability of systems used to collect RBB data as well as the internal controls used to ensure source data is correct and complete. To evaluate the validity of the goals and desired results, we reviewed accepted criteria (as identified by groups such as the federal General Accounting Office, the Governmental Accounting Standards Board, the U.S. Department of Education, and the National Conference of State Legislatures) for evaluating the effectiveness and efficiency of government programs and compared these criteria to Georgia's RBB. In addition, we surveyed members of the Education and Appropriations Committees in the House and Senate to gather information on how RBB information is currently used and whether they view it as a useful tool in determining whether the state is progressing toward achieving its educational goals. However, only four survey responses were received, so results were not included in our analysis.
Our evaluation did not include the following:
- An evaluation of the effectiveness of RBB as a tool to manage the Governor's priorities;
- A review of school systems' goals, or their use of RBB to measure local progress;
- A review of individual programs' effectiveness; or
- A formal audit of all DOE computer systems.
This report has been discussed with appropriate personnel representing the Office of Planning and Budget and the Department of Education. A draft copy was also provided for their review and comment. Pertinent responses have been incorporated as appropriate.
Recommendations and Agency Responses
It should be noted that the Office of Planning and Budget has announced plans for revising agency budgets and the RBB process as part of its new program-budgeting requirement. While these findings focus on fiscal year 2004 and earlier budget processes and RBB submissions, recommendations are applicable to goals and performance measures that will be required under this new initiative.
In its response to the report, OPB noted that the findings will be used to "enhance the Governor's newly implemented Prioritized Program Budget which has incorporated and refined the RBB process."
RECOMMENDATION #1
The Department of Education's Results-Based Budget does not provide a credible representation of the Department's progress toward addressing the education needs of Georgia's children. The RBB document cannot be used to determine whether the state is making progress toward achieving its educational goals because of problems with the validity and reliability of the information, as well as problems with the presentation of the information that make it difficult to interpret the results. Our review of the Department's RBB identified the following:

- The validity of the goals and desired results is questionable. They are incomplete in that not all goals are thoroughly measured by the corresponding desired results; for those that are, it is difficult to determine whether actual progress has been made; the goals do not address the Department's primary functions, such as providing technical assistance to schools and school systems; and major components of education accountability legislation have not been incorporated.
- The reliability of reported results cannot be verified because no process exists for establishing desired results targets, and no process exists for how actual results are to be calculated or compiled. As a result, actual results cannot be reproduced.
- The systems used to collect data used in RBB do not contain sufficient checks to ensure the data is accurate. In addition, the Department does not validate data at the school level to ensure that what is submitted is complete and accurate.
- The RBB document itself is difficult for the layperson to use and is not directly related to the budget in such a way that decision-makers can determine the costs associated with achieving the results.
In order to be an effective document for determining whether the Department is making progress toward achieving its goals
for education in the state, the goals and desired results must accurately measure the Department's activities and the information presented must be valid and reliable. These issues are discussed in more detail in the following findings.
In its written response to the report, the Department indicated that more definitive measures are desirable; however, it does note that there have been significant educational reform efforts over the past five years, and that both internal and external educational policy makers have also changed, which has precluded a uniform focus/direction to data collections, testing, and budget decisions. The Department also noted that all budget units are required to outline and track performance criteria as part of its new strategic planning process.
RECOMMENDATION #2
Action should be taken to make the RBB document more useful to decision-makers. Our review revealed that the RBB document is lengthy, difficult to understand without additional information, and is not used as a management tool within the Department. As a result, the usefulness of the document for the layperson, as well as for legislative and agency decision-makers, is limited. These points are discussed in more detail in the following bullets.

Volume: The RBB is lengthy both in the number of goals and measures and the volume of data presented. It includes 20 pages of information, including three programs, 15 subprograms, 33 goals, and 70 desired results measures. In addition, programs and goals are not prioritized in terms of what influence the Department has over the outcomes. (See Appendix A)

Exhibit 4
Example of an RBB Measure that Requires Additional Explanation
Program: Regular Education
Purpose: Ensure Georgia's K-12 students are academically prepared for their futures in the 21st century.
Subprogram: Academic Achievement
Purpose: Ensure Georgia's K-12 students are academically prepared for further education and the workplace by providing leadership and support to initiate, promote, enhance and communicate curriculum and programs of study in all academic areas for education and the general public.
Goal 1: Students will be adequately prepared for further education and the workforce.
Desired Result 1c: The percentage of students scoring 3 or above on Advanced Placement (AP) exams and the number taking AP courses.
Targets: FY1999--63%; FY2000--64%; and FY2001-2004--65% each year (Note: no target was listed for the number of students taking the course)
Actual Results:
FY 1999--54.3% (7,069 of 13,018)
FY 2000--55.5% (8,116 of 14,623)
FY 2001--56.2% (20,846 of 37,092)
FY 2002--59.0% (25,297 of 42,748)
Source: DOE's RBB Document (See Appendix A)
Clarity: The RBB document does not include sufficient information to understand the results presented. It uses nine tests to measure a large number of its results; however, there is no description of what the tests are designed to measure or how the results should be interpreted.
For example, as shown in Exhibit 4 above, no description is provided for the Advanced Placement (AP) exam, nor is the significance of scoring a "3" explained. Therefore, it is not possible to determine how the desired result relates to the goal of having students adequately prepared for further education or the workforce. If the reader were alerted to the fact that the AP exam is given to students taking advanced high school classes and that scoring a three on the final course exam allows the student to obtain college credit and therefore skip that course in college, the reader could understand that such a score is indicative of being prepared for college.
Additionally, there is no explanation of why the desired result was set at 65% of students scoring three or more on the AP exam when the state's actual results have fallen short of, and made only incremental increases toward, the target over the last four years. There is no explanation for why a target is not included for the number of students taking the test; and, there is no explanation for why the state has apparently seen a 192% increase from fiscal year 2000 to fiscal year 2002 (from 14,623 to 42,748) in the number of
students taking the AP exam. As noted by the Urban Institute in Making ResultsBased State Government Work, explanatory information should be provided along with other performance measures, particularly when outcomes fall short of expectations.
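The 192% figure can be verified directly from the participation counts reported in Exhibit 4 (a simple arithmetic check; the FY 2000 and FY 2002 counts below are taken from the Department's reported actual results):

```python
# Verify the percentage increase in AP exam takers cited in the report,
# using the "number taking" counts reported for FY 2000 and FY 2002.
fy2000_takers = 14_623
fy2002_takers = 42_748

pct_increase = (fy2002_takers - fy2000_takers) / fy2000_takers * 100
print(f"{pct_increase:.0f}%")  # prints "192%"
```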
In other cases, the reader must assume a relationship exists between the measure and the desired result. For example, the Foreign Language Program lists scores on the SAT as one of its measures (see Appendix A, p. 7). This is an acceptable measure assuming the relationship between SAT scores and the study of foreign language has been studied and found to be a positive one. In another example, the reader must assume that performance in math and science is key to agricultural competency and that there are no other measures of this competency (see Appendix A, p. 9). While the measures in both examples may be valid, this is not readily apparent to the reader. As noted by the U.S. General Accounting Office (GAO) in its report The Results Act: An Evaluator's Guide to Assessing Agency Annual Performance Plans, goals and measures should not require subjective considerations or judgments to dominate the measurement.
Utility: Department staff reported that the RBB is not used as a management document to evaluate and guide the agency's activities. In addition, the document is not directly related to the budget in such a way that it can be used to determine the cost of individual subprograms, much less the costs of achieving specified goals and desired results.

While two program managers noted that the measures included in RBB were reflective of federal measures they were required to use, eight other managers indicated they did not apply RBB measures when determining the success of their programs. It is important that these measures be utilized internally because, as noted by the GAO in its report Performance Plans: Selected Approaches for Verification and Validation of Agency Performance Information, data is likely to be of better quality if it is used to manage programs, and program managers are more likely to pay attention to the data and identify errors.

"Without [links between budget request and performance measures], agency plans and performance measures had limited value in the legislative budget process." --Florida's Office of Program Policy Analysis and Government Accountability

Additionally, the RBB is not currently linked to the budget in such a way that the costs of programs can be tied to the results achieved. The two documents are organized differently, and the budget does not correspond structurally with the RBB goals. Moreover, the Quality Basic Education Act formula, which makes up 91% of the Department's budget, is not broken down by program, so it is not possible to identify the cost of individual programs. Without such linkages, the performance measures have limited value in the decision-making process.
As noted by GASB in its special report titled Reporting Performance Information: Suggested Criteria for Effective Communication, reported information should be aggregated or disaggregated based on the needs and interests of intended users. By doing so, information will not be misleading because it obscures or is not representative of true performance, and it will be relevant to users with different interests and needs. Improving the readability of the document increases the likelihood that it will become a useful tool in decision-making.
In its written response, the Department noted that the organization of the RBB is not at the discretion of the Department; it is under the aegis of the Governor's Office of Planning and Budget (OPB). The Department noted that it will continue to work closely with OPB to improve the next iteration of the RBB or Program Budget document.
RECOMMENDATION #3
To improve the validity and reliability of the Department's goals and desired results, the Department's Results-Based Budget should be reviewed to ensure that each goal is comprehensively measured by its results measures and that data is available to measure the desired results. Although the Department's 2004 desired results appear to be logical measures of progress toward meeting their respective goals, a review of the Department's goals
found that some goals are not comprehensively measured by their corresponding desired results. In addition, there are a number of measures for which no actual results are reported. Examples of these problems are discussed in more detail below:
The Department's RBB includes a goal that indicates that students will be proficient in science (see Appendix A, p. 5). The Department measures progress toward this goal by reporting the percentage of students meeting expectations on the science portion of Georgia's Criterion-Referenced Competency Test (CRCT) the first time they take it, and the average percentile ranking in science on norm-referenced tests. The CRCT results, however, only include grades four through eight, and the norm-referenced test results only include grades three, five, and eight. As a result, there is no measure of proficiency in science for students in grades nine through twelve. It should be noted that all Georgia students, regardless of the diploma they are seeking, are required to pass the Georgia High School Graduation Test in a variety of subject areas, including science. The results of this test, however, are not used to measure progress toward this goal.
The Agricultural Education subprogram has a goal of promoting the development of agricultural competency and academic skills (see Appendix A, p. 9). This goal is measured by the average academic performance of agriculture completers in Math and Science on the National
Assessment of Educational Progress (NAEP). (The NAEP is a national achievement test administered to a randomly selected sample of schools and students in each state; therefore, not all students in Georgia take the NAEP.) As noted in the reported results, however, only 378 agriculture completers took the test in 2002. Because there were 35,838 students in the Agricultural Education subprogram in the 2002 school year, it is questionable whether the results of the NAEP can be considered representative of the entire program's achievement.
The Department has a goal that students will be adequately prepared for further education and the workforce (see Appendix A, p. 1). Although eight desired results are designed to measure progress toward this goal, none deals directly with preparation for the workforce; instead, they focus on measures related to graduates going on to college and other postsecondary education and on student achievement.
A review of the 70 desired results included in the Department's 2004 Results-Based Budget revealed that 17 (24%) had no actual results reported for a variety of reasons including that data had not been provided or was not usable. An additional 11 (16%) had no results reported for 2002 because the data was not yet available (but was expected in the future). Furthermore, nine of these measures that did not have data collected in 2002 also had no data reported for 2001. Clearly, a lack of data does not
Department of Education Results-Based Budgeting Data
Page 13
allow for progress to be measured and renders the goals and desired results for which there is no data useless. (2002 is the most recent year for which data could be reported in the 2004 Governor's Budget Report.)
Steps should be taken to improve the validity and reliability of the Department's results measures by placing priority on the collection of important outcome-related data, and by ensuring that goals are fully measured and that all relevant students and grades are included in the results. According to the U.S. General Accounting Office, the reasonableness and appropriateness of proposed measures are also influenced by the extent to which the needed data can be obtained at a reasonable cost, i.e., the extent to which the benefits obtained from the data outweigh the costs of producing it.
In its response to the report, the Department indicated that, as its new vision and mission evolve and in light of new requirements under the federal No Child Left Behind legislation, it will closely scrutinize the goals it has established and the progress made toward accomplishing them.
RECOMMENDATION #4 Due to changing goals and desired results, it is difficult to determine if the Department is making progress in achieving the goals in its Results-Based Budget. Although the Department began submitting a
Results-Based Budget in fiscal year 1998, none of its goals and desired results has remained the same since then. As a result, few measures currently in place have enough trend data to determine whether progress is being made. As noted by OPB staff, for RBB to be useful in measuring outcomes, data trends are needed; annual results data alone are not enough to make decisions about program effectiveness, given the abnormalities that may occur in any single year. Examples of the changes that have occurred are provided below.
An analysis of the Department's 2004 RBB revealed that only 16 (48%) of the Department's 33 goals and 16 (23%) of 70 desired results have remained similar since fiscal year 2000.
Between fiscal years 2002 and 2003, 22 goals were deleted or replaced with 18 new goals. In addition, a total of 42 desired results were deleted or replaced and 45 new desired results were added.
In an effort to determine whether progress has been realized for measures that have remained constant, the goals and desired results for one of the Department's three programs were reviewed. Only eight (25%) of the Regular Education Program's 32 desired results have remained the same since fiscal year 2000, and results for these eight measures have been mixed: seven indicate positive gains and one shows a decline. It should be noted that progress toward any desired result may or
may not be an adequate measure of success because, as noted in a subsequent finding, problems have been identified with the way the desired results targets have been set.
The reason the Department's RBB has changed so often could not be specifically identified; however, several factors may have contributed to the volume of changes. OPB has made several changes to the RBB submission process over the years, which required changes to the measures. For example, according to OPB staff, the poor quality of agency submissions in the first year led OPB to offer training to further educate agencies on how to identify useful goals and results measures. Following this training, many agencies changed their goals and desired results for the following fiscal year's RBB submission. Between 2002 and 2003, there was another shift in OPB's direction to agencies: the new process required agencies to ensure that valid, accurate, and timely results are reported and that data collection strategies for developing and obtaining missing and substandard outcome data had been improved. According to OPB staff, agencies realized that OPB was using the information and paying attention to the results, which resulted in further changes to the information provided. OPB's newest directive requires agencies to develop performance measures and program results around core businesses for the fiscal year 2005 program-oriented budget. This shift is expected to result in a redefinition of programs, and therefore a redefinition of goals and desired results.
While it is reasonable to expect that changes and improvements were necessary in the early years of the process, the measures would also be expected to become more stable over time, not to continue changing every two years. And while it may be necessary to periodically review and revise measures to reflect changing responsibilities or legislative requirements, steps should be taken to ensure that such changes do not leave a program without the trend data necessary to evaluate it. Without a consistent set of programs, goals, and desired results, the value of submitting results measures is greatly diminished.
In its response to the report, the Department noted that more definitive measures are desirable and that recent reform efforts and changes among both internal and external educational policy makers have precluded a uniform and continuing focus for data collections, testing, and budget decisions. In addition, the Department indicated that its Results-Based Budget will be revamped in light of recent educational reform laws and of new leadership and administration, both internally via the Superintendent of Schools and the State Board of Education and externally via the Governor and the General Assembly. The Department noted, however, that this poses a continuing concern, as it may once again result in significant changes to the information presented due to the changing priorities and initiatives of new leadership.
RECOMMENDATION #5 The Department should develop a methodology and process for setting RBB targets that reflects expectations of what programs should accomplish during the coming year. Currently, individual program managers set quantitative annual targets for each desired result. As a result, there is no assurance that targets are reflective of the Department's expected results for the programs. Additionally, there is no assurance that appropriate targets are set to encourage performance, or that targets reflect the actual impact of the program. These points are discussed in further detail below.
According to our evaluation, targets are set by individual program managers without management review to determine whether each target is appropriate and in line with Department objectives. Additionally, because targets are set by those responsible for accomplishing them, there is little incentive to set ambitious targets; instead, there is a built-in incentive to set easily reachable targets.
There is no process in place governing how the targets should be set. Interviews with program staff and a review of the Department's 2004 RBB data revealed that targets are not based on trend analysis or an estimation of the program's impact on the result. Instead, one manager reported simply adding 0.5% to each of his prior year's targets, while another adds 0.9% and another adds 2%. One program manager reported using the previous year's actual results as the target for the next year; another uses the national average test scores from four years prior. Additionally, because targets are set without statistical analysis of the margin of error inherent in the data, there is no assurance that a measured change is due to program intervention as opposed to random variation.
By establishing a systematic methodology for developing desired results targets, the Department can ensure that the targets reflect its goals and intentions for the programs identified. Assurance can also be provided that changes in results are due to Department intervention rather than random fluctuations in the data.
In its written response, the Department noted that, as part of its reorganization, new policies and procedures are being implemented to ensure that program managers identify specific goals. The Department also indicated that the report's recommendations will be taken into consideration as this process continues.
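The margin-of-error point can be illustrated with a present-day sketch (a hypothetical Python example; the pass rates and student counts are invented, not drawn from the report) that applies a standard two-proportion z-test to ask whether a year-over-year change in a pass rate exceeds what sampling variation alone could produce:

```python
import math

def significant_change(p_old, p_new, n_old, n_new, z=1.96):
    """Two-proportion z-test: is a change in a pass rate larger than
    the sampling margin of error at roughly 95% confidence?"""
    p_pool = (p_old * n_old + p_new * n_new) / (n_old + n_new)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_old + 1 / n_new))
    return abs(p_new - p_old) > z * se

# Hypothetical figures: 72% of 5,000 students passed last year,
# 73% of 5,100 passed this year -- a one-point gain.
print(significant_change(0.72, 0.73, 5000, 5100))
```

Under these invented figures, the one-point gain falls within the margin of error, so it could not by itself be attributed to program intervention rather than random variation.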
RECOMMENDATION #6 The Department should develop a process to ensure that reported outcome data is a credible and reliable representation of each program's annual progress. Currently, each program manager is responsible for determining the process by
which raw data is acquired and actual results are computed and reported. The lack of an independent and standardized process to collect, compute, document, and report outcome data has led to results of questionable accuracy because they cannot be reproduced. Our evaluation of RBB actual results, program files, and interviews with Department and OPB staff revealed the following:
Data is collected, calculated, and reported by the programs responsible for the results being measured, without external review or validation. The reported results are not reviewed for accuracy or consistency by Department management or by OPB. Without independent review of the processes by which information is gathered and calculated, the credibility of the reported data is questionable.
Program managers independently determine which source they will use for data. Data may be collected from multiple sources internal and external to the Department. For example, while data from testing contractors is managed by the Department's Testing Division, a program manager reported going to a contractor's website to obtain data for RBB actual results. Data publicly presented by these external sources does not always match results officially reported by the Department. For example, a comparison of Scholastic Aptitude Test (SAT) data from the College Board website showed different state averages for Georgia than those reported on the Department's Report Card. Both sources might be correct, depending on the student group measured and other analytical issues. The discrepancy illustrates, however, the need for an official and consistently reported source of RBB data.
"Obtaining quality performance information is an agency-wide management issue." -- U.S. General Accounting Office
Additionally, surveys are created and responses tabulated by the people who are responsible for the programs' results. There is no systematic review of the surveys to ensure they are designed appropriately to elicit accurate responses and that the results are interpreted objectively. It should be noted that Department executive staff have stated that in the future surveys will be reviewed and approved by the Department's Policy Division.
Procedures used to collect and calculate results are not documented. Interviews with seven program managers revealed that in six instances, no documentation exists to provide assurance that calculations used to generate RBB results were performed correctly or consistently. Without formal procedures in place, there is also no assurance that data is collected and calculated consistently from year to year. It should be noted that in at least one subprogram, RBB desired results were designed to reflect indicators required by the federal grant supporting the program. As a result of the federal requirements, data sources and calculations were documented.
Employee turnover in the Department has been a problem in maintaining adequate documentation of the procedures used to obtain and calculate data reported in the RBB. For example, for the Academic Achievement Program, neither OPB nor Department staff had records to identify the person responsible for submitting actual and desired results for the fiscal year 2004 budget cycle, nor were hard-copy files of the RBB submission available for review. According to program and Administrative Technology staff, data for RBB actual results are usually requested separately each year by individual program managers, rather than as part of a standard annual RBB report generated for the Department as a whole. Therefore, there is no assurance that the data is requested and obtained in the same manner each year.
Electronic files used to transmit RBB data are not protected from tampering. Annual requests for RBB data are sent to the Department's budget office, and then to individual program managers, as electronic spreadsheet files. Portions of the files containing previously reported information from prior years are not protected from accidental or intentional changes. Department budget staff provided the audit team with electronic copies of the original data requests and we were able to manipulate all data, including previously reported actual and desired results. The same potential exists for accidental loss or overwrite of new data every time the file is opened and manipulated by Department or OPB staff. As these data files are the basis of each
year's RBB report, the risk exists that any information could be changed without OPB's knowledge. Although OPB staff includes disclaimers in each data request warning Department staff not to modify previously reported results, OPB should also take advantage of change-protection functions provided in the spreadsheet software.
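Spreadsheet change-protection functions are one safeguard; the underlying idea of making previously reported results tamper-evident can also be sketched generically (a hypothetical present-day Python example, not a Department or OPB practice), by sealing each record with a checksum that later edits will invalidate:

```python
import hashlib
import json

def seal(record):
    """Append a SHA-256 digest of a results record so that any later
    edit to the previously reported fields can be detected."""
    payload = json.dumps(record, sort_keys=True).encode()
    return {**record, "digest": hashlib.sha256(payload).hexdigest()}

def verify(sealed):
    """Recompute the digest and compare it with the stored one."""
    record = {k: v for k, v in sealed.items() if k != "digest"}
    payload = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest() == sealed["digest"]

# Invented measure and value, for illustration only.
row = seal({"measure": "CRCT science, grades 4-8", "fy2002_actual": 71.5})
assert verify(row)            # untouched data passes the check
row["fy2002_actual"] = 95.0   # simulated tampering...
assert not verify(row)        # ...is detected before the data is re-used
```

A check of this kind would let OPB confirm that prior-year figures in a returned file match what was originally reported, regardless of who has opened the file in the interim.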
Absent documentation regarding data sources, and collection and calculation procedures, the accuracy of the data cannot be validated. Internal or external evaluators cannot reproduce the Department's actual results presented in the Governor's Budget Report.
RECOMMENDATION #7 Action should be taken to ensure that limitations on data quality are revealed and communicated to the users of RBB. Program managers do not consistently reveal data sources or disclose concerns about data quality in their RBB submissions, as required by OPB. Additionally, information that is provided is not consistently communicated to the reader of the final RBB document. Under the current processes, individual program managers are provided with limited guidance on what to disclose, and no action is taken if disclosures are not made. These points are discussed in more detail below.
In an official memorandum to agency heads regarding data disclosure requirements for the 2004 budget, OPB
requested that those reporting RBB data explain their collection techniques in enough detail to satisfy a thorough review of the data collection methodology. The letter also outlined specific conditions that can limit data quality. However, the less formal data request/disclosure files sent to Department program managers, which make up the bulk of communication between OPB and program staff, are much less detailed and informative: program staff are simply asked to "please disclose any information that might impact the integrity of these data." Once again, it is left to each program manager's discretion and understanding of data quality issues to determine what information should be included.
In addition to being provided with limited guidance as to what disclosure information is pertinent, it could not be documented that any action is taken if the responsible program manager does not provide all requested information. A review of the data disclosure forms submitted for the fiscal year 2004 RBB indicated that for two of 15 subprograms no contact person was identified, although data was provided. Of the 70 desired results contained in the disclosure documents, only 14 (20%) were documented in such a way that the validity of the data could be verified. Twenty-three (33%) did not include information on how data was collected and analyzed, and 33 (47%) did not contain sufficient information to allow the results to be reproduced. During the evaluation, program managers also
reported concerns about data quality that had not been disclosed in the forms presented to OPB. For example, one program's information is often preliminary when reported to OPB; another reported that the data is difficult to disaggregate down to specific student populations.
Program managers' documented concerns about data are not disclosed to readers of the Governor's Budget Report. For the fiscal year 2004 budget, Department program managers reported concerns or contextual comments about data quality for nine of the 70 desired results; only two of these concerns were fully documented in the Report. Because data concerns are not fully documented, users of the document would not know, for example, that data for one measure was collected by the local school systems through processes of their own choosing, making the quality of the data dependent on the collection method each system chose.
If concerns about data quality are not discussed in the Report, readers cannot evaluate the reliability of the results being reported.
The Department noted in its response to the report that it will continue to provide information to OPB on data issues and concerns.
RECOMMENDATION #8 The Department should make certain that adequate controls are in place to ensure the accuracy and reliability of the data it maintains. Our review of Department data collection and validation activities revealed concerns about these processes, and therefore about overall data reliability. Data reported electronically by school systems is not validated at the source and the Department's screening procedures are not sufficiently proactive to identify and prevent problems.
The Department does not currently audit data submitted by schools and school systems to ensure its accuracy; responsibility for the accuracy of the data lies with each school system's administration. According to Office of Education Accountability (OEA) staff, plans are being developed to review and selectively audit the data reported by schools beginning in January 2004. However, the status of these plans was uncertain at the time this report was completed.
In addition, electronic data screening procedures are created in reaction to past reporting problems, and are not based on anticipated risk of misreported or underreported information. Problems encountered during the data collection process are reviewed and if staff determines that the potential exists for the problem to recur, edit checks are put in place to detect this problem in the future. However, Department staff do not attempt to anticipate and prevent other causes of intentional or accidental
misreporting before they happen, or compare data transmitted by similar systems to detect anomalies in reporting.
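The kind of proactive screen described above, comparing data transmitted by similar systems to detect anomalies, could work along the lines of the following sketch (a hypothetical present-day Python example; the rates, system names, and cutoff are invented for illustration):

```python
import statistics

def flag_outliers(reported, z_cutoff=1.5):
    """Flag school systems whose reported rate lies more than z_cutoff
    standard deviations from the mean of all reporting systems."""
    values = list(reported.values())
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [name for name, v in reported.items()
            if stdev and abs(v - mean) / stdev > z_cutoff]

# Invented discipline-incident rates per 1,000 students by school system.
rates = {"System A": 41.0, "System B": 38.5, "System C": 44.2,
         "System D": 6.0, "System E": 40.1}
print(flag_outliers(rates))   # System D's figure stands out from its peers
```

A flagged figure is not proof of misreporting; it simply identifies a submission that warrants follow-up before the data is accepted, rather than after a problem surfaces.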
The impact of this lack of a proactive approach to data screening becomes more significant in the current environment of education reform. Under NCLB, schools face reduced funding and increased expenses if they fail to meet standards for school safety and education quality. For example, parents with children in schools designated as unsafe have the option to move their children to another school, at the school system's expense. Recent newspaper investigations have shown that significant under-reporting can go undetected by current data screening methods. For example, Gwinnett County's school system has been reported to have understated discipline problems by as much as 85% for the 2001-2002 school year.
It should be noted that the controls discussed here do not apply to the data housed by the Testing Unit [which manages testing contracts and the resulting testing data for assessments such as the Criterion-Referenced Competency Test and the Georgia Kindergarten Assessment of Progress-Revised (GKAP-R)]. A comprehensive review of the various electronic data systems was not conducted as part of this audit; however, as noted below, problems were identified when OEA attempted to match test result data to the student information data. Given that test scores could not be linked to the students taking the tests, it appears that data accuracy may need to be addressed
within this unit as well. This point is discussed further in the following finding. Without a system to ensure accuracy of data at the source, and a consistent, proactive approach to data validation, there is no assurance that the information maintained by the Department is accurate and reliable.
In its written response, the Department indicated that its Budget staff, working in conjunction with DOE program managers and technology staff, will work to review submission formats in detail to identify potential areas of concern and issues of data reliability. Staff will also work to develop electronic filters to identify data anomalies for clarification.
RECOMMENDATION #9 The Department should continue its efforts to combine student information files with test results data. As part of the fiscal year 2004 Appropriations Act, responsibility for the development and implementation of a new Student Information System was transferred to the Department from the State Data Research Center (SDRC). Currently, responsibilities for handling student data are split between the Department's Technology Unit, which handles student information collected from the schools, and the Testing Unit, which reviews and analyzes raw data received from the contractors who administer and score standardized tests. Under NCLB, states are required to disaggregate data collected on students' tests according to a range of
demographic characteristics including race/ethnicity, gender, disability status, and English proficiency. Such disaggregation requires matching information from the multiple files housed within these two units of the Department.
The Department did not attempt to link testing files to student record files until the June/July 2003 data collection cycle. The Office of Education Accountability (OEA), however, combined these files for the 2001-2002 school year to create its state-mandated school report cards. In doing so, significant problems were encountered in matching testing files to student records. Records are matched on the student's identification number, and unless the school system has provided preprinted labels for the tests, the accuracy of the identifying information on each test depends on the student who fills it in. It should be noted that these files are the sources for data contained in the report cards published by the Department and OEA, as well as in the RBB.
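The matching step at issue can be sketched as follows (a hypothetical present-day Python example; the field names and records are invented for illustration), showing how a single mis-bubbled identification number leaves a test record that cannot be linked to a student:

```python
def link_records(test_records, student_records):
    """Join test results to student demographic records on student ID,
    separating out records that cannot be linked -- the step that
    NCLB-style disaggregation depends on."""
    by_id = {s["id"]: s for s in student_records}
    matched, unmatched = [], []
    for t in test_records:
        s = by_id.get(t["student_id"])
        # Merge demographics into the test record when a match exists.
        (matched if s else unmatched).append({**t, **(s or {})})
    return matched, unmatched

# Invented records: the second hand-bubbled ID contains a transcription
# error ("O" in place of "0"), so it cannot be linked.
tests = [{"student_id": "1001", "science": 322},
         {"student_id": "10O1", "science": 305}]
students = [{"id": "1001", "grade": 8, "lep": False}]
matched, unmatched = link_records(tests, students)
print(len(matched), len(unmatched))
```

Tracking the unmatched share across collection cycles would give the Department a direct measure of whether its linking problems are improving.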
In developing and implementing a new student information system, the Department should ensure that the need to disaggregate data is adequately addressed.
The Department noted in its response that several task forces are working on the development of a comprehensive student information data system, which will provide meaningful, accurate and timely data for use in reporting and decision-making.
RECOMMENDATION #10 To improve accountability and provide a complete picture of the Department's progress toward achieving its purpose, the RBB should be revised to include goals and desired results for those activities over which the Department has a direct influence. Currently, the Department's Results-Based Budget is primarily focused on measuring student achievement. However, the Department's primary functions center on providing administrative support, education support, and technical assistance to the state's 180 public school systems. Although it is important that the Department be held accountable for student achievement, it does not have direct influence over outcomes such as test scores, the percentage of students with credentials to enter college or other postsecondary education, or reading proficiency.
Goals recently developed by the Department as part of an internal strategic planning process distinguish between those activities over which the Department has control and those it does not by establishing two sets of goals: direct and indirect. While the functions identified as direct activities of the Department, including producing a school report card, a quality core curriculum, and well-designed and aligned tests, have long been functions of the Department, none has been addressed in the RBB. The only goals related to the Department's educational support function address student transportation and the school and community nutrition programs. According to the U.S. General Accounting Office (GAO), a department's measures sufficiently cover key aspects of performance when they reflect the core functions of its related programs and activities.
The Department should be commended for the efforts it has made in identifying direct and indirect goals. Consideration should now be given to identifying the relevant outcome-oriented measures related to the Department's administrative and educational support functions and incorporating them into the RBB.
"Performance plans do not have to include the complete array of goals and measures used in managing programs, but they should reflect a picture of intended performance without any significant gaps." -- U.S. General Accounting Office
In its written response to the report, the Department stated that its current goals and desired results were developed in collaboration with the Governor's Office of Planning and Budget. This recommendation will be shared with OPB and taken into consideration as the next iteration of the Department's Results-Based Budget is developed.
RECOMMENDATION #11 The Department's Results-Based Budgeting goals and desired results
should be reviewed in light of recent state and federal educational reform laws. According to OPB staff, the Department's Results-Based Budget has not been reviewed and revised to incorporate relevant aspects of the federal No Child Left Behind Act of 2001 and the state's A Plus Reform Act of 2000. (For a brief synopsis of each of the Acts see page 2.) A review of the major provisions of each of the reforms revealed areas that have important outcome measurement implications. Examples of outcome-related provisions that could be incorporated into the Department's RBB are discussed in more detail below.
No Child Left Behind Act of 2001 (NCLB)
Adequate Yearly Progress: The Department's RBB contains no goals or desired results regarding the number of schools or school systems (or the change in that number) that do not demonstrate adequate yearly progress in meeting state-defined standards. According to NCLB, school systems or schools that continually fail to make adequate yearly progress toward the standards will be held accountable in a variety of ways. For example, if a school fails to meet adequate yearly progress standards for two consecutive years, it must receive technical assistance and develop and implement an improvement plan.
Achievement Gap: Although the Department's RBB contains separate goals and desired results for programs serving "exceptional students" (e.g.,
gifted students, English for Speakers of Other Languages, and Special Education), there are no goals or desired results specifically targeted to reducing achievement gaps among the student groups nor are students in each of the programs being measured using the same goals, desired results, or achievement standards. NCLB requires that students' progress toward meeting state standards in math, reading, and science be measured using tests that are aligned with the standards. The results are to be sorted for students who are economically disadvantaged, from racial or ethnic minority groups, have disabilities, or have limited English proficiency, etc. Disaggregating the test results shows achievement gaps among the student groups and provides information needed to ensure that no child is left behind.
A Plus Education Reform Act of 2000
High School Graduation Test: One of the primary goals of the A Plus Education Reform Act is to decrease the percentage of students who fail the Georgia High School Graduation Test. The results of the High School Graduation Test, however, are largely absent from the Department's RBB. Results of this test are used only to measure the achievement of students in the Technology and Career Education Program and the At-Risk Program, but not overall student achievement. It should be noted that all high school students, regardless of the type of diploma they are seeking, are required to pass the Georgia High School Graduation Test in writing, English, mathematics, science, and social studies
in order to graduate.
School Accountability: Although reporting on school performance is the responsibility of the Office of Educational Accountability, the Department should also have goals and desired results regarding raising the performance of schools. The Department's RBB, however, does not report student achievement results by school or school system. The Act mandates that school performance be monitored through the establishment of individual school ratings (A, B, C, D, or F) based on absolute student achievement standards and on progress in improving student achievement.
Consideration should be given to identifying and incorporating relevant outcome-oriented measures in the Department's RBB that are included in the No Child Left Behind Act
and the A Plus Education Reform Act. Such measures may help provide a more complete picture of areas in need of improvement as well as areas in which progress is being made. In addition, the RBB would more closely reflect the measures being used by the state and federal groups to assess educational success and progress.
In its response to the report, the Department indicated that its Results-Based Budget will be revamped in light of recent educational reform laws and of new leadership and administration, both internally via the Superintendent of Schools and the State Board of Education and externally via the Governor and the General Assembly. The Department noted, however, that this poses a continuing concern, as it may once again result in significant changes to the information presented due to the changing priorities and initiatives of new leadership.
Appendix A
The Department of Education's Results-Based Budget As printed in the Fiscal Year 2004 Governor's Budget Report
Table of Contents
Program 1: Regular Education
SubProgram 1: Academic Achievement
Goal 1: Students will be adequately prepared for further education and the workforce
Goal 2: Students will be proficient in English/Language Arts
Goal 3: Students will be proficient in Math
Goal 4: Students will be proficient in Science
Goal 5: Students will be proficient in Social Studies
SubProgram 2: Reading
Goal 1:
Improve student's reading and comprehension abilities
SubProgram 3: Goal 1:
Goal 2:
Foreign Language Students who have an extended sequence of foreign language study will perform better on the verbal and math portions of the SAT than their peers who have studied a foreign language Students who study a foreign language in the Georgia ESFL Model Program will be proficient in speaking a second language
SubProgram 4: Goal 1: Goal 2:
Goal 3: Goal 4:
Technology/Career (Vocational) Education Increase the academic achievement of secondary students in Technology/Career (Vocational) Education Programs The number of students who graduate from high school with credentials to succeed in post-secondary education will increase Increase the vocational/technical skill proficiencies of students in technology career education Students grading with college prep/technology career prep seal will be prepared for the workforce and higher education
SubProgram 5: Goal 1:
Goal 2:
Agriculture Education Promote the development of agricultural competency and academic skill Agricultural education students will find jobs in their field of study or enroll in postsecondary education
Page #
1 3 4 5 5 6
7 7
8 8 8 9
9 9
Program 2: Exceptional Students
  SubProgram 1: Gifted and Talented Students
    Goal 1: Students who participate in the Governor's Honors Program (GHP) will be empowered to take charge of their own learning
    Goal 2: Students who participate in Georgia's Programs for Gifted Students will excel academically, demonstrating exceptional performance on measures of mastery of Quality Core Curriculum standards in their areas of strength, as well as advanced research and communication skills, and creative thinking and creative problem solving skills
  SubProgram 2: Early Intervention
    Goal 1: Raise achievement of Kindergarten students who are in the Early Intervention Program
  SubProgram 3: Remedial Education
    Goal 1: Students who participate in the Remedial Education Program will complete high school
  SubProgram 4: Special Education
    Goal 1: Students with disabilities will successfully transition to post-secondary education or the workplace
    Goal 2: Students with disabilities will succeed academically
    Goal 3: Children with disabilities will be identified early to avoid falling behind in school
    Goal 4: Students with disabilities will be taught in the regular classroom with their peers to the maximum extent possible
  SubProgram 5: State Schools
    Goal 1: Students attending state schools will be adequately prepared for successful employment and further education
    Goal 2: Students will achieve academically
    Goal 3: Sensory impaired children will be ready to learn in school
  SubProgram 6: English for Speakers of Other Languages
    Goal 1: ESOL students will gain sufficient English proficiency to succeed in school
  SubProgram 7: Alternative Education
    Goal 1: Program services will enable students who have had difficulty in traditional classroom settings to succeed academically
    Goal 2: Fewer students referred to the Alternative Education Program will drop out of school
  SubProgram 8: At-Risk
    Goal 1: Kindergarten students in Title 1A schools will meet or exceed State performance standards
    Goal 2: Students in grades 4 and 8 attending schools receiving Title 1A funding will meet or exceed State performance standards
    Goal 3: Students in grades 9-12 attending schools that receive Title 1A funds will meet or exceed State performance standards

Program 3: Education Support
  SubProgram 1: School Transportation
    Goal 1: Reduce the number of school bus accidents per 100,000 miles
  SubProgram 2: School and Community Nutrition
    Goal 1: Students will eat nutritious meals at Georgia schools
STATE BOARD OF EDUCATION
Results-Based Budgeting
REGULAR EDUCATION
Purpose: Ensure that Georgia's K-12 students are academically prepared for their futures in the 21st century.

ACADEMIC ACHIEVEMENT (Subprogram)
Purpose: Ensure that Georgia's K-12 students are academically prepared for further education and the workplace by providing leadership and support to initiate, promote, enhance, and communicate curriculum and programs of study in all academic areas for education and the general public.
Goal 1: Students will be adequately prepared for further education and the workforce.
Desired Result 1a: Percentage of students requiring learning support courses (remedial coursework) when they enter public colleges and universities [1]

                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       20%       18%       16%       14%       12%       12%
Actual Result - Percentage      23%       22%       21%       20%       [1]
 - Number                     5,368 of  5,318 of  5,200 of  4,891 of
                               23,339    24,063    24,413    24,510

Note 1: FY 2002 Actual Results data for students graduating in FY 2000 and completing their first year of post-secondary education in FY 2001 will be available in FY 2003.
Desired Result 1b: Georgia students' average Scholastic Aptitude Test (SAT) score

                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result                  N/A       975       975      1,000     1,000     1,000     1,000
Actual Result - Georgia         968       969       974       980       980
 - National Average            1,017     1,016     1,019     1,020     1,020
Desired Result 1c: The percentage of students scoring 3 or above on Advanced Placement (AP) exams and the number taking AP courses
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       63%       64%       65%       65%       65%       65%
Actual Result - Percentage     60.0%     54.3%     55.5%     56.2%     59.0%
 - Number                     9,183 of  7,069 of  8,116 of  20,846 of 25,297 of
                               15,305    13,018    14,623    37,092    42,748
Desired Result 1d: Percentage of Kindergarten students who score 161 or higher on the GKAP-R and are promoted to first grade
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       N/A       N/A       N/A       94%       95%       95%
Actual Result - Percentage      N/A       93%       92%       93%       93%
 - Number                       N/A     98,121 of 105,146 of 102,463 of 102,075 of
                                         105,597   114,289   110,175   109,500
Appendix A Page 1
BOARD OF EDUCATION - Results-Based Budgeting
Desired Result 1e: Average score for Georgia's students on the National Assessment of Educational Progress (NAEP)

[Charts compare Georgia's actual NAEP scores with national actual scores and with Georgia's FY 2004 desired scores in four subjects: Mathematics (fourth and eighth grades, FY 1996 and FY 2000), Reading (fourth and eighth grades, FY 1998 and FY 2002), Science (fourth and eighth grades, FY 2000), and eighth grade Writing (FY 1998 and FY 2002).]

Note 1: The mathematics portion of the NAEP is given every four years. In FY 2000, 19% of Georgia's fourth graders and 22% of eighth graders were rated "proficient" or better in math.
Note 2: The program incorrectly reported some of the mathematics scores for the FY 2003 Governor's Budget Report; these scores have been corrected.
Note 3: The reading portion of the NAEP is given every four years. In FY 1998, 29% of Georgia's fourth graders and 26% of eighth graders were rated "proficient" or better in reading. FY 2002 reading Actual Results data are not yet available.
Note 4: The Science and Writing portions of the NAEP are given every four years. In FY 2000, 26% of fourth graders and 23% of eighth graders were rated "proficient" or better in Science; in FY 1998, 24% of Georgia's eighth graders were rated "proficient" or better in writing. FY 2002 Actual Results for eighth grade writing are not yet available.
Desired Result 1f: Percentage of students who have enrolled in postsecondary education within one year of graduation [1] [2]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Total          N/A       N/A       N/A       N/A       45%       47%       48%
Actual Result - Total           N/A       N/A      45.2%     44.7%     44.9%
 - Number                       N/A       N/A    29,268 of    N/A    30,484 of
                                                  64,752              67,896
 - DTAE Institution             N/A       N/A      7.7%      7.4%      8.8%
 - BOR Institution              N/A       N/A      37.5%     37.3%     36.1%

Note 1: There is a one year lag in data. For example, FY 2002 Actual Results show the percentage of students graduating from high school in FY 2000 and enrolling in either a DTAE or BOR institution sometime during FY 2001.
Note 2: Actual Results do not include students who have enrolled in private or out-of-state institutions.
Desired Result 1g: Percentage of students graduating from high school within 4 years of entering 9th grade [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       N/A       N/A       N/A       N/A       72%       74%
Actual Result - Percentage      N/A       N/A       N/A      70.7%     71.1%
 - Number                       N/A       N/A       N/A       N/A       N/A

Note 1: There is a one year lag in data. For example, FY 2002 Actual Results are based on FY 2001 data.
Desired Result 1h: Percentage of high school graduates earning each of the six types of diplomas offered by Georgia's public schools [1] [2] [3]

[A stacked bar chart shows, for FY 1999 through FY 2002, the percentage of graduates earning each diploma type: College Prep (48%, 50%, 49%, and 49%), Combination College/Technical, Technical, General, Special Ed., and Certificate of Attendance. Total diplomas awarded: FY 1999 - 61,004; FY 2000 - 64,199; FY 2001 - 67,896; FY 2002 - 68,215.]

Note 1: There is a one year lag in data.
Note 2: The program did not provide Desired Results for FY 2003 and FY 2004.
Note 3: Some totals may not add to 100% due to rounding.
Goal 2: Students will be proficient in English/language arts.
Desired Result 2a: Average percentile ranking in English/language arts on norm-referenced tests [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
                                ITBS      ITBS      ITBS     STAT-9    STAT-9    STAT-9    STAT-9
Desired Result - 3rd Grade      N/A       63%       65%       65%       69%       72%       N/A
 - 5th Grade                    N/A       62%       64%       64%       68%       71%       N/A
 - 8th Grade                    N/A       59%       61%       62%       65%       69%       N/A
Actual Result - 3rd Grade       62%       64%       63%       61%       N/A
 - 5th Grade                    61%       63%       62%       61%       N/A
 - 8th Grade                    58%       60%       60%       63%       N/A

Note 1: In FY 2001, the Georgia public schools replaced the Iowa Test of Basic Skills (ITBS) with the STAT-9 (Stanford-9). STAT-9 data for FY 2002 are not usable.
Desired Result 2b: Percentage of students passing the English/language arts portion of Georgia's Criterion Referenced Competency Test the first time they take the test [1] [2]

[Charts show, by grade (1-8), the percentage of students passing the English/language arts CRCT: Actual Results (FY 2000 through FY 2002 for grades 4, 6, and 8; FY 2002 for grades 1-3, 5, and 7) and Desired Results for FY 2002 and FY 2003.]

Note 1: The State of Georgia began administering the CRCT to the 4th, 6th, and 8th grades in FY 2000.
Note 2: The Department of Education did not provide Desired Results for FY 2004.
Desired Result 2c: Average verbal SAT score of Georgia students compared to the national average [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Georgia        N/A       493       493       500       500       502       502
Actual Result - Georgia         486       487       488       491       489
 - National Average             502       502       501       506       504

Note 1: The numbers represent scaled scores on a scale of 200 to 800.
Goal 3: Students will be proficient in mathematics.
Desired Result 3a: Average percentile ranking in mathematics on norm-referenced tests [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
                                ITBS      ITBS      ITBS     STAT-9    STAT-9    STAT-9    STAT-9
Desired Result - 3rd Grade      N/A       61%       65%       65%       50%       55%
 - 5th Grade                    N/A       58%       64%       64%       55%       60%
 - 8th Grade                    N/A       55%       61%       62%       50%       55%
Actual Result - 3rd Grade       61%       61%       62%       42%       N/A
 - 5th Grade                    58%       59%       59%       51%       N/A
 - 8th Grade                    55%       56%       57%       42%       N/A

Note 1: In FY 2001, the Georgia public schools replaced the Iowa Test of Basic Skills (ITBS) with the STAT-9 (Stanford-9). STAT-9 data for FY 2002 are not usable.
Desired Result 3b: Percentage of students meeting expectations on the mathematics portion of Georgia's Criterion Referenced Competency Test the first time they take the test [1] [2]
[Charts show, by grade (1-8), the percentage of students meeting expectations on the mathematics CRCT: Actual Results (FY 2000 through FY 2002 for grades 4, 6, and 8; FY 2002 for grades 1-3, 5, and 7) and Desired Results for FY 2002 and FY 2003.]

Note 1: The State of Georgia began administering the CRCT to the 4th, 6th, and 8th grades in FY 2000.
Note 2: The Department of Education did not provide Desired Results for FY 2004.
Desired Result 3c: Average mathematics SAT score of Georgia students compared to the national average [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Georgia        N/A       493       493       500       500       500       500
Actual Result - Georgia         482       482       486       489       491
 - National Average             509       508       510       514       516

Note 1: The numbers represent scaled scores on a scale of 200 to 800.
Goal 4: Students will be proficient in science.
Desired Result 4a: Percentage of students meeting expectations on the science portion of Georgia's Criterion Referenced Competency Test the first time they take the test [1] [2]
CRCT - Science, Actual Results, FY 2002
Fourth Grade: 80%    Fifth Grade: 83%    Sixth Grade: 83%    Seventh Grade: 82%    Eighth Grade: 76%

Note 1: FY 2002 was the first year in which students took the science portion of the CRCT.
Note 2: The Department of Education did not provide Desired Results for FY 2003 and FY 2004.
Desired Result 4b: Average percentile ranking in science on norm-referenced tests (ITBS and STAT-9) [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
                                ITBS      ITBS      ITBS     STAT-9    STAT-9    STAT-9    STAT-9
Desired Result - 3rd Grade      N/A       61%       65%       65%       69%       73%
 - 5th Grade                    N/A       58%       64%       64%       68%       72%
 - 8th Grade                    N/A       55%       61%       62%       65%       69%
Actual Result - 3rd Grade       59%       59%       59%       44%       N/A
 - 5th Grade                    59%       60%       60%       48%       N/A
 - 8th Grade                    55%       56%       56%       46%       N/A

Note 1: In FY 2001, the Georgia public schools replaced the Iowa Test of Basic Skills (ITBS) with the STAT-9 (Stanford-9). STAT-9 data for FY 2002 are not usable.
Goal 5: Students will be proficient in social studies.
Desired Result 5a: Percentage of students meeting expectations on the social studies portion of Georgia's Criterion Referenced Competency Test the first time they take the test [1] [2]

CRCT - Social Studies, Actual Results, FY 2002
Fourth Grade: 84%    Fifth Grade: 82%    Sixth Grade: 81%    Seventh Grade: 82%    Eighth Grade: 80%

Note 1: FY 2002 was the first year in which students took the Social Studies portion of the CRCT.
Note 2: The Department of Education did not provide Desired Results for FY 2003 and FY 2004.
Desired Result 5b: Average percentile ranking in social studies on norm-referenced tests [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
                                ITBS      ITBS      ITBS     STAT-9    STAT-9    STAT-9    STAT-9
Desired Result - 3rd Grade      N/A       63%       65%       67%       55%       60%
 - 5th Grade                    N/A       62%       64%       66%       60%       65%
 - 8th Grade                    N/A       59%       61%       63%       60%       65%
Actual Result - 3rd Grade       62%       64%       65%       51%       N/A
 - 5th Grade                    61%       63%       64%       56%       N/A
 - 8th Grade                    58%       60%       62%       54%       N/A

Note 1: In FY 2001, the Georgia public schools replaced the Iowa Test of Basic Skills (ITBS) with the STAT-9 (Stanford-9). STAT-9 data for FY 2002 are not usable.
READING (Subprogram)
Purpose: Improve the reading ability of all students by developing and implementing a program of reading instruction that focuses on research-based instructional practices.
Goal 1: Improve students' reading and comprehension abilities.
Desired Result 1: Percentage of students meeting or exceeding requirements to pass the Criterion-Referenced Competency Test (CRCT) for reading [1]
[Charts show, by grade (1st through 8th), the percentage of students meeting or exceeding the requirements to pass the reading CRCT: Actual Results (FY 2001 and FY 2002 for grades 4, 6, and 8; FY 2002 for grades 1-3, 5, and 7) and Desired Results for FY 2002 through FY 2004.]

Note 1: FY 2002 was the first year CRCT scores were reported for students in grades 1-3, 5, and 7.
FOREIGN LANGUAGE (Sub-program)
Purpose: Ensure that Georgia's K-12 students are academically prepared for further education and the workplace by providing them with an extended sequence of foreign language study.
Goal 1: Students who have had an extended sequence of foreign language study will perform better on the verbal and math portions of the SAT than their peers who have studied a foreign language for less than one year.

Desired Result 1a: Mean verbal SAT scores of Georgia students who have had two, three, four, or more years of a foreign language compared to students taking less than one year of a foreign language [1]
                                     FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - More than 4 years     N/A       N/A       560       565       568
 - 4 years                             N/A       N/A       555       560       553
 - 3 years                             N/A       N/A       495       500       522
 - 2 years                             N/A       N/A       460       465       475
Actual Result - More than 4 years      582       N/A       568
 - 4 years                             582       526       553
 - 3 years                             527       477       522
 - 2 years                             478       449       475
 - Less than 1 year                    390       393       443
Georgia Average                        488       491       489

This is a new measure; thus, there are no Desired Results prior to FY 2002 and no Actual Results prior to FY 2000.

Note 1: The mean verbal SAT score for all Georgia students taking the SAT (Georgia Average) includes both students taking and not taking a foreign language.
Desired Result 1b: Mean math SAT scores of Georgia students who have had two, three, four, or more years of a foreign language compared to students taking less than one year of a foreign language [1]

                                     FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - More than 4 years     N/A       N/A       N/A       N/A       580
 - 4 years                             N/A       N/A       N/A       N/A       556
 - 3 years                             N/A       N/A       500       505       524
 - 2 years                             N/A       N/A       465       470       476
Actual Result - More than 4 years      572       N/A       580
 - 4 years                             558       526       556
 - 3 years                             524       475       524
 - 2 years                             475       443       476
 - Less than 1 year                    387       389       439
Georgia Average                        488       491       489

This is a new measure; thus, there are no Desired Results prior to FY 2002 and no Actual Results prior to FY 2000.

Note 1: The mean math SAT score for all Georgia students taking the SAT (Georgia Average) includes both students taking and not taking a foreign language.
Goal 2: Students who study foreign language in the Georgia ESFL Model Program will be proficient in speaking a second language.
Desired Result 2a: The mean oral fluency ratings for third and fifth graders in the program compared to the mean oral fluency ratings for kindergarten students [1]

                              FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Kindergarten    N/A       1.5       1.5       1.5
 - 3rd grade                     N/A       2.5       2.5       2.5
 - 5th grade                     N/A       3.5       3.5       3.5
Actual Result - Kindergarten    1.64       N/A
 - 3rd grade                    2.74       N/A
 - 5th grade                    3.87       N/A

This is a new measure; thus, there are no Desired Results prior to FY 2002 and no Actual Results prior to FY 2001. An evaluation (which is the source of Actual Results data) was not conducted in FY 2002.

Note 1: The improvements between the kindergarten assessments and the 3rd and 5th grade assessments are statistically significant.
TECHNOLOGY/CAREER (VOCATIONAL) EDUCATION (Subprogram)
Purpose: Provide quality programs and services that enable Georgia's secondary students to develop the knowledge and skills needed to successfully transition to postsecondary programs and to enter career areas in rapidly changing workplace environments.
Goal 1: Increase the academic achievement of secondary students in Technology/Career (Vocational) Education programs.
Desired Result 1a: The percentage of students with technology/career and dual diploma seals passing the Georgia high school test in 1-5 attempts [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       N/A      67.53%    67.53%    68.03%    68.53%    69.03%
Actual Result - Percentage      [1]       [1]      67.03%    75.35%   (available January 2003)
 - Number                       [1]       [1]     8,649 of  20,988 of
                                                   12,903    27,855

Note 1: Prior to FY 2000, this data could not be disaggregated to show only Technology/Career students; FY 2002 data were not disaggregated before the publication of the Governor's FY 2004 State Budget Report.
Desired Result 1b: The percentage of students achieving a "C" and above or "Satisfactory" grades in vocational courses (post-secondary non-developmental) [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       N/A       N/A      85.87%    86.37%    86.87%    87.37%
Actual Result - Percentage      N/A      84.87%    85.37%    86.13%   (available January 2003)
 - Number                       N/A       N/A     33,852 of 31,012 of
                                                   39,653    36,008

Note 1: FY 2002 data were not disaggregated before the publication of the FY 2004 State Budget Report.
Goal 2: The number of students who graduate from high school with credentials to succeed in post-secondary education will increase.

Desired Result 2a: The percentage of students that have the credentials to enter college or other post-secondary education because they have earned a dual career/tech and college prep seal [1]

                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result                                                          75%       75%       75%
Actual Result                   66%       70%       72%       87%

Note 1: This is a new measure; thus, there are no Desired Results prior to FY 2002. FY 2002 Actual Results data will be available in January 2003.
Goal 3: Increase the vocational/technical skill proficiencies of students in technology career education.
Desired Result 3a: Percentage of students successfully completing four or more courses in a concentrated vocational program area who receive or qualify to receive a dual diploma or technology career seal [1]

                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     [1]       [1]      77.49%    77.50%    77.99%    78.49%    78.99%
Actual Result - Percentage      [1]       [1]      86.13%    78.07%   (available January 2003)
 - Number                       [1]       [1]    12,334 of  27,930 of
                                                  14,321     35,776

Note 1: Prior to FY 2000, this data could not be disaggregated to show only Technology/Career students; FY 2002 data were not disaggregated before the publication of the FY 2004 State Budget Report.
Desired Result 3b: Percentage of employers satisfied with students who complete youth apprenticeship and other structured work-based learning programs [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       95%      96.5%      95%       95%       95%       95%
Actual Result - Percentage      N/A      95.4%     96.1%     95.9%     99.2%
 - Number                       N/A     599 of    621 of    621 of    640 of
                                          628       646       647       645
Goal 4: Students graduating with college prep/technology prep seal or technology/career prep seal will be prepared for the workforce and higher education.
Desired Result 4a: Percentage of graduates with college prep/technology prep seal or technology/career prep seal who are employed, enrolled in a post-secondary institution, or in the military within 3 months of graduation [1]

                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       N/A      67.5%     68.1%     69.0%     69.5%     70.0%
Actual Result - Percentage      N/A       N/A      94.3%     87.5%    (available January 2003)
 - Number                       N/A       N/A    22,607 of  23,362 of
                                                  23,985     26,706

Note 1: FY 2002 data were not disaggregated before the publication of the Governor's FY 2004 State Budget Report.
AGRICULTURAL EDUCATION (Subprogram)
Purpose: Provide students with personal, managerial, and academic skills for employment in the agriculture industry and successful entry into a postsecondary program.
Goal 1: Promote the development of agricultural competency and academic skills.
Desired Result 1a: Average academic performance in Mathematics of the agriculture completers on the High Schools that Work Assessment (National Assessment of Education Progress - NAEP ) [1]
                                        FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Agriculture completers    N/A       [1]       300       [1]       301       [1]       297
Actual Result - Agriculture completers    295.3      [1]       298       [1]       290
 - All students                           298.1      [1]      297.3      [1]       293

Note 1: 6,302 Technology/Career Education students in the High Schools That Work program participated in the NAEP's FY 2002 assessment; 378 (6%) were identified as agricultural completers.
Desired Result 1b - Average academic performance in Science of the agriculture completers on the High Schools That Work Assessment (National Assessment of Education Progress - NAEP ) [1]
                                        FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Agriculture completers    N/A       [1]       295       [1]       296       [1]       299
Actual Result - Agriculture completers    287.3      [1]      292.8      [1]       286
 - All students                           289.7      [1]      286.5      [1]       286

Note 1: See Note 1, Desired Result 1a.
Goal 2: Agriculture education students will find jobs in their field of study or enroll in postsecondary education.
Desired Result 2a: Percentage of agriculture education students employed in agricultural-related jobs [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       [1]       30%       31%       33%       33%       30%
Actual Result - Percentage      [1]       [1]      31.6%     27.5%      27%
 - Number                       [1]       [1]    1,058 of   792 of    877 of
                                                   3,344     2,880     3,251

Note 1: Fifty-two percent of the schools with programs in agricultural education that were surveyed for this data responded.
Desired Result 2b: Percentage of agriculture education students who enroll in post-secondary education [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       N/A       30%       31%       33%       33%       35%
Actual Result - Percentage      [1]       [1]      33.6%     40.7%      39%
 - Number                       [1]       [1]    1,124 of  1,172 of  1,258 of
                                                   3,344     2,880     3,251

Note 1: Fifty-two percent of the schools with programs in agricultural education that were surveyed for this data responded.
Program Fund Allocation:      Total Funds        State Funds
FY 2002 Actual               $4,514,879,246     $4,207,569,978
FY 2003 Budget               $4,636,707,587     $4,320,085,049
FY 2004 Recommended          $4,649,776,039     $4,334,359,820
EXCEPTIONAL STUDENTS
GIFTED AND TALENTED STUDENTS (Subprogram)
Purpose: Provide Georgia's gifted and talented students with appropriately challenging and enriching educational opportunities that are designed to encourage them to meet their full academic potential and assist them in the acquisition of the skills, knowledge, and attitudes necessary to become independent, life-long learners.
Goal 1: Students who participate in the Governor's Honors Program (GHP) will be empowered to take charge of their own learning.
Desired Result 1: Percentage of GHP students that reported their experiences during the summer contributed "a lot" or "totally" to their being able to turn future learning experiences to their advantage [1]
                              FY 1998   FY 1999   FY 2000   FY 2001   FY 2002   FY 2003   FY 2004
Desired Result - Percentage     N/A       86%       86%       86%       86%       86%       86%
Actual Result - Percentage      79%       90%       86%       89%      82.5%
 - Number                      94 of     76 of     90 of    105 of    94 of
                                 119        84       105       118      114

Note 1: The 675 GHP students were sorted by 16 major areas of instruction; every 7th student was selected for the survey.
Goal 2: Students who participate in Georgia's Programs for Gifted Students will excel academically, demonstrating exceptional performance on measures of mastery of Quality Core Curriculum standards in their areas of strength, as well as advanced research and communication skills, and creative thinking and creative problem solving skills.
Desired Result 2a: The percentage of gifted students in grades 1-8 who have received gifted education services in a content area and exceed expectations on that portion of the Georgia Criterion-Referenced Competency Test

FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
The Department's current data collection system does not provide this data.
Desired Result 2b: The percentage of gifted students in grades 9-12 who have received gifted education services in a content area and exceed expectations on the end-of-course tests for those courses

FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
End-of-course tests are being developed and should be in place for FY 2003.
Desired Result 2c: The percentage of gifted students in grades 9-12 who have received gifted education services in a College Board Advanced Placement (AP) class and score three, four, or five on that AP exam

FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
The Department's current data collection system does not provide this data.
Desired Result 2d: The percentage of gifted students in grades 9-12 who have received gifted education services in an International Baccalaureate (IB) class and score a five, six, or seven on the IB exam

FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
The Department's current data collection system does not provide this data.
Desired Result 2e: The percentage of gifted students who have participated in gifted education classes for at least two years and demonstrate skills in critical and creative thinking, logical and creative problem solving, research, and communication, as evidenced by the development of innovative products and performances that reflect individual initiative and are advanced in relation to students of similar age, experience, or environment

FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
Although these data are not collected now, program representatives are trying to identify valid measures for this desired result.
EARLY INTERVENTION PROGRAM (Subprogram)
Purpose: Raise achievement of students who are below grade level to grade level achievement.
Goal 1: Raise achievement level of Kindergarten students who are in the Early Intervention Program.
Desired Result 1a: Percentage of students in the program who pass the Georgia Kindergarten Assessment Program (GKAP-R) sometime during kindergarten
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
These data are not currently collected.
Desired Result 1b: Percentage of students in the Early Intervention Program who either meet or exceed the requirements of the Criterion Referenced Competency Test (CRCT) [1]
[Charts compare, by subject (English, Reading, and Math) and grade (1st through 5th), the percentage of Early Intervention Program students meeting or exceeding CRCT requirements (EIP Desired and Actual Results) with the Actual Results for all students.]

Note 1: FY 2002 is the first year for which these data were reported.
REMEDIAL EDUCATION PROGRAM (Subprogram)
Purpose: Raise the achievement of students in grades 9-12 who are below grade level to grade-level achievement.
Goal 1: Students who participate in the Remedial Education Program will complete high school.
Desired Result 1: Percentage of students in the Remedial Education Program that graduate from high school within four years
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
The Department does not collect these data.
Appendix A Page 11
BOARD OF EDUCATION - Results-Based Budgeting
SPECIAL EDUCATION (Subprogram)
Purpose: Ensure that all students with disabilities have available to them a free, appropriate public education that emphasizes access to the general education curriculum and provides special education and related services designed to meet their unique needs and to provide the opportunity to develop into productive, successful citizens.
Goal 1: Students with disabilities will successfully transition to post-secondary education or the workplace.
Desired Result 1a: Percentage of students with disabilities who attend post-secondary education programs [1]
                            FY 1998  FY 1999  FY 2000  FY 2001  FY 2002  FY 2003  FY 2004
Desired Result
- % Total                   N/A      7%       9%       23%      26%      29%      33%
- % University or College   N/A      N/A      N/A      17%      18%      19%      21%
- % Technical College       N/A      N/A      N/A      6%       8%       10%      12%
Actual Result
- % Total                   5%       N/A      20.37%   22.39%
- Number                    N/A      N/A      119      890
- % University or College   N/A      N/A      15.75%   11.14%
- Number                    N/A      N/A      92       443
- % Technical College       N/A      N/A      4.62%    11.25%
- Number                    N/A      N/A      27       447
Note 1: This information, which includes 3,975 students with disabilities who graduated during the FY 2001 school year, was collected in a survey of all school systems in Georgia.
Desired Result 1b: Percentage of students with disabilities who are employed within 12 months of exiting school [1]
                             FY 1998  FY 1999  FY 2000  FY 2001  FY 2002           FY 2003  FY 2004
Desired Result - Percentage  N/A      6%       8%       64%      65%               66%      52%
Actual Result - Percentage   4%       N/A      63%      49.78%   Available 3/2003
- Number                     N/A      N/A      367      1,630
Note 1: While FY 2000 Actual Results were based upon a survey of 25% of local school systems, FY 2001 and future results will be based on data from all Georgia school systems. The FY 2004 Desired Result has been adjusted downward to reflect more comprehensive information.
Goal 2: Students with disabilities will succeed academically.
Desired Result 2a: Percentage of students with disabilities ages 17-22 who earn a regular education diploma [1] [2]
[Bar charts: Desired and Actual Results, FY 2001-FY 2004, by disability type. Mild Intellectual Disabilities: values ranged from 37% to 44% (actual counts of 108 of 291 and 2,128 of 5,320). Severe Intellectual Disabilities: 9% to 16% (52 of 366 and 54 of 577). Sensory Disabilities: 56% to 62% (56 of 95 and 62 of 110).]
Note 1: Previously, the percentage of students with disabilities who earned regular diplomas was not shown by disability level and type. Note 2: FY 2002 Actual Results are based on FY 2001 data.
Appendix A Page 12
BOARD OF EDUCATION - Results-Based Budgeting
Desired Result 2b: Percentage of students with disabilities ages 14-22 who drop out of school [1]
                       FY 1998  FY 1999  FY 2000         FY 2001         FY 2002         FY 2003  FY 2004
Desired Result
- Percentage Mild      N/A      N/A      N/A             N/A             47%             45%      30.00%
- Percentage Severe    N/A      N/A      N/A             N/A             15.5%           14.5%    12.5%
- Percentage Sensory   N/A      N/A      N/A             N/A             28.0%           27%      20.0%
Actual Result
- Percentage Mild      N/A      N/A      46.44%          48.30%          33.23%
- Number               N/A      N/A      2,982 of 6,421  3,357 of 6,950  2,471 of 7,437
- Percentage Severe    N/A      N/A      24.64%          16.28%          14.80%
- Number               N/A      N/A      152 of 617      97 of 596       91 of 615
- Percentage Sensory   N/A      N/A      19.80%          29.09%          15.13%
- Number               N/A      N/A      20 of 101       32 of 110       18 of 119
Note 1: FY 2002 Actual Results are based on FY 2001 data.
Desired Result 2c: Percentage of students with intellectual or sensory disabilities meeting expectations on the reading portion of Georgia's Criterion Referenced Competency Test the first time they take the test
[Bar chart: CRCT Reading, Mild Intellectual Disability; FY 2000-FY 2002 Actual and FY 2003-FY 2004 Desired Results by grade (1st, 4th, 6th, 8th, All Grades); values ranged from 28% to 67%.]
Desired Result 2d: Percentage of students with intellectual or sensory disabilities meeting expectations on the English/Language Arts portion of Georgia's Criterion Referenced Competency Test the first time they take the test
[Bar chart: CRCT English/Language Arts, Mild Intellectual Disability; FY 2000-FY 2002 Actual and FY 2003-FY 2004 Desired Results by grade (1st, 4th, 6th, 8th, All Grades); values ranged from 18% to 62%.]
[Bar chart: CRCT Reading, Severe Intellectual Disability; FY 2000-FY 2002 Actual and FY 2003-FY 2004 Desired Results by grade (1st, 4th, 6th, 8th, All Grades); values ranged from 20% to 49%.]
[Bar chart: CRCT English/Language Arts, Severe Intellectual Disability; FY 2000-FY 2002 Actual and FY 2003-FY 2004 Desired Results by grade (1st, 4th, 6th, 8th, All Grades); values ranged from 19% to 48%.]
[Bar chart: CRCT Reading, Sensory Disability; FY 2000-FY 2002 Actual and FY 2003-FY 2004 Desired Results by grade (1st, 4th, 6th, 8th, All Grades); values ranged from 40% to 62%.]
[Bar chart: CRCT English/Language Arts, Sensory Disability; FY 2000-FY 2002 Actual and FY 2003-FY 2004 Desired Results by grade (1st, 4th, 6th, 8th, All Grades); values ranged from 34% to 63%.]
Appendix A Page 13
Desired Result 2e: Percentage of students meeting expectations on the mathematics portion of Georgia's Criterion Referenced Competency Test the first time they take the test
[Bar charts: CRCT Mathematics, FY 2000-FY 2002 Actual and FY 2003-FY 2004 Desired Results by grade (1st, 4th, 6th, 8th, All Grades), for Mild Intellectual Disability, Severe Intellectual Disability, and Sensory Disability; values ranged from 5% to 65%.]
Goal 3: Children with disabilities will be identified early so that they do not fall behind in school.
Desired Result 3a: Percentage of the total number of children ages three and four identified as children with disabilities and served through an Individualized Education Plan by the local school system [1] [2]
                       FY 1998  FY 1999  FY 2000  FY 2001  FY 2002  FY 2003  FY 2004
Desired Result
- Estimated % age 3    N/A      2%-2.2%  2%-2.2%  4.5%     4.5%     4.5%     3.0%
- Estimated % age 4    [1]      [1]      4.5%     4.5%     4.5%     4.5%     4.5%
Actual Result
- % served age 3       2%       2.18%    2.09%    2.16%
- % served age 4       [1]      [1]      4.71%    4.52%
- Number               2,304    2,474    7,858    8,086
Note 1: Public Health statistics show that between 2% and 2.2% of children have disabilities. Since some types of disabilities do not manifest themselves until children grow older, the national average of students 3 to 5 years of age with disabilities is estimated to be 5.02%.
Note 2: Data collected prior to the FY 2000 school year included only three-year-old students.
Desired Result 3b: Percentage of students with disabilities who pass the GKAP-R and meet academic and behavioral requirements to progress to first grade
                             FY 1998  FY 1999  FY 2000  FY 2001  FY 2002         FY 2003  FY 2004
Desired Result - Percentage  N/A      N/A      N/A      N/A      N/A             80%      82%
Actual Result - Percentage   No historical data available        78%
- Number                                                         6,822 of 8,746
Appendix A Page 14
Goal 4: Students with disabilities will be taught in the regular classroom with their peers to the maximum extent possible.
Desired Result 4: Percentage of the total number of children with disabilities who are able to successfully participate in regular education classrooms [1]
                      FY 1998  FY 1999  FY 2000  FY 2001  FY 2002  FY 2003  FY 2004
Desired Result
- >80% of day         N/A      37.3%    48%      40%      42%      44%      42%
- 40%-80% of day      N/A      N/A      28%      30%      33%      34%      38%
- <40% of day         N/A      N/A      20%      26%      25%      24%      20%
Actual Result
- >80% of day         36.3%    37.4%    38%      36%      37.3%
- Number              53,633   55,269   52,095   56,011   63,526
- 40%-80% of day      N/A      N/A      39%      35%      36.2%
- Number              N/A      N/A      52,604   54,328   61,611
- <40% of day         N/A      N/A      27%      27%      25.1%
- Number              N/A      N/A      40,781   41,356   42,725
Note 1: FY 2002 Actual Results are based on FY 2001 data.
STATE SCHOOLS (Subprogram)
Purpose: Prepare sensory-impaired and multi-disabled students to become productive citizens by providing a learning environment addressing their academic, vocational, and social development.
Goal 1: Students attending state schools will be adequately prepared for successful employment and further education.
Desired Result 1a: Percentage of students who are successfully employed or attending a post-secondary program 12 months following graduation [1]
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
                                     FY 1998  FY 1999   FY 2000  FY 2001   FY 2002   FY 2003  FY 2004
Desired Result - Percentage: Total   N/A      65%       70%      75%       80%       85%      90%
Actual Result - Percentage: Total    N/A      73%       54%      81%       81%
- Number                             N/A      22 of 30  6 of 11  17 of 21  21 of 26
- Percentage: Workforce              N/A      60%       45%      76%       50%
- Number                             N/A      18        5        16        13
- Percentage: Post-secondary         N/A      13%       9%       5%        31%
- Number                             N/A      4         1        1         8
- Percentage: Other/unknown          N/A      NA        5        NA        19%
- Number                             N/A      NA        NA       NA        5
Note 1: FY 2002 Actual Results are based on students graduating between May 2002 and September 2002.
Goal 2: Students will achieve academically.
Desired Result 2a: Percentage of sensory impaired students attending state schools who take the Criterion Referenced Competency Test (CRCT) as determined by statewide testing rule [Indicator] [1]
                             FY 1998  FY 1999  FY 2000  FY 2001   FY 2002   FY 2003  FY 2004
Desired Result - Percentage  N/A      N/A      N/A      N/A       80%       85%      85%
Actual Result - Percentage   N/A      N/A      N/A      62%       41%
- Number                     N/A      N/A      N/A      37 of 60  29 of 71
Note 1: Some students included in these results also have intellectual disabilities that make other learning assessments more appropriate. Future data collections will exclude students with intellectual disabilities from this percentage.
Appendix A Page 15
Desired Result 2b: Percentage of students demonstrating significant improvement in the reading portion of the CRCT
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
FY 2002 data are not yet available.
Desired Result 2c: Percentage of students demonstrating significant improvement in the English/Language Arts portion of the CRCT
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
FY 2002 data are not yet available.
Desired Result 2d: Percentage of students that demonstrate significant improvement in the mathematics portion of the CRCT
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
FY 2002 data are not yet available.
Goal 3: Sensory impaired children will be ready to learn in school.
Desired Result 3b: Percentage of sensory impaired kindergarten students whose families had received early intervention services before the children's third birthday [Interim Indicator]
                             FY 1998  FY 1999  FY 2000  FY 2001  FY 2002   FY 2003  FY 2004
Desired Result - Percentage  N/A      N/A      N/A      N/A      N/A       95%      95%
Actual Result - Percentage   N/A      N/A      N/A      N/A      91%
- Number                     N/A      N/A      N/A      N/A      10 of 11
ENGLISH FOR SPEAKERS OF OTHER LANGUAGES (Subprogram)
Purpose: Assist students whose native language is not English in developing proficiency in the English language sufficient to perform effectively at the currently assigned grade level.
Goal 1: ESOL students will gain sufficient English proficiency to succeed in school.
Desired Result 1a: Students' scores on the Language Assessment Battery Test 12 months after entering the program compared to their initial scores [Proxy Measure]
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
The program does not collect these data. Because many factors affect educational achievement and the families of children in the ESOL program tend to be highly mobile, it is difficult to measure the academic progress of ESOL children. At a minimum, however, the program should begin assessing each child's progress in understanding and communicating in English.
ALTERNATIVE EDUCATION PROGRAM (Subprogram)
Purpose: Facilitate psychological, disciplinary, health, and counseling services enabling students to be successful in their academic, social, emotional, and career development.
Goal 1: Program services will enable students who have had difficulty in traditional classroom settings to succeed academically.
Desired Result 1a: Percentage of Alternative Education Program students in grades 6-12 who pass 100 percent of their English/Language Arts, Math, Science, and Social Studies courses
                             FY 1998  FY 1999  FY 2000  FY 2001          FY 2002            FY 2003  FY 2004
Desired Result - Percentage  N/A      N/A      N/A      N/A              18.7%              19.6%    20.5%
Actual Result - Percentage   N/A      N/A      N/A      17.8%            Data not provided
- Number                     N/A      N/A      N/A      3,301 of 18,573
Appendix A Page 16
Desired Result 1b: Percentage of Alternative Education Program students in grades 6-12 who pass at least 51 percent of their core academic courses (English/Language Arts, Math, Science, and Social Studies)
[Bar chart: FY 2001-FY 2004 Desired and Actual Results; values ranged from 70.0% to 80.8%.]
Goal 2: Fewer students referred to the Alternative Education Program will drop out of school.
Desired Result 2: Percentage of current and former Alternative Education Program students who drop out of school
                             FY 1998  FY 1999  FY 2000  FY 2001          FY 2002          FY 2003  FY 2004
Desired Result - Percentage  N/A      N/A      N/A      N/A              N/A              16.4%    15.6%
Actual Result - Percentage   N/A      N/A      N/A      18.2%            17.3%
- Number                     N/A      N/A      N/A      3,376 of 18,573  5,910 of 34,162
AT-RISK (Subprogram)
Purpose: Children who are most at-risk of school failure will complete and succeed in school.
Goal 1: Kindergarten students in Title 1A schools will meet or exceed State performance standards.
Desired Result 1: Percentage of students attending kindergarten in schools that receive Title 1A funding that pass the GKAP-R and are promoted to first grade compared to kindergartners in all public schools [1] [2] [3]
[Bar chart: Percentage of At-Risk students meeting GKAP-R requirements, FY 2001-FY 2002. Series: Desired Results - Title 1-A Schoolwide; Desired Results - Title 1-A Targeted Assistance; Actual Results - Title 1-A Schoolwide; Actual Results - Title 1-A Targeted Assistance; Actual Results - All Georgia Public Schools. Values ranged from 91% to 96%.]
Note 1: Title 1A serves eligible children who are failing or most at-risk of failing to meet the state's performance standards. Title IA schools with 50% or more poverty may elect to become schoolwide and serve both eligible and non-eligible children. In FY 2000, 54% of Georgia's schools (1,032 of 1,911) were Title 1A schools.
Note 2: Student achievement data reported for targeted assistance schools include all students in the grade assessed; thus, the scores of Title IA students in some schools may be masked by the scores of more numerous other students.
Note 3: The Department of Education provided no explanatory information or disclosures of data limitations for this Desired Result.
Appendix A Page 17
Goal 2: Students in grades 4 and 8 attending schools receiving Title IA funding will meet or exceed State performance standards.
Desired Result 2a: Percentage of students assessed in Title IA schools that are at proficient or better on the reading portion of the CRCT compared to the percentage of students in all public schools [1] [2]
[Bar charts: CRCT Fourth Grade Reading and Eighth Grade Reading [1], FY 2001-FY 2004 Actual and Desired Results [2] for Title IA-Schoolwide, Title IA-Targeted, and All Schools; values ranged from 65% to 84%.]
Note 1: Because DOE's student information system does not track individual students, CRCT scores are reported by school. In some schools, the At-Risk Program serves all students, including high achieving students who are not Title 1-A eligible.
Note 2: Desired Results for the At-Risk Program are projected for all grades combined.
Desired Result 2b: Percentage of students assessed in Title IA schools that are at least proficient on the mathematics portion of the CRCT compared to the percentage of students in all public schools [1] [2] [3]
[Bar charts: CRCT Fourth Grade Mathematics and Eighth Grade Mathematics [1], FY 2001-FY 2004 Actual and Desired Results [2] for Title IA-Schoolwide, Title IA-Targeted, and All Schools; values ranged from 48% to 76%.]
Note 1: See notes 1 and 2 for Desired Result 2a.
Goal 3: Students in grades 9-12 attending schools that receive Title IA funds will meet or exceed State performance standards.
Desired Result 3a: The percentage of students assessed in Title 1A schools that pass the English/Language Arts portion of the Georgia High School Graduation Test the first time they take it compared to students in all public schools
[Bar chart: GHSGT English/Language Arts pass rates, FY 2000-FY 2004 Actual and Desired Results for Title 1A-Schoolwide and Title IA-Targeted schools; values ranged from 89% to 95%.]
[Bar chart: GHSGT Mathematics pass rates, FY 2000-FY 2004 Actual and Desired Results for Title 1A-Schoolwide and Title IA-Targeted schools; values ranged from 84% to 94%.]
Desired Result 3b: The percentage of students assessed in Title 1A schools that pass the mathematics portion of the Georgia High School Graduation Test the first time they take it compared to students in all public schools
Appendix A Page 18
Program Fund Allocation: Total Funds State Funds
FY 2002 Actual $1,875,411,379 $1,363,015,627
FY 2003 Budget $1,926,016,997 $1,399,464,171
FY 2004 Recommended $1,931,445,432 $1,404,088,392
EDUCATION SUPPORT
Purpose: Ensure that all Georgia's K-12 students are able and willing to learn by providing services that support academic achievement.
SCHOOL TRANSPORTATION (Subprogram)
Purpose: To provide safe, cost-effective, and timely transportation to and from Georgia's public schools.
Goal 1: Reduce the number of school bus accidents per 100,000,000 miles.
Desired Result 1a: School bus accidents per 100 million miles
FY 1998 FY 1999 FY 2000 FY 2001 FY 2002 FY 2003 FY 2004
Actual Results data have been unreliable. The Department of Motor Vehicle Safety now handles accident reporting, and preliminary statistics appear more reliable; however, the completeness and integrity of these data will not be verified until the close of FY 2003.
SCHOOL AND COMMUNITY NUTRITION (Subprogram)
Purpose: To deliver healthy foods, meals, and education that contribute to our customers' nutritional well-being and performance at school and work.
Goal 1: Students will eat nutritious meals at Georgia schools.
Desired Result 1a: Percentage of Georgia's public school students K-12 in attendance and choosing a school lunch as their midday meal at school [Proxy Measure]
                             FY 1998  FY 1999  FY 2000  FY 2001  FY 2002  FY 2003  FY 2004
Desired Result - Percentage  N/A      71.2%    72%      73%      73.5%    74%      75%
Actual Result - Percentage   70.9%    73.6%    73.3%    73.8%    74.1%
- Number                     955,144 of 1,346,623; 939,719 of 1,276,163; 949,748 of 1,294,868; 964,098 of 1,305,937; 1,018,699 of 1,373,887
Desired Result 1b: Percentage of economically needy students that choose a school lunch as their midday meal compared to percentage of economically needy students [Proxy Measure] [1]
[Bar chart: Percentage of economically needy students choosing school lunch, FY 1999-FY 2004; values ranged from 80% to 87%, with counts including 512,834 of 589,616; 519,368 of 602,492; 505,119 of 592,787; and 553,068 of 650,624.]
Note 1: Anticipated changes in federal eligibility requirements in FY 2004 will reduce the number and percentages of eligibles identified and served.
Appendix A Page 19
Desired Result 1c: Percentage of schools that, when reviewed on-site or as part of a sample, are certified by the Georgia Department of Education as fully implementing federal nutrition standards, as opposed to 'working toward implementation,' as required by federal regulations [1]
                             FY 1998       FY 1999       FY 2000       FY 2001       FY 2002       FY 2003  FY 2004
Desired Result - Percentage  N/A           16%           16%           20%           30%           40%      40%
Actual Result - Percentage   13%           17%           16%           22%           28%
- Number                     244 of 1,820  305 of 1,820  318 of 1,946  430 of 1,946  556 of 1,969
Note 1: Results are cumulative.
Program Fund Allocation: Total Funds State Funds
FY 2002 Actual $555,677,446 $355,569,294
FY 2003 Budget $570,671,703 $365,077,610
FY 2004 Recommended $572,280,128 $366,283,928
Total - All Programs:
Total Funds State Funds
FY 2002 Actual $6,945,968,070 $5,926,154,898
FY 2003 Budget $7,133,396,287 $6,084,626,829
FY 2004 Recommended $7,153,501,599 $6,104,732,141
Appendix A Page 20
Appendix B Descriptions of Tests
Norm-referenced Tests1 Norm-referenced tests are designed to highlight achievement differences between and among students to produce a rank order of students across a continuum of achievers. The results of norm-referenced tests are generally used to classify students so that they can be placed in remedial or gifted programs. Teachers may also use these results to assign students to different ability-level reading or mathematics instructional groups. With norm-referenced tests, a representative group of students is given the test prior to testing all students. The scores of students who subsequently take the test are compared to those of the norm group.
Examples of norm-referenced tests administered in Georgia include:
Iowa Test of Basic Skills (ITBS): The test measures the performance of Georgia's students in the areas of English/language arts, reading, mathematics, science, and social studies. The test is administered to students in grades three, five, and eight.
Criterion-referenced Tests1 Unlike norm-referenced tests, which rank students, criterion-referenced tests determine what test takers can do and what they know, not how they compare to others. The results of criterion-referenced tests indicate how well students are doing relative to a pre-determined level on a specified set of educational goals or outcomes included in the school, district, or state curriculum. As a result, criterion-referenced tests may be used to determine how well a student is learning the desired curriculum and how well the school is teaching the curriculum.
Examples of criterion-referenced tests administered in Georgia include:
Advanced Placement Exams (AP): Under the Advanced Placement Program, high school students may elect to take college-level courses in one or more of 35 subject areas (such as Biology, Calculus, and Psychology). Upon course completion, students may take the AP exam and, with a "qualifying" grade, may receive college credit or advanced placement.
Georgia Criterion-Referenced Competency Tests (CRCT): The CRCTs are designed to measure student acquisition of skills and knowledge outlined in Georgia's Quality Core Curriculum (QCC). The content areas tested include English/language arts, mathematics, and reading (assessed in grades one through eight), and science and social studies (assessed in grades three through eight).
Georgia High School Graduation Tests (GHSGT): These tests measure basic skills in the areas of reading, writing, mathematics, science, and social studies. The graduation tests are administered for the first time in students' junior year. All students, regardless of the type of diploma or diploma seal they are seeking, must pass the tests in order to graduate.
Appendix B Page 1
Georgia Kindergarten Assessment Program - Revised (GKAP-R): The primary purpose of the GKAP-R is to provide cumulative evidence of a student's readiness for first grade. The test measures 32 Georgia kindergarten Quality Core Curriculum standards using performance-based assessment activities.
National Assessment of Educational Progress (NAEP): The NAEP is designed to obtain achievement data on what American students know and can do. Its primary purpose is to document patterns and trends in student achievement and to inform education policy. It is administered every two years in reading and mathematics in grades four and eight. The selection of schools and students in each state is random.
Scholastic Aptitude Test (SAT): The test is designed to measure verbal and quantitative reasoning skills that are related to academic performance in college. SAT scores are intended primarily to help forecast the college academic performance of individual students.
1Bond, Linda A. (1996). Norm and criterion-referenced testing. Practical Assessment, Research & Evaluation, 5 (2).
Appendix B Page 2
Department of Education Results-Based Budgeting Data
For additional information, please contact Paul E. Bernard, Director, Performance Audit Operations Division, at 404-657-5220
or go to our website: http://www.audits.state.ga.us/internet/pao/rptlist.html