GEORGIA DOT RESEARCH PROJECT 09-03 FINAL REPORT
BEST PRACTICES IN SELECTING PERFORMANCE MEASURES AND STANDARDS
FOR EFFECTIVE ASSET MANAGEMENT
OFFICE OF MATERIALS AND RESEARCH
RESEARCH AND DEVELOPMENT BRANCH
Best Practices in Selecting Performance Measures and Standards for Effective Asset Management
FINAL REPORT
TECHNICAL REPORT STANDARD TITLE PAGE
1. Report No.: FHWA-GA-11-0903
2. Government Accession No.:
3. Recipient's Catalog No.:
4. Title and Subtitle: Best Practices in Selecting Performance Measures and Standards for Effective Asset Management
5. Report Date: June 2011
6. Performing Organization Code:
7. Author(s): Adjo Amekudzi, Ph.D.; Michael Meyer, Ph.D., P.E.
8. Performing Organization Report No.: 09-03
9. Performing Organization Name and Address: Georgia Tech Research Corporation, Georgia Institute of Technology, School of Civil and Environmental Engineering, Atlanta, GA 30332-0355
10. Work Unit No.:
11. Contract or Grant No.: SPR00-0008-00-467
12. Sponsoring Agency Name and Address: Georgia Department of Transportation, Office of Materials & Research, 15 Kennedy Drive, Forest Park, GA 30297-2534
13. Type of Report and Period Covered: Final; January 2009 - June 2011
14. Sponsoring Agency Code:
15. Supplementary Notes: Prepared in cooperation with the U.S. Department of Transportation, Federal Highway Administration.
16. Abstract: This report assesses and provides guidance on best practices in performance measurement, management, and standards setting for effective Transportation Asset Management (TAM). The study was conducted through a literature review, a survey of the 50 state DOTs, an internal assessment of the Georgia Department of Transportation's TAM capabilities and performance measurement and management procedures, and a review of risk applications in TAM with a case study demonstrating the impacts of uncertainty on project prioritization. The study identifies three generations of agencies with respect to performance management maturity. The study recommends: conducting a review of GDOT's performance measurement and management processes and procedures against current standards; benchmarking against similar and more mature state agencies; developing metrics for evaluating progress toward strategic goals; linking performance metrics with resource allocation decisions; developing analytical and data capabilities for evaluating tradeoffs in resource allocation decision making; refining measures for use in broad agency functions; refining performance communication tools; addressing uncertainties in performance metrics and management in TAM; and upgrading existing performance procedures and capabilities to meet state audit requirements.
17. Key Words: Performance Measures, Performance Management, Transportation Asset Management, Risk
18. Distribution Statement:
19. Security Classification (of this report): Unclassified
20. Security Classification (of this page): Unclassified
21. Number of Pages: 37
22. Price:
Form DOT F 1700.7 (8-69)
Best Practices in Selecting Performance Measures and Standards
for Effective Asset Management
FINAL REPORT
Submitted to: Georgia Department of Transportation Angela Alexander, angela.alexander@dot.ga.gov
Submitted by: Georgia Institute of Technology Adjo Amekudzi, Ph.D., adjo.amekudzi@ce.gatech.edu Michael Meyer, Ph.D., P.E., michael.meyer@ce.gatech.edu
July 2011
TABLE OF CONTENTS

List of Tables and Figures
Executive Summary
1. Objectives
2. Methodology
3. Key Messages and Findings
   3.1 Guidelines for Selecting Performance Measures and Targets
   3.2 Best Practices: Performance Measurement in State DOTs
   3.3 Evolution of Asset Management at GDOT
      3.3.1 Transportation Asset Management (TAM) Internal Review
      3.3.2 TAM Peer Exchange
      3.3.3 Development of TAM Program
      3.3.4 Inventory of TAM Tools and Data
4. Uncertainty and Risk in TAM
   4.1 Risk and TAM
   4.2 Uncertainty and Risk
   4.3 Risk Assessment and Risk Management
   4.4 Risk Attitudes
   4.5 Risk Applications in TAM
      4.5.1 Risk Identification for Coastal Roadways
      4.5.2 Risk Matrix for Projects
      4.5.3 Risk Analysis for Bridge Prioritization
      4.5.4 Risk Analysis for Asset Prioritization
      4.5.5 Risk Analysis for Bridge Prioritization and Inspections
   4.6 Applying Multiple Attribute Decision Making (MADM) Methodology to Prioritize Georgia Bridges
5. The Performance Resource Catalogue
6. Developing a Pipeline of Transportation Professionals
7. Conclusions and Recommendations
References
Appendices
   Appendix 1: Literature Review Synthesis
   Appendix 2: Performance Measurement in State DOTs
   Appendix 3: Performance Survey of State DOTs
   Appendix 4: GDOT Inventory of TAM Tools
   Appendix 5(a): Update to the U.S. Domestic TAM Scan
   Appendix 5(b): GDOT/UDOT/Indiana DOT TAM Peer Exchange Report
   Appendix 6: Effects of Performance Uncertainty on TAM
   Appendix 7: A Resource Catalogue for Transportation Performance Management
LIST OF TABLES AND FIGURES

TABLES

TABLE 1: Generational Model of Performance Management
TABLE 2: Summary of GDOT TAM Tools and Data
TABLE 3: Values for Calculating Likelihood of Risk Events (England DfT)
TABLE 4: Sample Risk Severity Zones (Edmonton, Canada)
TABLE 5: GDOT Bridge Prioritization Formula -- Parameter Descriptions/Values

FIGURES

FIGURE 1: Overview of Transportation Asset Management
FIGURE 2: Transportation Asset Management: Resource Allocation and Utilization
FIGURE 3: Information Reporting Hierarchy at VicRoads, Victoria, Australia
FIGURE 4: Risk Severity vs. Replacement Value Chart (Edmonton, Canada)
FIGURE 5: Highway Bridge Risk Universe
EXECUTIVE SUMMARY
This report presents the results of a study that assesses and provides guidance on performance measures and standards for effective Transportation Asset Management (TAM). Performance measures are defined as indicators of system effectiveness and efficiency. Asset Management is the combination of management, financial, economic, engineering and other practices applied to physical assets with the objective of providing the required level of service in the most cost-effective manner. Performance measurement and management are therefore critical components of an effective TAM system. TAM and performance management are both evolving practices, meaning that applications and best practices in these fields continue to expand and improve systematically over time.
The study was conducted through a literature review; a survey of the 50 state DOTs for current and best practices in performance measurement and management in TAM; an internal review of GDOT's present TAM capabilities and performance measurement and management procedures; and a review of risk applications in TAM, followed by a case study demonstrating how uncertainty can be incorporated in project prioritization to enhance prioritization outcomes.
The study findings show that performance measurement alone is not sufficient for effective TAM: performance metrics must also be applied in resource allocation decision making to manage agencies toward achieving their strategic goals consistently. Agencies with effective TAM programs have fewer, clearer strategic goals that are linked with performance measures (including outcome measures), for which metrics are developed and used in resource allocation decisions.
As performance management is an evolving practice, agencies are at different levels of maturity in measuring and managing performance. The study distinguishes First-Generation or "Traditional" agencies (with a large number of measures that are not strategically aligned); Second-Generation or "Hierarchy of Measurement" agencies (with many measures tracking system performance and organizational process improvement for specific program and project decision-making purposes, but not usually linked meaningfully to other agency processes); and Third-Generation or "Catalyst-Driven" agencies (that use lessons learned to refine practices and have developed the flexibility to retool and adapt an established system in response to changing agency priorities and external pressures). Communicating performance effectively to external stakeholders (i.e., the general public, the legislature and the media) is critical. Effective performance communication within the agency is also critical for achieving strategic objectives.
The Georgia Department of Transportation (GDOT) has developed four strategic goals and is in the process of developing performance measures to evaluate and manage progress toward strategic objectives. GDOT has multiple infrastructure management tools (such as pavement management, bridge information management, and maintenance management systems) with supporting data that will be helpful for generating performance metrics. The study recommends the following: (i) performance benchmarking against other state DOTs; (ii) developing metrics to evaluate progress toward strategic objectives; (iii) linking performance metrics with resource allocation decision making and developing data and analytical capabilities for evaluating tradeoffs; (iv) refining metrics for use in broader agency functions (e.g., planning and management, operations and design/engineering); (v) refining performance reports to be more effective communication tools; (vi) addressing uncertainties in performance management to improve the quality of performance outcomes data; and (vii) understanding the requirements of state performance audits in order to proactively address gaps in current performance procedures.
1. OBJECTIVES
"The real value of performance measurement is in the development of an improved decision-making and investment process, not the achievement of many arbitrary short-term targets." - USDOT International Scan on Performance, 2004
This report presents the results for the research study "Best Practices in Selecting Performance Measures and Standards for Effective Asset Management," sponsored by the Georgia Department of Transportation (GDOT) and conducted from January 2009 to June 2011. The objectives of this study were to assess and provide guidance on factors influencing the selection of performance measures and standards for effective Transportation Asset Management (TAM). The report summarizes the key messages and findings of the study. Companion appendices include supporting deliverables that can be referenced for additional detail.
Performance measures are defined as indicators of system effectiveness and efficiency. State Departments of Transportation (DOTs), including GDOT, have long used performance measurement for analyzing system processes, outputs and outcomes as part of the engineering and planning disciplines. However, the focus has largely grown and shifted from performance measurement to performance management during the period of this study. According to NCHRP Report 666, performance management is the regular, ongoing process of selecting measures; setting targets; using measures in decision making; and reporting achievement, leading to the development of a culture of performance throughout the agency (CS and HS Consulting, 2010). Performance management thus goes beyond performance measurement to link metrics to resource allocation decision making in order to enable agencies to achieve their strategic objectives. This report adopts the broader perspective of performance management, which is necessary for effective Asset Management.
Asset Management is the combination of management, financial, economic, engineering and other practices applied to physical assets with the objective of providing the required level of service in the most cost-effective manner (NCHRP/AASHTO 2010). Similarly, the AASHTO TAM Strategic Plan defines TAM as a "strategic and systematic process of operating, maintaining, upgrading and expanding physical assets effectively throughout their life cycle" (AASHTO Asset Management Strategic Plan). Asset Management, like performance management, is an evolving practice, meaning that the current status and best practices of Asset Management (and of performance measurement and management) expand and improve systematically over time. The current standard for Transportation Asset Management is contained in such documents as the AASHTO¹ Transportation Asset Management Guide Vols. 1 and 2 and the International Infrastructure Management Manual, and the best practices at any point in time can be identified through studies. Figures 1 and 2 depict the key roles of performance in TAM. Figure 1 highlights the importance of performance measures in condition assessment, performance modeling and prediction, and project prioritization. Figure 2 depicts TAM as a resource allocation and utilization process, showing the role of performance in setting policy goals and objectives, allocating resources through planning, programming and program delivery, evaluating tradeoffs, and monitoring the system (NCHRP 2002).

¹ American Association of State Highway and Transportation Officials
FIGURE 1: Overview of Transportation Asset Management (FHWA²)
Source: Transportation Asset Management Guide, Vol. 1 (NCHRP/AASHTO 2002)

² Federal Highway Administration
FIGURE 2: Transportation Asset Management: Resource Allocation and Utilization Source: Transportation Asset Management Guide, Vol. 1 (NCHRP/AASHTO 2002)
2. METHODOLOGY
The study was conducted by reviewing the transportation and management performance measurement literature and surveying state DOTs to determine current and best practices in developing performance measures and targets/standards and in performance management. The researchers also undertook an internal review of GDOT's asset management tools and data, and facilitated and reported on an asset management peer exchange that included the Utah, Georgia and Indiana DOTs. Additionally, a literature review was conducted on risk applications in TAM; data were obtained from the National Bridge Inventory; and multiple attribute decision making (MADM) methodology was applied to demonstrate the impact of including uncertainty in project prioritization, and how normalization of attributes and data disaggregation can affect final prioritization outcomes. Finally, a catalogue of performance management resources was developed to facilitate access to available resources supporting the development of performance management programs in agencies.
3. KEY MESSAGES AND FINDINGS
Performance management involves the successful application of performance data to manage agency performance toward achieving strategic goals consistently. NCHRP Report 666 (Transportation Performance Management: Insight from Practitioners) identifies three basic considerations that shape performance management implementation: customer needs and desires; engineering requirements and limitations; and fiscal limitations. Performance management is viewed as closely linked with strategic planning and reporting where strategic planning involves identifying what an agency hopes to achieve. Strategic planning is based on developing an agency vision or mission, identifying supporting goals and objectives, and developing initiatives and implementation strategies to achieve these objectives in agreed upon time frames. Performance management is the regular on-going process of selecting measures; setting targets and using measures in decision making; and reporting achievement, leading to the development of a culture of performance throughout the agency (CS/HS Consulting 2010).
3.1 Guidelines for Selecting Performance Measures and Targets
The literature review indicates that many states have committed to using performance measures, but that the degree to which performance measurement systems are developed varies widely among them. The literature highlighted the following guidelines for selecting performance measures and targets:
1. Performance measures should flow directly out of an agency's mission and objectives.
2. Performance measures should provide a balanced picture of an agency's business and utilize input, output, outcome and productivity or efficiency measures in an appropriate manner.
3. An effective performance measurement system will have a few, well-defined measures tied to a handful of clear goals to be achieved within specific time frames.
4. Performance measurement systems should be periodically evaluated in an iterative process.
5. Performance measures should use reliable and available data that the agency can collect without straining its resources.
6. Performance measurement reporting and communication should be clear and easy to understand.
7. Comparative performance measurement, also known as benchmarking, has been recognized as important among state DOTs.
8. Customer satisfaction, environmental quality and sustainability are increasingly important outcome measures.
9. Performance targets should be set in relation to achieving the agency's strategic goals, considering policy guidance and public input, funding availability, benefits, costs, risks and tradeoffs (or opportunity costs of setting various targets). Scenario analysis is a useful analytic tool when setting targets.
10. A growing number of agencies are using formal performance frameworks to select performance measures. Performance frameworks are structured processes that provide guidance for selecting performance measures, e.g., the Balanced Scorecard Framework.
Internationally, various transportation agencies are using performance measurement for a range of functions. A 2004 international scan tour of performance measurement systems in Australia, Canada, New Zealand and Japan showed that performance measures were used more extensively in those countries than in the U.S. (FHWA 2004). These systems often emphasized safety; included output and outcome measures, including environmental and customer satisfaction indicators; integrated data collection; used before-and-after studies and benchmarks; and considered multimodal investment tradeoffs. Successful programs directly used performance measurement to influence programming discussions and budget allocation. The scan recommended in particular that safety and benchmarking be emphasized more by the FHWA. Furthermore, the scan suggested that the U.S. generate research, training, conference meetings, technical guidance and sustainability actions drawing on these international examples.
The scan sheds light on some important points about performance measurement and target setting in other countries (FHWA 2004):
1. "A limited number of high-level national transportation policy goals linked to a clear set of measures and targets are used;
2. Intergovernmental agreements on how state, regions and local agencies will achieve the national goals are negotiated while translating them into local context and priorities; and
3. The real value of performance measurement is in the development of an improved decision-making and investment process, not in the achievement of many arbitrary short-term targets."
3.2 Best Practices: Performance Measurement in State DOTs
3.2-1 Survey Results
A survey conducted from September 2009 to February 2010 explored the use of performance measures in the 50 state DOTs. Thirty-nine of the 50 state DOTs (78%) responded to the survey. A majority of state DOTs reported that they have linked performance measurement with strategic planning, and are using performance measures and targets in planning and management. The key findings of the survey are given below (Pei et al., 2010a):
1. Over 90% (36 out of 39 respondents) of the responding state DOTs reported having a strategic plan in place. Most of the responding agencies reported that they update their plans annually or bi-annually.
2. DOTs reported that strategic objectives are largely related to transportation system safety, system preservation and mobility. Agencies also reported to a lesser extent that employee and organizational development, customer satisfaction, economic growth and vitality and environmental quality are included in strategic objectives.
3. More than half of the responding DOTs (23) reported having performance measures tied to strategic goals and objectives.
4. About 31% (12) of the responding DOTs reported that they review their measures annually.
5. About 70% (28) of the agencies reported that performance measures are mostly used in management and planning, and not in all DOT functions. About half (21) reported using performance measures in operations and slightly less than half (18) in design/engineering.
6. Over 75% (30) of the responding DOTs reported that they use performance measures to engage stakeholders.
7. About 80% (31) of the responding DOTs reported that they set performance targets, developed largely by upper management and program managers, and also by benchmarking and consensus, considering funding levels and stakeholder input.
8. About 80% (31) of the responding agencies reported that top management reviews performance information.
9. About 70% (27) of the responding agencies reported that they have an asset management program in place with most programs used to monitor the condition of highways and bridges.
3.2-2 A Generational Model for Performance Management
"Performance measurement is an evolving practice. All state DOTs have used some aspect of performance measurement for analyzing system uses and conditions as part of the engineering and planning disciplines. Yet, the business process improvement and accountability aspects of the performance measurement field have only emerged in the transportation industry in the past decade." -Baird and Stammer 2000 as quoted in Bremmer et al. 2000
Performance measurement, like asset management, is an evolving practice. Bremmer et al. (2004) present a Generational Model of Performance Management to depict this evolution. Citing Baird and Stammer (2000) and Poister (2004), Bremmer et al. emphasize that while all state DOTs have used some aspect of performance measurement for analyzing system uses and conditions as part of engineering and planning, the business process improvement and accountability aspects of the performance measurement field are more recent in the transportation industry, having emerged in the 1990s. They recognize that DOTs can be vastly different from one state to another, managing transportation systems that vary in complexity and scope in distinctive political and economic environments. As the concept of measurement has expanded, states have tried to follow suit, some making the leap to track organizational performance in order to improve business processes or to demonstrate accountability. Some have taken the step of integrating measures into strategic frameworks aimed at focusing the organization on a few key outcomes. Such agencies often focus on the newer generation of performance measures, described as more outcome-oriented, more integrated with strategic goals and objectives, and more focused on quality and customer service than the input and output measures of the past. Table 1 summarizes the three types or generations of agencies with respect to performance measurement and management.
Various pressures drive change that can influence how agencies organize their performance measures and management procedures. These include leadership changes at the top of the state DOT or the state (e.g., PennDOT, Caltrans); new funding or a legislature's view that a state DOT requires more oversight (e.g., MoDOT, MnDOT, VDOT, WSDOT); external mandates for benchmarks and performance reporting (Maryland State DOT, Oregon DOT, WSDOT); and performance audits and reviews of state DOTs (over 30 states) (Bremmer et al. 2004).
TABLE 1: Generational Model of Performance Management

First-Generation Agency: Traditional Infrastructure and Organizational Measurement (Examples in 2004: Alaska DOT, Arizona DOT, Delaware DOT)
- Develops measures in response to internal Total Quality Management initiatives or specific legislative requirements.
- At the same time, may already have robust, established measurements in traditional system planning and program areas, such as preservation.
- "Standard measures" track basic system performance and organizational process improvement; useful for specific program and project decision-making purposes but not meaningfully linked to other agency processes.
- Usually lacks a strategic measurement framework; only starting to use performance measures to define progress in meeting long-range plan or shorter-range plan goals.

Second-Generation Agency: Hierarchy of Measurement (Examples in 2004: Florida DOT, Missouri DOT, Maryland DOT)
- Generally has a proliferation of measures as part of a framework or hierarchy for measuring the agency's performance.
- Measures are usually based on a traditional planning framework and are often long-range measurements that link to mid-range strategic and/or short-range business plans.
- Agency ties measurement areas together in a strategic orientation used by leadership and managers to track business functions and planning goals.
- Measurement areas eventually expand to include difficult-to-measure higher-level outcomes, societal goals and customer expectations.
- As practices evolve, measurement systems can grow increasingly complex, making results difficult to communicate.
- There may be a well-developed public reporting tool that communicates the results of the measurement scheme to meet legislative, public or agency needs.

Third-Generation Agency: Catalyst-Driven Adaptation (Examples in 2004: Minnesota DOT, Ohio DOT, Washington State DOT)
- Agency can respond to change catalysts (e.g., new agency administrations, governmental changes such as a new governor, funding crises or increases, new state or federal requirements) and retool and adapt an established system in response to changing agency priorities and external pressures.
- Agency has the ability to proactively use performance measures to set its agenda and more effectively communicate its needs.
- Agency is at the forefront of using dynamic approaches that provide real-time information responsive to the needs of agency leadership and the state's political context, and places high value on public accountability.
- Agency recognizes the complexity created within the traditional planning framework and explores alternative ways to measure and communicate performance.
- Agency's performance measurement system has a narrower focus than earlier, less strategic systems: while continuing to seek viable indicators for broader societal planning goals and outcomes, the agency tends to focus on building effective measurement systems and communication tools centered on agency responsibilities and investment decision needs.
Source: Bremmer, Cotton and Hamilton (2004)
3.2-3 Performance Communication to Multiple Stakeholders

Performance measurement and reporting occurs for multiple functions and at multiple levels for an agency's internal and external stakeholders. Figure 3 depicts the information reporting hierarchy at VicRoads (in Victoria, Australia). Agencies that evaluate the comprehensiveness of their performance measurement activities and the quality and effectiveness of their performance reports continue to refine those reports to be more effective communication tools for their various stakeholders.
FIGURE 3: Information Reporting Hierarchy at VicRoads, Victoria, Australia
Source: Transportation Asset Management in Australia, Canada, England and New Zealand (FHWA 2005)
3.3 Evolution of Asset Management at Georgia Department of Transportation (From September 2009 to the Present)
"We get lots of projects done. We spend a lot of money. But we are not sure we are getting the best value on the dollar." - State DOT Upper-level Manager, Utah/Indiana/Georgia TAM Peer Exchange, August 2009
Asset management is a business process that can be used to improve the value of assets per dollars expended. In the course of this study, the Transportation Asset Management (TAM) program at GDOT evolved significantly from its initiation in 2009. At the onset of the project, an internal review was conducted of the status of TAM in the agency; the Project Investigators provided documentation support for a TAM Peer Exchange organized for GDOT, Utah DOT and Indiana DOT in September 2009, and developed an inventory of TAM tools and data at GDOT. The following sections summarize key messages and findings from these activities.
3.3.1 TAM Internal Review
In an internal review conducted on the status of asset management at GDOT, several GDOT officials felt that the agency has a very good asset management program; however, most asset management activities were considered office-specific, in the sense that each office has good data and uses those data to prioritize needs. For example, the GDOT pavement management system (PMS) was used to prioritize pavement projects that were part of the economic stimulus package, and crash statistics were used together with PMS and bridge management system (BMS) information to prioritize projects. GDOT officials emphasized the importance of taking a ROW-to-ROW (i.e., right-of-way line to right-of-way line) asset management perspective and were interested in obtaining a 100% database (rather than a sample database) for all assets being managed. There was interest in knowing what other states were doing in asset management, but also a feeling that what works in one state would not necessarily work in another.
3.3.2 TAM Peer Exchange
In September 2009, FHWA organized a scan/peer exchange on TAM for the Utah, Indiana and Georgia DOTs. The report, Asset Management Peer Exchange: Utah/Indiana/Georgia, summarizes the key findings and recommendations of the peer exchange (Amekudzi, 2009). The report's recommendations with respect to important program steps for asset management were:

1. Conduct a self-assessment exercise; and
2. Develop an Asset Management Implementation Plan, which would involve: (a) streamlining strategic goals; (b) developing performance measures that align strategic goals with work at all levels of the agency; (c) developing analytical procedures for the bridge database; (d) integrating data; and (e) integrating analysis tools.
Additional details can be found in the Peer Exchange report included in Appendix 5(b).
3.3.3 Development of TAM Program at GDOT
Following the internal review and peer exchange, GDOT took definitive steps to advance asset management as a core business process in the agency. GDOT formally adopted Transportation Asset Management in 2009 to optimize infrastructure investment by applying program resource allocation and asset preservation techniques. Subsequently, TAM has been adopted as a core business process intended to serve as the basis for decision making throughout the agency (GDOT, 2010).
Formal strategic planning in GDOT, begun in 1994, is now used as a management tool in setting agency direction, identifying specific initiatives and facilitating employee teamwork to implement initiatives and projects that are necessary to achieve organizational improvements toward strategic agency goals. The Department
implemented a strategic management process in 2003. Four strategic goals were identified (GDOT, 2010):
1. PEOPLE: Making GDOT a better place will make GDOT a place that works better
2. SAFETY: Making safety investments and improvements where the public is most at risk
3. MAINTENANCE: Taking care of what we have in the most efficient way possible
4. CAPACITY: Planning and constructing the best set of mobility-focused projects we can, on schedule.
On-going work on performance management includes formally tightening the linkage between performance measurement and decision making to achieve strategic objectives.
3.3.4 GDOT TAM Inventory of Tools and Data
A review was conducted of the TAM analysis tools and data being used in GDOT. Table 2 summarizes some of the main TAM tools and data used in the agency (O'Har, Amekudzi and Meyer, 2009). Other tools and data can be found in Appendix 4.
TABLE 2: Summary of GDOT TAM Tools and Data

Tool (#1): Highway Maintenance Management System (HMMS)
Objective: Allows GDOT to track the daily work of maintenance crews throughout the state; assimilate outstanding work on roads from inspections; develop a work program for tracking equipment costs, labor costs and material costs (input measures)
Data: Biannual drainage reports, condition assessment of pipe, location of signs and pipes (coordinate info), and data from inspections (guardrail, pavement, vegetation, etc.; no coordinate info)
Units using Tool: Maintenance managers throughout the area and district maintenance offices
Use of Results: To develop an annual needs-based budget and an annual work program; determine the condition of pipe systems; compare actual and estimated costs with budget office costs

Tool (#2): Pavement Condition Evaluation System (PACES)
Objective: A pavement condition assessment survey that rates every mile of every road each year
Data: Condition evaluations of roadway (asphalt and concrete)
Units using Tool: Area and district maintenance offices; Office of Materials and Research; data output from this tool feeds into the Georgia Pavement Management System (GPAMS)
Use of Results: To determine the overall condition of roadway; determine what work needs to be done (e.g., crack sealing, resurfacing); predict the future condition of roadway (i.e., LOS of roadway) with available funds; determine the cost of work that needs to be done

Tool (#3): Pipe Inventory
Objective: A module of the HMMS; provides a condition assessment of pipe
Data: Data from physical inspections of pipe tracked with a coordinate system
Units using Tool: Area and district maintenance offices
Use of Results: To determine what work needs to be done on each line of pipe

Tool (#4): Bridge Information Management System (BIMS)
Objective: Collects input data from bridge inspections; allows the Department to retrieve certain information without going through paperwork; separate from the Federally required National Bridge Inventory (NBI); collects more data than the Federal government requires
Units using Tool: Bridge Maintenance unit, Office of Transportation Data, upper management (for planning)
Use of Results: Federal reporting requirements for the NBI; generating deficiency reports; input data for HMMS; determining necessary repairs; routing (vertical clearance and load requirements for oversize/overweight loads); budgeting and funding decisions

Tool (#5): Life Cycle Cost Analysis (LCCA) Tool
Objective: Gives a comparison of life-cycle costs for different pavement types
Data: Quantities of materials, length of a project, unit costs, maintenance costs, time frames
Units using Tool: Pavement Management
Use of Results: Making decisions on pavement type; deciding between construction and rehabilitation

Tool (#6): Highway Performance Monitoring System (HPMS)
Objective: Mandated by the FHWA to provide the Department's road inventory data; sample-based system consisting of 98 data items; provides a variety of data (roughness, AADT, etc.)
Units using Tool: Not used much within GDOT; the Department has its own road inventory database
Use of Results: Used by the Federal Government in allocating funds; other data items from this tool are used within the Department

Tool (#7): Benefit/Cost Tool (B/C)
Objective: Used in the project prioritization process; contributes to a project score
Data: Overall cost of a project (design, construction, etc.); benefits (time savings through a corridor, fuel cost); safety benefits; dollar values based on national average values (commercial versus non-commercial)
Units using Tool: Planning office; Design office; and Traffic Operations
Use of Results: A piece of the decision-making process; not everything is based on the B/C ratio
4. UNCERTAINTY AND RISK IN TRANSPORTATION ASSET MANAGEMENT
4.1 Risk and Transportation Asset Management
All of the agencies examined in the FHWA/AASHTO international scan tour on Asset Management in 2005 practiced some degree of risk assessment in selected areas of their TAM programs. Furthermore, all the agencies used the concept of risk to establish investment priorities (FHWA 2005). As TAM systems are already in place in many state transportation agencies, particularly in larger agencies, they can be used as appropriate platforms to incorporate uncertainty and risk in decision making. In a 2006 scan on TAM conducted in the U.S., there was little evidence of risk being used in asset management (CS and Meyer 2007). A number of the agencies that have applied risk assessment methods have done so by conducting scenario analysis. Typically, different scenarios are defined based on different levels of funding. These scenarios then predict pavement, bridge and other asset condition ratings at various levels of funding.
4.2 Uncertainty and Risk
One of the most common uses of the term "risk" applied to transportation infrastructure refers to the risk of catastrophic or non-catastrophic failure. Non-catastrophic failure can also be referred to as performance failure, i.e., the failure of a facility or system to perform as intended; defining performance failure requires the selection of minimum levels of service (LOS). Risk in this context generally refers to the chance that a negative event occurs (e.g., bridge failure) combined with the severity of the consequences of that event, also known as technical risk (Haimes 2004; Piyatrapoomi et al. 2004).
Uncertainty is an inherent part of the decision-making process when choices are made based on incomplete knowledge, when there are sources of error, or when there is inherent randomness in the system or facility under consideration (Piyatrapoomi et al. 2004; Helton and Burmaster 1996). Decision makers often do not have complete knowledge of every facet of a decision, and some level of uncertainty is present in nearly all decision making. Uncertainty arising from incomplete knowledge is generally termed subjective uncertainty and is reducible; this is in contrast with objective uncertainty arising from the randomness of systems, which is irreducible (Winkler 1996).
While it is impossible to eliminate uncertainty from infrastructure asset management (Haimes 2004), uncertainty can be modeled to improve the quality of decision making. Sources of error for infrastructure assets include data errors, forecasting errors, and modeling errors. Data errors are due to measurement error and simple human error; they can be quantified through statistical techniques and reduced by collecting more complete historical data. Model errors result from the difference between observed or real-world values and model values. Forecasting errors relate to the uncertainty associated with future events. Various studies have shown that forecasting errors are much more significant than model and data errors (Amekudzi and McNeil, 2000; AbouRizk and Siu, 2008). There are limits on the ability to decrease forecasting errors, since future events cannot easily be predicted accurately; however, simulations can be applied to incorporate forecasting uncertainties in models (Amekudzi and McNeil, 2000).
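As an illustration of that last point, the following is a minimal sketch of how Monte Carlo simulation might propagate forecasting uncertainty (here, an uncertain traffic growth rate) into a condition forecast. The deterioration model, its coefficients, the 0-100 rating scale, and the distribution parameters are illustrative assumptions, not values from this study or from GDOT's systems.

```python
import random

def simulate_condition(current_rating, growth_mean, growth_sd, years, trials=10_000):
    """Monte Carlo sketch: propagate uncertain traffic growth into a
    condition forecast. The linear deterioration model and its
    coefficients are illustrative placeholders."""
    outcomes = []
    for _ in range(trials):
        rating = current_rating
        for _ in range(years):
            # Sample one year of traffic growth from an assumed normal distribution.
            growth = random.gauss(growth_mean, growth_sd)
            # Assumed deterioration: base wear plus a traffic-sensitive term.
            rating = max(rating - (2.0 + 10.0 * max(growth, 0.0)), 0.0)
        outcomes.append(rating)
    outcomes.sort()
    mean = sum(outcomes) / trials
    p05 = outcomes[int(0.05 * trials)]  # 5th percentile (pessimistic case)
    return mean, p05

mean, p05 = simulate_condition(current_rating=90.0, growth_mean=0.02,
                               growth_sd=0.01, years=10)
print(f"mean 10-year rating: {mean:.1f}; 5th percentile: {p05:.1f}")
```

Rather than a single predicted rating, the simulation yields a distribution, so an agency can plan against a pessimistic percentile instead of the mean alone.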
4.3 Risk Assessment and Risk Management
Risks are often dealt with through risk assessment and risk management activities. The risk assessment and management process is aimed at answering specific questions in order to make better decisions under uncertain conditions. Risk assessment refers to the scientific process of measuring risks in a quantitative and empirical manner and usually precedes risk management. Risk management is a qualitative process that involves judging the acceptability of risks within any applicable legal, political, social, economic, environmental and engineering norms and implementing measures to reduce them to acceptable levels (Haimes 2004; Piyatrapoomi et al. 2004).
In the management of technological systems, the failure of a system can be caused by the failure of the hardware, the software, the organization, or the humans involved. The initiating events may also be natural occurrences, acts of terrorism or other incidents. In risk assessment, the analyst often attempts to answer the following set of questions (Kaplan and Garrick 1981; Haimes 2009):
What can go wrong?
What is the likelihood that it will go wrong?
What are the consequences (and what is the time domain)?

Answers to these questions help risk analysts identify, measure, quantify and evaluate risks and their consequences and impacts.
Risk management builds on risk assessment by seeking answers to a second set of questions (Haimes 1991):
What can be done and what options are available?
What are the associated tradeoffs in terms of all costs, benefits and risks?
What are the impacts of current management decisions on future options?
4.4 Risk Attitudes
The last question (in the risk management trio above) is the most critical one for any managerial decision making. It involves defining the agency's risk tolerance (i.e., the level of exposure and nature of risks that are acceptable). Decision makers must determine acceptable levels of risk, and this acceptable level is often influenced by public perceptions of risk. Society perceives various risks at different levels. For example, the risk of a traffic accident is far greater than the risk of bridge failure (judging from actual statistics), but in general, communities are more willing to tolerate the risk of a traffic accident than that of a bridge failure (Aktan and Moon 2009). In other words, communities will generally be willing to pay more to reduce the risk of catastrophic bridge failure than to improve roadway traffic safety, even though the risk of a roadway fatality is much higher than that of a fatality from bridge failure. Risk attitudes influence how an agency determines investment priorities.
4.5 Examples of Risk Applications in TAM
To date, risk applications in TAM can be found in the prediction of facility performance and prioritization of projects, programs or plans for investment. A number of risk examples and applications are presented below to illustrate the nature of risk applications in TAM.
4.5.1 U.S. Federal Highway Administration - Risk Identification for Coastal Roadways

An FHWA hydraulic engineering circular highlights the fact that 60,000 miles of highways nationwide lie within the Federal Emergency Management Agency's (FEMA) 100-year floodplain (FHWA, 2008). The circular also points out that more than 1,000 bridges may be vulnerable to failure modes associated with recent coastal storms, such as Hurricane Katrina. Potential risks such as water level change, storm surge, shoreline erosion, shoreline recession, tsunamis, and upland runoff are presented in the guidance for the planning, design and operation of highways in the coastal environment. Identifying such risks is the first step in risk assessment and management; subsequent steps involve quantifying the risks and developing actions to reduce them to acceptable levels. The failure of roadways and bridges in the Gulf Coast area during Hurricane Katrina would be considered catastrophic by most. In anticipation of future storms and sea level rise, several bridges in the Gulf Coast area have already been reconstructed at higher elevations (Meyer, 2008).

4.5.2 Department for Transport (England) - Risk Matrix for Projects
For the Department for Transport (DfT), England's transportation agency, project prioritization includes identifying and managing risks associated with the road network.
The DfT has developed a risk matrix that assigns each project a score relating to the probability of failure associated with a specific component. The higher the likelihood of failure, the greater the attention the project receives in the investment program. The likelihood of a risk event is calculated as follows:

L(Risk Event) = L(Cause) * L(Defect) * L(Exposure) * L(Effect) (Equation 1)

where L stands for likelihood. Table 3 shows the agency's values for calculating the likelihood of risk events.
TABLE 3: Values for Calculating Likelihood of Risk Events (England DfT)

Likelihood Rating | Description              | Range of Likelihood Values | Midpoint Values
Certain           | Certainty                | 1.0                        | -
High              | Highly likely            | 0.7-0.99                   | 0.85
Medium            | Likely                   | 0.3-0.69                   | 0.50
Low               | Possible, but not likely | 0.0-0.29                   | 0.15
As an example, suppose that for a particular project, agency officials have determined that the likelihood of the cause of failure occurring is high (0.85), there is medium likelihood of the defect occurring (0.50), a low likelihood of exposure (0.15) and a high likelihood of the effect occurring (0.85). The risk associated with the project is estimated as follows:
L(Risk Event) = 0.85*0.50*0.15*0.85 = 0.054 or 5.4%
Similar assessments are made of all the projects being considered, and the projects are ranked according to the level of risk associated with each. This type of analysis can be conducted to identify the projects that pose the highest risk and allocate funds to solve the most serious problems (FHWA 2005).
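A short sketch of this calculation is given below: it applies Equation 1 using the midpoint likelihood values from Table 3 and ranks a set of projects by the resulting likelihood of a risk event. The project names and their ratings are hypothetical; the DfT's actual process assigns the ratings through engineering judgment.

```python
# Midpoint likelihood values from Table 3 (England DfT).
MIDPOINT = {"certain": 1.0, "high": 0.85, "medium": 0.50, "low": 0.15}

def risk_event_likelihood(cause, defect, exposure, effect):
    """Equation 1: L(Risk Event) = L(Cause) * L(Defect) * L(Exposure) * L(Effect)."""
    return (MIDPOINT[cause] * MIDPOINT[defect] *
            MIDPOINT[exposure] * MIDPOINT[effect])

# Hypothetical projects rated by agency officials; names are illustrative.
projects = {
    "Bridge joint replacement": ("high", "medium", "low", "high"),
    "Culvert relining":         ("medium", "medium", "medium", "low"),
    "Retaining wall repair":    ("high", "high", "low", "medium"),
}

# Rank projects from highest to lowest likelihood of a risk event.
ranked = sorted(projects.items(),
                key=lambda kv: risk_event_likelihood(*kv[1]), reverse=True)
for name, ratings in ranked:
    print(f"{name}: {risk_event_likelihood(*ratings):.3f}")
```

The first entry reproduces the worked example in the text: 0.85 * 0.50 * 0.15 * 0.85 = 0.054, or 5.4%.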
4.5.3 Risk Analysis for Bridge Prioritization - Queensland, Australia

Queensland has developed a program called Whichbridge that assigns a numerical score to each bridge based on the risks attached to the condition of the bridge. Factors considered in this assessment include the condition of the bridge components, environmental impacts, component materials, currency of inspection data, obsolete design standards, and traffic volumes. System reports use a relative (rather than absolute) ranking to rank structures based on risk exposure and safety considerations. The risk is determined as the product of the probability of failure and the consequence of failure. Consequence is used as a surrogate for the costs of failure, which relate to such things as human factors, environmental factors, traffic access, economic significance and industry access consequences (FHWA 2005).
23
Best Practices in Selecting Performance Measures and Standards for Effective Asset Management
FINAL REPORT
4.5.4 Risk Analysis for Asset Prioritization - Edmonton, Canada
The City of Edmonton, Canada, has developed a risk-based approach for bridging their infrastructure gap (AbouRizk and Siu 2008; FHWA 2005). Their approach uses the traditional technical definition of risk as previously defined. Data such as the asset replacement value, age, dimensions, quantity and condition are collected. The condition rating system used is the ordinal scale of the ASCE Infrastructure Report Card, where A is very good, B is good, C is fair, D is poor and F is very poor. The alphabetical grades are converted to a numerical ordinal rating from 1 (F) to 5 (A), with 5 being the best. Using this system, the expected failure of an asset is determined by multiplying the probability of failure of the asset in a particular condition by the number of elements in that condition, and summing the expected failures across condition states, as shown in Equation 2 below:
E(L) = E(LA) + E(LB) + E(LC) + E(LD) + E(LF) (Equation 2)
where: E(Lj) = Probability(asset failing while in condition j) x (# of elements in condition j)
Determining the impact of asset failure will vary depending on what risk factors an agency considers to have more impact. The City of Edmonton uses five areas to measure impact of failure and assigns the following weights (in parenthesis) to each area: safety and public health (33%), growth (11%), environment (20%), monetary value required to replace an infrastructure element (20%), and services to people (16%). The level of importance assigned to various types of impacts relates to the values of the communities that an agency serves.
Once the expected failure of an asset and the impact of failure are determined, the risk severity can be calculated as the product of the two values. The City of Edmonton defines risk severity zones as shown in Table 4. Classification of the assets into various risk severity zones provides information for allocating resources to manage the prevailing risks most cost effectively.
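The sketch below illustrates the Edmonton-style calculation: Equation 2 for expected failure, a weighted impact score using the published impact-area weights, and risk severity as the product of the two. The element counts, per-condition failure probabilities, and impact ratings are illustrative assumptions; Edmonton derives its failure probabilities from deterioration curves rather than the fixed values used here.

```python
# Impact-area weights used by the City of Edmonton (FHWA 2005).
IMPACT_WEIGHTS = {"safety_public_health": 0.33, "growth": 0.11,
                  "environment": 0.20, "replacement_value": 0.20,
                  "services_to_people": 0.16}

def expected_failure(elements_by_condition, p_fail_by_condition):
    """Equation 2: E(L) = sum over condition states j of
    P(asset fails while in condition j) * (# of elements in condition j)."""
    return sum(p_fail_by_condition[j] * n
               for j, n in elements_by_condition.items())

def risk_severity(elements_by_condition, p_fail_by_condition, impact_scores):
    """Risk severity = expected failure x weighted impact of failure.
    impact_scores holds 0-1 ratings per impact area (illustrative)."""
    impact = sum(IMPACT_WEIGHTS[a] * s for a, s in impact_scores.items())
    return expected_failure(elements_by_condition, p_fail_by_condition) * impact

# Illustrative inputs: element counts per ASCE letter grade (A best ... F worst)
# and assumed per-condition failure probabilities (not Edmonton's actual curves).
elements = {"A": 40, "B": 25, "C": 20, "D": 10, "F": 5}
p_fail = {"A": 0.01, "B": 0.03, "C": 0.08, "D": 0.20, "F": 0.50}
impacts = {"safety_public_health": 0.9, "growth": 0.3, "environment": 0.5,
           "replacement_value": 0.6, "services_to_people": 0.7}

print(f"expected failures: {expected_failure(elements, p_fail):.2f}")
print(f"risk severity:     {risk_severity(elements, p_fail, impacts):.2f}")
```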
TABLE 4: Sample Risk Severity Zones (Edmonton, Canada)

Acute: An acute level of severity is one in which both the expected failure and the impact of each unit of failure are intolerably high. At this level, there is the potential for loss of life if an asset fails, combined with a high likelihood that an asset element will fail.

Critical: If the asset is deemed to be at a critical level of risk, then either the expected failure will be high and the impact substantial, or the impact of an asset's failure will be devastating and the probability of failure still moderate.

Serious: Assets with a serious level of risk may have severe or substantial levels of impact; however, these tend to be combined with a low level of expected failure. As such, assets at this level of risk will require attention, yet their needs do not necessarily require immediate rehabilitation or repair.

Important: An asset considered to be at an important level of risk corresponds to a situation where the levels of expected failure and impact can be addressed in keeping with a municipality's strategic approach. An important level of risk has been anticipated for most elements.

Acceptable: The acceptable level of risk represents a situation in which the combined expected failure and level of impact are manageable.
The City of Edmonton has also applied risk analysis to develop a risk severity/replacement value chart that shows the relative risks and costs of different assets. The risk analysis segments the infrastructure assets into logical groupings based on common characteristics. For each segment (e.g., 1 km of road), data are collected describing the inventory, state and condition of the asset, along with rehabilitation estimates. The asset condition is categorized using Edmonton's standardized rating system, and conditions are assessed by reviewing the assets within a given department through a combination of workshops and independent analysis. Failure is assumed to occur in two ways, either suddenly and unexpectedly (i.e., catastrophic failure) or gradually and expectedly (i.e., performance failure). The approach uses 155 deterioration curves and probabilities to determine expected failure. Risk severity values are plotted against replacement values (Figure 4). Assets found in the upper right quadrant (i.e., high risk severity, high replacement value) are considered to be of greater priority.
25
Best Practices in Selecting Performance Measures and Standards for Effective Asset Management
FINAL REPORT
FIGURE 4: Risk Severity vs. Replacement Value Chart (Edmonton, Canada)
Source: (FHWA 2005)
4.5.5 Risk Analysis for Bridges - Prioritizing Bridge Investments and Inspections

In light of the collapse of the I-35W Bridge in Minneapolis, Minnesota, there has been growing interest in incorporating risk into transportation asset management systems as they relate to bridge management. Cambridge Systematics, in collaboration with Lloyd's Register, a firm that specializes in risk management in the marine, oil, gas, and transportation sectors, has developed a highway bridge risk model covering 472,350 U.S. highway bridges, based on NBI data. The model uses Lloyd's Register's Knowledge Based Asset Integrity (KBAI™) methodology, implemented on Lloyd's Register's asset management platform, Arivu™ (Maconochie et al. 2010). This application defines risk as the product of the chance of failure and the consequence of failure. However, failure is defined not as catastrophic failure but as performance failure: a bridge service interruption, which may be caused by emergency maintenance or repair, or by some form of bridge use restriction. The model predicts the mean time until a service interruption. A so-called highway bridge risk universe, as shown in Figure 5, can be visualized using the Arivu™ platform (Maconochie et al. 2010).
26
Best Practices in Selecting Performance Measures and Standards for Effective Asset Management
FINAL REPORT
FIGURE 5: Highway Bridge Risk Universe
Source: (Maconochie et al. 2010)
Probability of service interruption is calculated based on three risk units: deck, superstructure, and substructure. The probability that each one of these units would cause a service interruption is calculated. These probabilities are then added together to determine the overall probability that a bridge will experience a service interruption in the next year. Consequences of service interruption are determined using a number of bridge characteristics, such as ADT, percentage of trucks, detour distance, public perception, and facility served, that indicate the relative importance of the bridge to the network and users of the system. The consequence of service interruption is dimensionless, which allows the user to define the characteristics used to determine the relative importance of the bridge (Maconochie et al., 2010).
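A minimal sketch of this risk calculation follows. It adds the three unit probabilities as the text describes (a reasonable approximation when each probability is small) and multiplies by a dimensionless consequence score; all input values are illustrative, not outputs of the KBAI model.

```python
def service_interruption_probability(p_deck, p_super, p_sub):
    """Combine per-unit probabilities of causing a service interruption.
    The text describes adding the three probabilities, which approximates
    the exact combination 1 - (1-p_deck)*(1-p_super)*(1-p_sub) when each
    probability is small."""
    return p_deck + p_super + p_sub

def interruption_risk(p_units, consequence):
    """Risk = probability of service interruption x dimensionless consequence
    (the consequence score would be derived from ADT, percentage of trucks,
    detour distance, public perception, facility served, etc.)."""
    return service_interruption_probability(*p_units) * consequence

# Illustrative values only; real inputs would come from NBI-based models.
risk = interruption_risk((0.02, 0.01, 0.015), consequence=7.5)
print(f"annual interruption risk score: {risk:.3f}")
```

Because the consequence score is dimensionless, the resulting risk values are meaningful only for ranking bridges against one another, which is exactly how the model is applied.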
This model has a variety of potential applications. It can be used to prioritize bridge investments to minimize risk, and to prioritize bridge inspections.
4.6 Scenario/Risk Analysis: Applying MADM Methods to Prioritize Georgia Bridges

Using NBI data and the GDOT bridge prioritization formula, Multiple Attribute Decision Making (MADM) methods were applied to demonstrate the following:
(1) The importance of normalizing bridge (or other asset) attribute scores before summing and ranking;
(2) The potential impact of disaggregated data on bridge (or other asset) prioritization outcomes; and
(3) The potential impact of performance risk on bridge (or other asset) prioritization outcomes.
GDOT has a working bridge prioritization formula to allocate investment dollars. The formula has multiple criteria taking into consideration a range of factors of bridge condition and performance (Table 2). Each bridge is assigned an overall score based on the formula. GDOT's Bridge Information Management System (BIMS) contains data elements for each state or locally owned bridge in Georgia. The data elements used in the bridge prioritization formula are identical to (or based on) data elements from the NBI. The general form of GDOT's bridge prioritization formula is:
(Equation 3)
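The formula itself is documented in GDOT's Bridge Prioritization Formula (2010). As a minimal sketch of its general form, assuming (consistent with the parameter descriptions in Table 5 and the point-summing procedure described below) that the functional-classification weighting factor multiplies the sum of the point values of the remaining variables:

Score = Factor × (HS + ADT + BYPASS + BRCOND + TimbSUB + TimSUP + TimbDECK + POST + TEMP + UND + FC + SC + HMOD + Narrow)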
Table 5 describes the decision criteria in the bridge prioritization equation. Each variable in the formula is assigned a number of points based upon predetermined criteria set by the Department. For example, the point values for ADT range from 0 to 35; bridges with ADT greater than 24,999 receive 35 points, those with ADT greater than 14,999 receive 27 points, etc. The extreme values of points for any factor indicate the best and worst values for that particular factor. The point values for each bridge are inserted into the prioritization formula to calculate an overall score.
TABLE 5: GDOT Bridge Prioritization Formula -- Parameter Descriptions/Values

Variable  | Description                                                                             | Point Values
HS        | Inventory Rating                                                                        | 0, 13, 25, 35
ADT       | Average Daily Traffic                                                                   | 1, 3, 6, 10, 15, 21, 27, 35
BYPASS    | Bypass/detour length (also accounts for posting, ADT, and % trucks)                    | 0, 10, 18, 25
BRCOND    | Bridge Condition based on condition of deck, superstructure, and substructure          | 0, 10, 15, 20, 25, 30, 35, 40
Factor    | Weighting Factor based upon functional classification, i.e., interstate, defense, NHS  | 1.0, 1.3, 1.5, 1.8
TimbSUB   | Timber Substructure                                                                     | 0, 2, 5 (state owned)
TimSUP    | Timber Superstructure                                                                   | 0 or 2
TimbDECK  | Timber Deck                                                                             | 0 or 2
POST      | Bridge Posting                                                                          | 0 to 5
TEMP      | Temporary Structure Designation                                                         | 0 or 2
UND       | Underclearance                                                                          | 0, 1, 2, 3, 4, 5, 6
FC        | Fracture Critical                                                                       | 0 or 15
SC        | Scour Critical                                                                          | 0, 1, 2, 3, 4, 5, 6
HMOD      | Inventory Rating less than 15 tons for HMOD truck                                       | 0 or 5
Narrow    | Based on number of travel lanes, shoulder width, length, and ADT                        | 0 or 30

(Source: GDOT Bridge Prioritization Formula, January 13, 2010)
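Each row of the table maps a raw attribute to points through threshold rules. A minimal sketch of that point-assignment step for ADT follows; the thresholds for 35 and 27 points come from the text above, while the remaining breakpoints are assumed for illustration.

ADT_POINTS = [  # (minimum ADT, points), checked from the highest threshold down
    (25000, 35), (15000, 27), (10000, 21), (5000, 15),
    (2500, 10), (1000, 6), (500, 3), (0, 1),
]

def adt_points(adt):
    for threshold, points in ADT_POINTS:
        if adt >= threshold:
            return points

print(adt_points(16000))  # -> 27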
Using data for seven selected bridges in Georgia, three scenarios were developed to examine the impacts of (a) normalization, (b) data disaggregation and (c) performance risk on the bridge prioritization outcomes.
The results of the study demonstrate that in bridge (or other asset) prioritization (ranking), it is important to normalize the values of the different decision criteria (e.g., ADT, bridge condition, bypass/detour length) prior to computing the aggregate value of the prioritization function, so that the aggregated values reflect the relative utilities of each decision criterion to the decision maker. Failing to normalize these values can produce misleading bridge prioritization outcomes.
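A minimal sketch of this normalization step, using hypothetical data and assumed weights rather than GDOT's actual values, is shown below.

def min_max(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

# (ADT, condition deficiency, detour miles); higher means greater need on each axis.
bridges = {"A": (45000, 2, 3), "B": (3000, 5, 15), "C": (12000, 4, 8)}

columns = list(zip(*bridges.values()))            # one tuple per criterion
normalized = [min_max(list(c)) for c in columns]  # each criterion scaled to [0, 1]
weights = [0.4, 0.4, 0.2]                         # assumed relative utilities

scores = {name: sum(w * col[i] for w, col in zip(weights, normalized))
          for i, name in enumerate(bridges)}
print(sorted(scores, key=scores.get, reverse=True))  # -> ['B', 'C', 'A']
# Summing the raw values instead would let ADT (in the tens of thousands)
# swamp the other criteria, ranking A first regardless of the weights.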
Secondly, the results show that disaggregating the bridge condition data into substructure, deck and superstructure data can result in a different ranking than when the data are aggregated, indicating the value of using more disaggregate data where available. With aggregated data, for example, a poor substructure condition can be averaged out by a very good superstructure condition, so the resulting ranking can fail to reflect the poor substructure condition.
Thirdly, the results demonstrate that including historic bridge data in the bridge prioritization formula can capture the performance risk of bridges and result in a change in bridge prioritization outcomes. The analysis results also show that performance risks will influence minimum standards for TAM.
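One simple way to fold performance risk into a condition score, sketched below with hypothetical ratings, is to penalize bridges whose historic condition data are volatile or declining; the risk weight k is an assumption.

from statistics import mean, stdev

# Hypothetical condition ratings over five inspection cycles.
history = {"A": [7, 7, 6, 5, 4],   # declining and volatile
           "B": [6, 6, 6, 6, 6]}   # stable

k = 1.0  # assumed weight on performance risk
risk_adjusted = {b: mean(r) - k * stdev(r) for b, r in history.items()}
print(risk_adjusted)
# A's mean rating (5.8) is close to B's (6.0), but its volatility drops its
# risk-adjusted score to about 4.5, changing the relative priority.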
This study recommends that bridge data be normalized before being aggregated into an overall score; that bridge condition data be disaggregated as far as the available data allow; and that bridge performance risk be captured in the prioritization by using historic bridge condition data where available. The results also show that failing to address performance risk in bridge (and other asset) prioritization may allow performance reductions in the overall system to go undetected. A full-scale analysis is available in Appendix 6.
5. THE PERFORMANCE RESOURCE CATALOGUE
A catalogue was developed to facilitate GDOT's access to performance management resources. The Transportation Performance Management Resource Catalogue organizes performance management resources under seven main headings and makes them readily available to agencies for use as they develop their performance management programs:
1. Strategic Planning
2. Performance Measures
3. Performance Targets
4. Funds Allocation and Programming
5. Organizational Structure
6. Data
7. Communicating with Stakeholders
The full catalogue is included in Appendix 7.
6. PIPELINE OF TRANSPORTATION PROFESSIONALS
This project supported three students in obtaining master's degrees in Civil Engineering (Transportation): Ms. Yi Lin Pei (currently employed at Cambridge Systematics, Atlanta); Mr. J. P. O'Har (currently in the Ph.D. Program in Transportation Systems at the Georgia Institute of Technology); and Ms. Jamie Fischer (currently in the Ph.D. Program in Transportation Systems at the Georgia Institute of Technology). Developing and presenting peer-reviewed research is a critical part of the graduate education of students supported by research through the Georgia Transportation Institute/University Transportation Center. Listed below are additional related conference presentations and peer-reviewed journal publications developed and delivered by these students during their master's programs.
1. Pei, Y. L., A. A. Amekudzi, M. D. Meyer, E. M. Barrella and C. L. Ross. Performance Measurement Frameworks and Development of Effective Sustainable Transport Strategies and Indicators. Transportation Research Record: Journal of the Transportation Research Board, No. 2136, Transportation Research Board of the National Academies, Washington, D.C., 2010, pp. 73-80.
2. Meyer, M., Amekudzi, A. and J. P. O'Har. Transportation Asset Management Systems and Climate Change: An Adaptive Systems Management Approach. Paper accepted for publication in the Journal of the Transportation Research Board, Washington, D.C.: National Academy Press, 2010.
3. O'Har, J. P. Risk-Oriented Decision Making Approaches in Transportation Asset Management. Sixth Annual Interuniversity Symposium on Infrastructure Management, University of Delaware, June 2010.
4. Fischer, J. M., A. A. Amekudzi, M. D. Meyer and A. Ingles. The Transportation Performance Management Resource Catalogue. Fourth International Transportation Systems Performance Measurement Conference, May 2011, Irvine, CA. (Poster Presentation)
5. O'Har, J. P., and A. A. Amekudzi. Effect of Uncertainty on Project Prioritization. Fourth International Transportation Systems Performance Measurement Conference, May 2011, Irvine, CA. (Poster Presentation)
7. CONCLUSIONS AND RECOMMENDATIONS
This study identifies factors and guidance for developing performance measures and targets for effective asset management. The study was conducted through a review of the transportation, performance management and performance measurement literature, a survey of state DOTs to determine the status of performance management, an evaluation of risk applications in TAM, and a scenario/risk analysis to contribute to the enhancement of GDOT bridge prioritization procedures.
The study finds that performance measurement is an evolving practice occurring widely among state DOTs, with different agencies at different levels of maturity in the process. As performance measurement has evolved, there has been a shift in focus from performance measurement to performance management, which entails using the data collected to make budget allocation decisions that result in the achievement of strategic goals. The study identifies three generational models of performance management: the traditional model, with several measures not necessarily integrated with any overarching strategic goals (Generation 1); streamlined outcome measures strategically selected to evaluate progress toward agency strategic goals (Generation 2); and increased adaptability to respond quickly to political and other external pressures, creating responsive performance measurement and management (Generation 3).
Over the period of this study, GDOT has refined its strategic goals into four clear goals and has taken steps to develop performance measures and metrics for evaluating progress toward those goals, assigning ownership of various measures to different agency officials, all characteristics of a second-generation agency. The following recommendations are made based on the study findings.
1. Conduct a review of GDOT's performance measurement and management process against current standards: Using the performance standards identified in this study, review GDOT's performance measurement and management process and procedures.
2. Benchmark against selected DOTs: Given that performance measurement and management in TAM is an evolving practice, benchmarking has been found to be a worthwhile activity for progressively refining agency performance measurement and management in TAM. Other second-generation agencies identified in 2004 (such as Florida DOT, Missouri DOT and Maryland DOT) are good candidates for benchmarking: GDOT can compare notes on what such agencies are considering as their next steps. Third-generation agencies (such as Minnesota DOT, Ohio DOT and Washington State DOT) are also good candidates: GDOT can compare notes on longer-range options, particularly options that add the flexibility to quickly adapt to or fold in new requirements. This capability will allow the agency to respond quickly to leadership, legislative, funding and other changes, anticipated and unanticipated. Utah DOT and Indiana DOT are also good candidates for
benchmarking: having participated in a peer exchange with GDOT in 2009, these agencies offer a ready basis for comparing progress made over the last two years.
3. Develop Metrics for Evaluating Progress toward Strategic Goals: Demonstrated progress toward strategic objectives is a critical element of a well-functioning performance measurement and management program in TAM. Appropriate metrics are important for measuring performance progress, and appropriate targets for managing progress within reasonable timeframes.
4. Link Performance Metrics with Resource Allocation Decision Making/Develop Capabilities for Evaluating Tradeoffs: These two actions are intrinsically linked: developing appropriate performance reports for resource allocation decision making entails having the capabilities to evaluate investment tradeoffs across different business units and asset classes so that agency strategic objectives are achieved more effectively. Using performance metrics to actually manage agency progress toward strategic objectives means linking metrics with the decision making that allocates resources across business units and assets, which in turn requires adequate capabilities for evaluating how investments in different asset categories advance the agency's various strategic objectives and goals.
5. Refine Metrics for Use in Broader Agency Functions: About 70% (28) of the responding agencies in the survey reported that performance measures are used mostly in management and planning, and not in all DOT functions. About half of the responding agencies (21) reported using performance measures in operations, and slightly less than half (18) in design/engineering. Evaluating the use of performance metrics across agency functions and developing appropriate reports for resource allocation decisions is a critical step toward linking performance metrics with decision making. An internal survey to understand the performance data needs and opportunities in planning, management, operations and design/engineering can assist in refining performance data for those needs. In addition, identifying the performance data needed to manage toward goals for the "people" objective will help the agency make progress in these areas. This will involve developing near-term and longer-term targets aligned with agency objectives, financial constraints, customer satisfaction data, etc.
6. Refine Performance Communication Tools: This recommendation speaks to the importance of improving public and internal communication. At least one third-generation agency (i.e., WSDOT) has reported that surveying external and internal stakeholders about transportation performance (including the general public, legislature and media) was critical in helping it improve performance communication. Quarterly reporting emerged in response to a credibility crisis with the legislature and media and the need to demonstrate accountability. Through quarterly reporting, WSDOT has demonstrated accountability and improved credibility with the legislature and media. This credibility gain contributed to the 2003 Transportation Funding Package, which raised the gas
tax and several fees to support an expanded highway and rail construction program as well as transit and demand management programs. A number of agencies, e.g., Missouri DOT and Virginia DOT, have adopted project delivery performance reporting systems, including project dashboards, quarterly report cards, etc. Bremmer et al. (2004) recommend proactive performance communication to prepare stakeholders for future initiatives on the horizon.
7. Address Uncertainties in Performance Management: Identify and assess uncertainties in existing TAM procedures and data, and develop appropriate procedures to incorporate these uncertainties in performance reporting. Unaddressed uncertainties in TAM procedures, e.g., in performance modeling and project prioritization, can affect the quality of decision support information from TAM systems, as demonstrated in this study (Appendix 6). The study showed that incorporating uncertainties in prioritization procedures can lead to notably different prioritization outcomes.
8. Performance Audits: Evaluate state audit requirements for DOTs and address gaps in existing performance management procedures to ensure readiness. State DOTs that use and publish performance measures are increasingly being subjected to performance audits. Information supplied by the National Association of State Auditors, Comptrollers and Treasurers (NASACT) suggests that efficiency and economy audit programs were being conducted in at least 30 states, as reported in Bremmer et al. (2004) and Raaum and Campbell (2006). In Georgia, the Department of Audits and Accounts (DOAA) evaluates state-funded programs and activities to answer such questions as: (i) Is this program achieving its goals and objectives? (Are there other ways to achieve this goal? Is this goal still relevant? How do other states achieve this goal or fulfill this need?) (ii) How well does this program do what it is intended to do? (How many are served? What does it cost per unit? How does Georgia compare with other states in this regard?) (iii) Is this program complying with all applicable laws and regulations? (Does this program meet all federal grant requirements? Is the program fulfilling its obligations as mandated by state law?) (Georgia DOAA website)
REFERENCES
1. AbouRizk, S.M. and Siu, K.L. Standardized Risk Analysis for Infrastructure Assessment. In A. Amekudzi and S. McNeil (eds.), Infrastructure Reporting and Asset Management: Best Practices and Opportunities, 2008, pp. 131-140.
2. Amekudzi, A. and McNeil, S. Capturing Data and Model Uncertainties in Highway Performance Estimation. Journal of Transportation Engineering, American Society of Civil Engineers, Vol. 126, 2000, pp. 455-463.
3. American Association of State Highway and Transportation Officials (AASHTO). AASHTO Subcommittee on Asset Management Strategic Plan 2011-2015. Federal Highway Administration (FHWA).
4. American Association of State Highway and Transportation Officials (AASHTO). Measuring Performance among State DOTs: Sharing Best Practices. Washington, D.C., 2006.
5. Asset Management Best Practices/Lessons Learned Utah/Indiana/Georgia Peer Exchange/Scan. Prepared by A. Amekudzi for Georgia Department of Transportation, September 2009.
6. Aktan, A.E. and Moon, F.L. Mitigating Infrastructure Performance Failures Through Risk-based Asset Management. Drexel Intelligent Infrastructure Institute, Drexel University, Philadelphia, PA, 2009.
7. Baird, M.E. and Stammer, R.E., Jr. Measuring Performance of the State Transportation Agencies: Three Perspectives. In Transportation Research Record: Journal of the Transportation Research Board, No. 1729, TRB, National Research Council, Washington, D.C., 2000, pp. 26-34.
8. Federal Highway Administration. Highways in the Coastal Environment. Hydraulic Engineering Circular No. 25, 2008.
9. Federal Highway Administration, Office of International Programs. Transportation Performance Measures in Australia, Canada, Japan and New Zealand. Report prepared by the International Scanning Team, Washington, D.C., 2004.
10. Federal Highway Administration. Transportation Asset Management in Australia, Canada, England, and New Zealand. Prepared for NCHRP Panel 20-36. Washington, D.C.: Transportation Research Board, 2005.
11. Federal Highway Administration. U.S. Domestic Scan Program: Best Practices in Transportation Asset Management. NCHRP Project 20-68. Report prepared by Cambridge Systematics, Inc. with Michael D. Meyer, Ph.D., National Cooperative Highway Research Program, Transportation Research Board, February 2007.
12. Fischer, J.M., A.A. Amekudzi and M.D. Meyer. Transportation Performance Management: A Resource Catalogue. Working Report, May 2011.
13. Georgia Department of Audits and Accounts. Home/About Us/Performance Audits Operations Division. Accessed at http://www.audits.ga.gov/PAO/PAOdivision.html in July 2011.
14. Georgia Department of Transportation. Bridge Prioritization Formula, January 13, 2010.
15. Georgia Department of Transportation. FY 2011 Strategic Plan Update, Draft, June 2010.
34
Best Practices in Selecting Performance Measures and Standards for Effective Asset Management
FINAL REPORT
16. Georgia Institute of Technology, An Inventory of Asset Management Tools at the Georgia Department of Transportation, PowerPoint Presentation prepared by O'Har, J. P., A. Amekudzi and M. Meyer for Georgia Department of Transportation, January 15, 2009.
17. Haimes, Yacov Y. Risk Modeling, Assessment, and Management. 2nd Edition. Hoboken, New Jersey: John Wiley & Sons, Inc., 2004.
18. Haimes, Yacov Y. Risk Modeling, Assessment, and Management. 3rd Edition. Hoboken, New Jersey: John Wiley & Sons, Inc., 2009.
19. Haimes, Y. Y. Total Risk Management. Risk Analysis, Vol. 11, No. 2, 1991, pp. 169-171.
20. Helton, J.C. and D.E. Burmaster. Treatment of Aleatory and Epistemic Uncertainty in the Performance of Complex Systems. Reliability Engineering and System Safety, Vol. 54, 1996, pp. 91-94.
21. Kaplan, S. and B. J. Garrick. On the Quantitative Definition of Risk. Risk Analysis, Vol. 1, No. 1, 1981, pp. 11-27.
22. Maconochie, J.A. U.S. Highway Bridge Risk Model - Development, Summary Results, and Applications for Federal and State Transportation Agencies. Transportation Research Board 89th Annual Meeting, Washington, D.C., 2010.
23. Meyer, M.D. Design Standards for U.S. Transportation Infrastructure: The Implications of Climate Change. Georgia Institute of Technology. Commissioned Paper. Transportation Research Board, Washington, D.C., 2008.
24. NAMS Group, International Infrastructure Management Manual, 2006.
25. National Cooperative Highway Research Program, AASHTO Transportation Asset Management Guide, Vol. 1, 2002.
26. National Cooperative Highway Research Program, AASHTO Transportation Asset Management Guide, Vol. 2 - A Focus on Implementation, 2010.
27. National Cooperative Highway Research Program, Transportation Research Board. Transportation Performance Management: Insight from Practitioners. NCHRP Report 666. Report prepared by Cambridge Systematics, Inc., and High Street Consulting, Washington, D.C., 2010.
28. National Transportation Safety Board. Collapse of I-35W Highway Bridge: Minneapolis, Minnesota, August 1, 2007. Washington, D.C.: 2008.
29. O'Har, J. P. and A. A. Amekudzi. Effect of Uncertainty on Project Prioritization and Target Setting, Working Paper, May 2011.
30. Pei, Y. L., Fischer, J. M. and A. A. Amekudzi. Performance Measurement in State Departments of Transportation: A Literature Review and Survey of Current Practice. 89th Annual Meeting of the Transportation Research Board, CD-ROM, January 2010a.
31. Pei, Y. L., A. A. Amekudzi, M. D. Meyer, E. M. Barrella and C. L. Ross. Performance Measurement Frameworks and Development of Effective Sustainable Transport Strategies and Indicators. Transportation Research Record: Journal of the Transportation Research Board, No. 2136, Transportation Research Board of the National Academies, Washington, D.C., 2010b, pp. 73-80.
32. Piyatrapoomi, N., Kumar, A. and Setunge, S. Framework for Investment Decision-Making under Risk and Uncertainty for Infrastructure Asset Management. Research in Transportation Economics, Vol. 8, 2004, pp. 199-214.
33. Poister, T. H. NCHRP Synthesis 326: Strategic Planning and Decision Making in State Departments of Transportation. National Cooperative Highway Research Program, TRB, National Research Council, Washington, D.C., 2004.
34. Raaum, R. and R. Campbell. Challenges in Performance Auditing, APA CPAG Research Series: Report No. 5, June 2006.
35. Winkler, R.L. Uncertainty in Probabilistic Risk Assessment. Reliability Engineering and System Safety, Vol. 54, 1996, pp. 91-94.
Best Practices in Selecting Performance Measures and Standards
for Effective Asset Management
APPENDICES
Submitted to: Georgia Department of Transportation Angela Alexander, angela.alexander@dot.ga.gov
Submitted by: Georgia Institute of Technology Adjo Amekudzi, Ph.D., adjo.amekudzi@ce.gatech.edu Michael Meyer, Ph.D., mike.meyer@ce.gatech.edu
July 2011
TABLE OF CONTENTS

Appendix 1: Literature Review Synthesis (October 2009/Text Report)
Appendix 2(a): Pei, Y. L., J. M. Fischer and A. A. Amekudzi. Performance Measurement in State Departments of Transportation: A Literature Review and Survey of Current Practice. 89th Annual Meeting of the Transportation Research Board, January 2010. (Peer-reviewed conference paper)
Appendix 2(b): State-of-the-Practice Survey of the State DOTs on the Use of Performance Measures and Targets (April 2010/PowerPoint Report)
Appendix 3: Report on Interview of GDOT Officials on Status of Asset Management at GDOT (September 2009/Text Report)
Appendix 4: Inventory of Asset Management Tools at GDOT (June 2009/PowerPoint Report)
Appendix 5(a): Transportation Asset Management Best Practices at State DOTs: An Update to the Domestic Scan (July 2009/PowerPoint Report)
Appendix 5(b): Asset Management Peer Exchange: Utah/Indiana/Georgia (September 2009/Text Report)
Appendix 6: O'Har, J. P. and A. A. Amekudzi. Effects of Performance Uncertainty and Data Disaggregation on Project Prioritization in Transportation Asset Management. (Working Paper)
Appendix 7: Fischer, J. M., A. A. Amekudzi and M. D. Meyer. Transportation Performance Management: A Resource Catalogue. Working Report, May 2011. (Text Report)
Best Practices in Selecting Performance Measures and Standards
for Effective Asset Management
APPENDIX 1
Best Practices in Selecting Performance Measures and Standards
A LITERATURE REVIEW
Submitted to: Georgia Department of Transportation Georgene Geary, georgene.geary@dot.ga.gov
Organization: Georgia Institute of Technology
Principal Investigator: Adjo Amekudzi, Ph.D. Co-Principal Investigator: Michael Meyer, Ph.D., P.E. Graduate Research Assistant:
Yi Lin Pei
October 5, 2009 (Revised)
TABLE OF CONTENTS
1. Introduction
2. Selecting Performance Measures
3. Selecting Performance Targets
4. Performance Frameworks
References
1. INTRODUCTION
This literature review was conducted to highlight current and best practices for selecting performance measures and targets in transportation asset management in particular, and transportation planning in general, both domestically and internationally. The review is part of the deliverables for the project: "Best Practices in Selecting Performance Measures and Targets in Transportation Asset Management," funded by the Georgia Department of Transportation. The transportation planning, transportation asset management and business management literature were reviewed to identify current and best practices. What follows is a summary of key considerations for selecting performance measures and targets. In addition, the review touches upon frameworks for developing performance measures and best practices in transportation agencies. The Appendix includes an annotated bibliography of the documents that provided major content for this report.
2. SELECTING PERFORMANCE MEASURES
Performance measures are specific numerical measurements to track progress toward particular goals and objectives of an agency. The central function of any performance measurement process is to provide regular, valid data on indicators of performance. The current planning and management literature identifies some basic principles of good performance measurement presented below.
1. Performance measures should flow directly out of an agency's mission and objectives.
Establishing a performance measurement process begins with identifying a program's or agency's mission and its basic objectives. Setting clear, concise and achievable goals and objectives is critical to the success of any planning effort (CS, 2000). What is the agency intended to accomplish? Performance information should flow from, and be based upon, the answer to this fundamental question. A mission/objectives statement should identify the major results an agency or program seeks to achieve. It should also identify who the agency's or program's customers are, unless it is already obvious. Who benefits from the program? Who are direct recipients? Who are indirect recipients? What other people not directly targeted by the program can be significantly affected? (Hatry and Wholey, 2007)
This best practice not only includes the need to create an integrated framework that aligns agency objectives across different levels vertically (i.e., one that is vertically integrated), but also ensures that such a framework is horizontally integrated across the agency's functional units. While top-to-bottom consistency is essential for providing a strong linkage between policy objectives and decision making, horizontal consistency allows for tradeoffs to be made across different functional areas. Ohio DOT and New York State DOT have a vertical alignment of performance measures while Michigan DOT conducts
regular meetings across different functional units horizontally to improve communication (CS, 2006). Virginia DOT's strategic process emphasizes the use of performance measures in achieving each goal that is ultimately tied to improving organization accountability (Poister, 2004).
2. Performance measures should provide a "balanced" picture of the agency's business.
The populated framework of performance measures should provide a concise overview of the organization's performance (Kennerly and Neely, 2002). The measures should reflect financial and non-financial measures, internal and external measures, and efficiency and effectiveness measures (Kaplan and Norton, 1992; Keegan et al., 1989). General categories of information used in performance measurement systems are given below. Effective performance measurement systems tend to be results-oriented, incorporating output and outcome measures.
Inputs relate to the resources (i.e., expenditures or labor) dedicated to the program to produce output and outcomes.
Outputs relate to the products and services delivered by the program, such as the amount of work done by the organization or its contractors (e.g., the number of miles of road repaired).
Outcomes relate to conditions that are outside the activity of a program itself and that are of direct importance to customers and the general public. While outputs are the work that the organization does, outcomes are what these outputs accomplish for the customer. Outcomes are not what the program itself did but the consequences of what the program did.
Efficiency or productivity relates to the relationship between the amount of input and the amount of output or outcome of an activity or program (Hatry and Wholey, 2007).
Input and output measures have been more common in the past two decades. However, there has been a general movement toward managing for results or outcomes, driven by increased demands for accountability (Poister, 2007). Results-based measures not only reflect an agency's success in meeting stated goals and objectives, they also focus on the beneficiaries of the agency's service, i.e., the customers. However, an over-focus on outcome measures has been criticized recently owing to difficulties in measurement, higher cost, and their technical nature that makes them harder to understand (CS, 2006). As a result, many agencies are reverting to including output measures, as a blend of output and outcome measures is believed to be preferable to using either type alone (CS, 2006). At the state level, MnDOT has already started to reemphasize output measures at lower levels and Montana DOT has recognized the difficulty in coordinating pavement and bridge preservation strategies using outcome indicators (CS, 2006). Internationally, officials also have a good understanding of the importance of using both output and outcome indicators. In
Canada, for instance, a chain that divides outcomes into immediate, intermediate and ultimate outcomes is used in each functional area to support the ultimate objective of developing a more sustainable transportation system (FHWA, 2004).
The use of input and efficiency measures can help with tracking how efficiently agencies are using their resources to generate outputs and outcomes.
3. An effective performance measurement system will have few and well-defined measures that are tied to a handful of clear goals to be achieved within a specific timeframe.
Conventional practice has it that what gets measured gets managed, and that a short, targeted list of performance measures is likely to be applied more effectively than a long and unfocused one. An effective performance measurement framework will contain a handful of clear objectives that are linked with the organization's goals. More goals are not necessarily better than fewer, as fewer goals can provide a clearer picture of the agency's priorities and have a higher likelihood of being used effectively. Along the same lines, the performance measures used under each goal should be kept to a meaningful few that help to measure progress in reaching that goal. Numerical targets are also better than obscure or 'aspirational' targets for tracking progress toward goals. Also, specifying a timeframe for achieving strategic goals is highly recommended to ensure accountability.
As performance measures are increasingly used to report to external audiences, such as the governor and the general public, creating more performance measures simply to comply with external mandates sometimes becomes attractive. However, performance measures appear to be more useful when they are created out of a genuine commitment on the part of agency officials to measure performance and use the data meaningfully toward achieving agency goals. Among DOTs, decision rules in developing performance measurement systems, such as tracking only performance that the agency seeks to influence and believes it can feasibly impact, are used to keep the number of measures both meaningful and manageable.
In addition, formal performance measurement frameworks may be used to develop meaningful measures. Such structures tend to be useful when the accompanying performance measures are well thought out to link with broader agency goals and objectives. For instance, Montana DOT uses a balanced scorecard model for performance measurement. After implementation, the agency realized that too many action items were used, some of which were rather general with no indication of tasks to be undertaken, while some had unpredictable effectiveness. As a result, the plan became too cumbersome and the DOT worked to reduce action items down to about 150 from 200 (Poister, 2004).
An international scanning tour on performance measurement found that the most important measures are those needed to influence budget allocation
and investment decisions, and that long lists of measures that lack focus tend to exert little influence on decision making. For example, Japan uses a core set of 17 performance measures, which not only reflect issues considered really important but also simplify data collection and reporting and lessen the burden on staff (FHWA, 2004).
Lastly, a harder task lies in how to select performance measures that are collectively unbiased and lead to improved performance in the right direction. Potential biases need to be thought through as measures are selected for tracking progress toward broader agency goals.
4. Customer satisfaction is a key performance measure.
Customer satisfaction should be a key factor in setting up performance measurement for a transportation system, as the end purpose of transportation infrastructure is to provide service to its users, the customers. A good performance measurement system must therefore have systemic customer feedback.
Several state DOTs have a customer focus that is reflected in their performance measurement systems. In the early 1990s, for example, Minnesota DOT began to survey motorists in the state to assess the percentage that are satisfied with travel times. PennDOT uses surveys to determine the condition of roads used by motorists. Montana DOT conducts public opinion surveys and meets with stakeholder groups regarding the outcomes of its Performance Programming Process. The process provides feedback to the agency and assists in future policy formulation. New Mexico DOT's Compass incorporates 16 customer-focused measures (Bremmer et al., 2005).
Internationally, measures of customer satisfaction are common. For example, New Zealand's approach to customer satisfaction focuses on identifying customer dissatisfaction. By asking more focused questions in customer surveys, agencies are more successful in getting feedback to determine organizational performance.
Balancing the satisfaction of the public/media, the legislature and management is important within a political environment. State DOTs such as New Mexico, Minnesota and Washington have demonstrated real-time success in balancing the three factors (Bremmer et al., 2005).
A framework such as the Balanced Scorecard used in Business Management and to a limited extent in Transportation can be effective in balancing customer, financial, internal business and growth perspectives (Poister, 2007) across vertical and horizontal levels.
5. Performance measurement systems should be periodically evaluated in an iterative process.
A performance measurement system should evolve in response to evolving goals and changing priorities of an agency, and data availability, among other factors. A performance measurement system therefore needs to be periodically refined through evaluation and feedback.
There are several ways to structure the feedback process to support policy and resource allocation decisions in asset management. For example, Florida DOT uses a Continuous Cycle approach in which policy is developed and implemented, performance is measured, and the results affect the long and short range plans through the adjustment of policies (CS, 2006). Frequent performance reviews, such as the quarterly management review adopted by Colorado DOT, can also be used, so that problems, e.g., underperformance, can be recognized quickly and corrected. In addition, performance evaluation can be achieved through public feedback. Such performance measurement systems are viewed as customer focused. Montana DOT, for example, conducts public opinion surveys that provide critical feedback to its performance programming process and help with future policy formulation (CS, 2006).
In addition to helping with policy formulation, the performance measures themselves can be revised and improved. In this regard, DOTs can experiment to develop and revise approaches to performance measurement in an attempt to resolve issues with quality, methodology, reliability, cost and usefulness. For example, before-and-after studies are important elements of performance measurement in Japan and Australia (FHWA, 2004). The impact of adopted actions on selected performance measures serves as feedback to the decision making process, helping officials to determine the likely results of similar actions. The relative usefulness of performance measures should be periodically evaluated to help refine the measures as needed.
6. Performance measures should use good and available data that the agency can reasonably collect without straining its capacity.
As outlined in the AASHTO Transportation Asset Management Guide (2002), good data are critical to performance measurement. However, balancing data availability and affordability with quality and analytical rigor is often a difficult task. While having too little data makes it difficult to track performance effectively, having too much data is not only expensive but less cost-effective, and potentially confusing and lacking in cohesiveness to the general public and other external stakeholders.
An integrated data collection strategy can be used to address this issue. Centralizing the data collection function at the highest level possible can also lessen the effort needed for data collection and allow for greater consistency. For example, the small size of Maryland gives the DOT an advantage of having
only one inspecting team to conduct statewide data gathering, saving costs and providing greater data consistency (CS, 2006).
Internationally, some of the more successful performance measurement programs have occurred in data-rich environments with a history of strong data collection and analysis. Sophisticated measures can be used in areas where there is a need and the institutional capacity allows for the collection of supporting data.
7. Performance measures increasingly include measures of environmental quality and sustainability.
A recent survey of the 50 state DOTs indicates that various state DOTs appreciate the importance of sustainability in their internal and external activities, and can point to specific initiatives that demonstrate their interest in or commitment to sustainability (Barrella et al., Forthcoming).
In Transportation, sustainability is a term used to capture the balance between transportation mobility and accessibility, and the economy, environment and social quality of life including equity. The concept of sustainability is increasingly important as energy and climate change, and other related issues have become a national and global priority.
A number of DOTs have performance measurement systems that include sustainability factors, particularly environmental factors, e.g., Washington State DOT, Missouri DOT and Iowa DOT. Caltrans and Texas DOT have adopted a range of sustainability indicators. A number of DOTs have also developed green rating systems that use sustainability principles and measures to prioritize projects for development, e.g., GreenLITES, i.e., Green Leadership in Transportation Environmental Sustainability (NYSDOT); Greenroads (Washington State DOT); and STARS, i.e., Sustainable Transportation Access Rating System (Oregon). Sustainability measures in Transportation are increasingly being used internationally as well, e.g., in the U.K. and New Zealand. In addition, while dollar valuations of environmental measures such as air pollution have long existed, the monetization of other sustainability measures is gaining more traction (Weisbrod et al., 2007).
All of these activities reflect a growing interest in incorporating environmental quality and sustainability concepts and measures in Transportation planning.
8. Performance measurement reporting and communication should be clear and easy to understand.
Increasing demands for accountability make performance measurement communication a critical issue in transportation agencies today. Effective reporting to external stakeholders, i.e., reporting on budget and demonstrating on-time performance, are critical to obtaining funding. Various approaches are
used by DOTs to communicate key issues to political decision makers and the general public.
One approach is the scorecard where key indicators are presented as measures of success in achieving objectives. Actual values are presented against target values for designated time periods. For example, Missouri DOT tracks the implementation of various strategies using scorecards in key areas; these scorecards are reviewed by top management on a quarterly basis (Poister, 2004). While scorecards may be used largely for internal communication, report cards and reports are developed by various DOTs, e.g., Florida DOT, Washington State DOT and Virginia DOT, to report performance to external stakeholders. Posting these reports on the Internet not only increases readership but also improves transparency and accountability.
The dashboard has been designed to report progress at a glance, often employing symbols and colors to display results. Virginia DOT has an online dashboard that can be easily updated to track progress and that allows different units within the Department to easily crosscheck each other's progress (Bremmer et al., 2005). Minnesota DOT has developed dashboard reports that clearly show performance versus targets for each department (Bremmer et al., 2005).
Visualization of critical information is important to effectively communicate performance to stakeholders. Ineffective presentation can result in the loss of funding and public support, and impede progress.
3. SELECTING PERFORMANCE TARGETS
One of the important gaps in managing transportation performance is how to set performance targets, or standards, for performance measurement. While there is extensive and growing literature on performance measures, relatively little attention has been given to how to set performance targets and the role of targets in transportation planning (Schmitt, 2007). A research proposal was generated for setting effective performance targets (Schmitt, 2007). NCHRP Project 8-70 is developing a comprehensive set of methods for establishing performance targets for all aspects of transportation. The final report is anticipated this year. NCHRP Report 551 on Performance Measures and Targets for Transportation Asset Management (CS, 2006) provides some guidance on setting performance targets. Despite its focus on asset management, the steps it outlines can be extended to other DOT functional divisions as well. The report recommends that consideration should be given to financial, policy, technical and economic factors when setting performance targets. In addition, it suggests that the establishment of long term and short term targets should follow seven logical steps as follows (CS, 2006):
1. Define contexts and time horizons 2. Select scope of measures for targets 3. Develop long-term goals
4. Consider funding availability 5. Analyze resource allocation scenarios and tradeoffs 6. Consider policy and public input 7. Establish targets and track progress
A study of performance targets in the UK provides additional information on different methods of establishing performance targets, and the tradeoffs among the methods (Marsden and Bonsall, 2006). It first summarizes the motivations for developing targets: legal and contractual obligations, resource constraints, consumer orientation and political aspirations. Based on these motivations, three ways to set targets are discussed. Model-based methods rely on computer models to examine how a given indicator varies under a range of scenarios; this is the most realistic method and allows different scenarios to be examined. Where variables cannot be modeled, extrapolation and evidence-led judgment can be used in a second method that is based on historical data. The most subjective method is aspirational, where targets are set because they should be set. While each method has positive and negative aspects, the best method is perhaps one that establishes targets that can be tied back to the most fundamental goals (Marsden and Bonsall, 2006). The target setting procedure presented in NCHRP Report 551 appears to be a combination of the three methods.
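The extrapolation method is easy to illustrate. The sketch below fits a linear trend to a few years of historical values of an indicator and projects it forward to suggest an evidence-led target; the data are hypothetical.

years = [2005, 2006, 2007, 2008, 2009]
pct_good_pavement = [78.0, 79.5, 80.2, 81.0, 82.1]

# Ordinary least-squares slope for a single predictor (year).
n = len(years)
x_bar = sum(years) / n
y_bar = sum(pct_good_pavement) / n
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(years, pct_good_pavement))
         / sum((x - x_bar) ** 2 for x in years))

# Project the trend three years beyond the last observation.
target_2012 = y_bar + slope * (2012 - x_bar)
print(round(target_2012, 1))  # -> 85.0, a candidate evidence-led target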
A case study on performance measures and target setting in Detroit's planning process provides a good example of performance target setting in the US. The Southeast Michigan Council of Governments (SEMCOG) uses the AssetManagerNT program to explore different scenarios in program funding and the expected future performance of different program areas, such as bridge preservation. The target setting process not only involves running different scenarios, but involves the engagement of stakeholders to determine which scenarios are most positively received (Guerre and Evans, 2008). Such a process that considers different constraints and involves stakeholder input can generate realistic and effective performance targets.
4. PERFORMANCE FRAMEWORKS
Performance frameworks are structured processes that provide guidance for selecting performance measures. They explain the rationale used in selecting adopted measures. While various agencies may have informal and undocumented processes for selecting performance measures, there is usually a rationale behind the adoption of performance measures. Some examples of formal frameworks are given below to highlight documented procedures for selecting performance measures. Documented processes can help agencies reevaluate measures periodically to keep them current with agency goals and objectives, customer expectations and other internal and external factors.
Balanced Scorecard Framework
Performance frameworks from the Management and Accounting fields are being used to a limited but growing extent in the Transportation field. Perhaps the most popular example is the Balanced Scorecard framework for performance measurement.
The Balanced Scorecard model was conceived in 1992 by Kaplan and Norton (12manage, 2009). It provides a strategic and balanced approach to measuring corporate performance from four perspectives: 1) finance, 2) the customer, 3) business process and 4) learning and growth. This framework has helped companies to achieve success by focusing the organization on a few strategic efforts, integrating various programs and vertically integrating measures at all levels in an organization to improve performance (12manage, 2009).
Because of the success of this model, various government organizations, including some state DOTs, have adopted the Balanced Scorecard framework. The City of Charlotte DOT (North Carolina) was the first agency to adopt the model. Illinois DOT and TxDOT have also customized the model (Poister, 2007; Wholey et al., 2004). Figure 1 shows the modified model for TxDOT that still keeps four quadrants of measurement, but with modified contents.
The Balanced Scorecard Framework identifies goals that relate directly to the internal operations of the agency and to external stakeholders, such as customers and political decision makers, who are important elements of the agency's operations and success. Importantly, the Balanced Scorecard Framework also identifies "process" and "results" elements, which can help the agency fine-tune its efficiency in meeting outcomes while tracking its progress in achieving those outcomes. The Balanced Scorecard Framework reflects that the structure used in developing performance measures can influence the overall effectiveness as well as the efficiency of the agency.
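In practice, a scorecard reduces to a small table of measures, targets and actuals under the four perspectives. The sketch below is illustrative only; the measures and values are assumptions, not those of TxDOT or any other agency.

# Each perspective tracks one measure: (name, actual, target, higher_is_better).
scorecard = {
    "finance":           ("maintenance cost per lane-mile ($K)", 210, 200, False),
    "customer":          ("travelers satisfied (%)",              82,  85, True),
    "business process":  ("projects delivered on time (%)",       93,  90, True),
    "learning & growth": ("staff trained in TAM (%)",             70,  75, True),
}

for perspective, (measure, actual, target, higher_is_better) in scorecard.items():
    met = actual >= target if higher_is_better else actual <= target
    print(f"{perspective}: {measure}: {'on track' if met else 'behind'}")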
Figure 1: The Balanced Scorecard Framework (Doyle, 1998)
REFERENCES
12manage. (2009). Balanced Scorecard. Retrieved from 12manage, the executive fast track: http://www.12manage.com/methods_balancedscorecard.html
American Association of State Highway and Transportation Officials (AASHTO). Transportation Asset Management Guide. Prepared by Cambridge Systematics, Inc. National Cooperative Highway Research Program (NCHRP) 20-24(11). November 2002.
Barrella, E., Amekudzi, A., Meyer, M., Ross, C., and D. Turchetta. Best Practices and Common Approaches for Considering Sustainability at U.S. State Departments of Transportation. Forthcoming, 2010 Annual Transportation Research Board Conference. Transportation Research Board, Washington, D.C., January.
Bremmer, D., Cotton, K. C., & Hamilton, B. (2005). Emerging performance measurement responses to changing political pressures at state departments of transportation: Practitioners' perspective. Transportation Research Record. (1924), 175-183.
Cambridge Systematics (CS). (2000). A Guidebook for Performance-Based Transportation Planning, National Cooperative Highway Research Program Report 446, Transportation Research Board, Washington D.C.: National Academy Press.
Cambridge Systematics (CS). (2006). Performance measures and targets for transportation asset management (NCHRP Report 551). Washington, D.C.: Transportation Research Board.
Doyle, D. (1998). Performance measure initiative at the Texas Department of Transportation. Transportation Research Record, (1649), pp. 124-128.
Federal Highway Administration, Office of International Programs (2004). Transportation performance measures in Australia, Canada, Japan, and New Zealand. (Scanning Tour Report), Washington, D.C.
Guerre, J., & Evans, J. (2008). Applying System-Level Performance Measures and Targets in Detroit's Metropolitan Planning Process. Transportation Research Board 2009 Annual Meeting. Washington, D.C.: Transportation Research Board.
Hatry, H. P., and J. S. Wholey. (2007). Performance Measurement: Getting Results. 2nd Ed. Washington, D.C.: Urban Institute Press.
Kaplan, R. S. and Norton, D. P. (1992). The Balanced Scorecard: Measures that Drive Performance. Harvard Business Review, 70 (1, January/February), 71-79.
Keegan, D. P., Eiler, R. G., and Jones C. R. (1989). Are your performance measures obsolete? Management Accounting (US), 70 (12, June), 45-50.
Kennerly, M. and Neely, A. Business Performance Measurement: Theory and Practice. Cambridge University Press, Cambridge: 2002.
Marsden, G., & Bonsall, P. (2006). Performance targets in transport policy. Transport Policy. 13 (3), 191-203.
Neely, A. D., Adams, C. (2002). The performance prism. Retrieved from performance-measurement.net: http://www.performance-measurement.net/newsdetail.asp?nID=31, March 20.
Neely, A. D., Adams, C., & Kennerley, M. (2002). The performance prism: The scorecard for measuring and managing business success. London: Prentice Hall Financial Times.
Poister, T. H. (2004). Strategic planning and decision making in state departments of transportation: A synthesis of highway practice (NCHRP Synthesis 326). Washington, D.C.: Transportation Research Board.
Poister, T. H. (2007). Performance measurement in transportation agencies: State of the practice. Handbook of Transportation Policy and Administration, 485-504.
Schmitt, R. (2007). Research Problem Statement: Setting Effective Performance Targets for Transportation Programs, Plans and Policy. Challenges of Data for Performance Measures Workshop (pp. 106-108). San Diego: Transportation Research Board.
Weisbrod, G., Lynch, T., and M. Meyer. Monetary Valuation Per Dollar of Investment in Different Performance Measures, NCHRP 8-36-61, Transportation Research Board, February 2007.
Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2004). The handbook of practical program evaluation. San Francisco, Calif: Jossey-Bass.
Performance Measurement in State Departments of Transportation: A Literature Review and Survey of Current Practice
Yi Lin Pei, MSCE, MCRP Cambridge Systematics, Inc. 730 Peachtree St. Atlanta, GA 30308 Tel: 352-246-3767 Fax: 404-443-3201 Email: ypei@camsys.com
Jamie M. Fischer (Corresponding Author) Graduate Research Assistant Transportation Systems Program School of Civil & Environmental Engineering Georgia Institute of Technology Atlanta, GA 30332-0355 Tel: 404-643-3642 Fax: 404-894-2278 Email: jm.fischer@gatech.edu
Adjo A. Amekudzi, Ph.D. Associate Professor Transportation Systems Program School of Civil & Environmental Engineering Georgia Institute of Technology Atlanta, GA 30332-0355 Tel: 404-894-0404 Fax: 404-894-2278 Email: adjo.amekudzi@ce.gatech.edu
Submitted: Wednesday, May 19, 2010 Word Count: 6249 Words + 1(x250) Figures + 3(x250) Tables = 7249 Word Equivalent
ABSTRACT
Performance measurement, when properly implemented, can ensure efficiency, accountability and transparency for transportation agencies. This principle is expected to be highlighted in the next federal legislation for surface transportation, which will call for the explicit use of performance-based measures as part of a strategic planning process. Clearly, understanding the current state of performance measurement practice in the United States is important for identifying and filling existing gaps. The objective of this paper is therefore twofold: 1) to explore the use of performance measurement in state DOTs through a review of the literature, and 2) to explore the use of performance measurement in general, in setting targets and in asset management, through a comprehensive survey. Results from the literature review show that performance measurement systems in transportation agencies are increasingly strategically focused and tied to the long term goals of the organization. Performance measurement is also used in different program areas, such as asset management, and in other ways, such as benchmarking for comparative performance. While gaps exist in understanding performance target setting, recent efforts to learn from peer countries foretell a promising future of development in the area of performance measurement. Results from the survey show that there is increased integration between performance measurement systems and strategic planning. Second, benchmarking is observed to be an important method of measuring performance. Third, target setting, while it exists in most DOTs, could be a more formal process. Fourth, asset management is viewed as an important area by most DOTs, and more integrated systems are needed. The implications of the results for transportation in the US are direct and significant in several ways: 1) on a strategic level, the developments noted in performance measurement can help transportation agencies better prepare for the reauthorization of the federal surface transportation legislation; 2) the identification of a performance measurement system can help agencies stabilize their financial situation; 3) a comprehensive strategic planning framework can lead to better integration and accountability through the local, state, and regional levels; and 4) such a system will eventually lead to long term system effectiveness, transparency and longevity. Such a system would also be dynamic and readily responsive to changes in DOTs.
INTRODUCTION
Performance measures, defined as indicators of system effectiveness and efficiency, are increasingly becoming a central focus in transportation planning in the United States. A performance-based transportation planning system is important because, as the saying goes, "what gets measured gets done." A proper performance-based measurement system can help ensure effectiveness, efficiency, accountability and transparency. The next federal legislation for surface transportation will call for the explicit use of performance-based measures as a central tenet, acknowledging the importance of using performance measurement as part of a strategic planning process. The current US Department of Transportation (DOT) Strategic Plan is already performance based: under each strategic goal, outcomes, strategies, performance measures and external factors are clearly laid out (1). It is a results-oriented strategic plan. DOTs at the state level adopt more concrete and context-specific strategic plans that can be used to execute, track and monitor progress to ensure accountability, especially in light of the recent economic climate.
Clearly, understanding the current state of performance measurement practice in the United States is important for identifying areas of improvement and addressing them. The purpose of this paper is to illuminate the state of performance measurement practice in state transportation agencies. The paper does the following: 1) explores the use of performance measurement in state DOTs through a review of the literature, and 2) explores the use of performance measurement in general, in setting targets and in asset management, through a comprehensive survey. The results should aid DOTs in preparing for the reauthorization of the federal surface transportation legislation and lead to long-term agency effectiveness, efficiency, accountability and transparency.
LITERATURE REVIEW OF PERFORMANCE MEASUREMENT IN STATE DOTS
Review of Performance Measurement
State DOTs have long used performance measurement for analyzing system processes, outputs and outcomes as part of the engineering and planning disciplines. Outputs are products and services delivered by the agency (e.g., miles of roadway repaired), whereas outcomes are the consequences of what the program did (e.g., percent reduction in crashes) (2). Yet using performance measurement to manage, especially for accountability, is a relatively new concept (3). Privatization and management reforms have affected performance management in state DOTs. For instance, the balanced scorecard model, by far the most widely used business performance model, has also been widely adopted by transportation agencies. In addition to privatization and a need to be competitive, other important factors have triggered interest in DOT performance measurement. These include: 1) the need to support strategic planning processes with information on DOT performance; 2) demands for increasing accountability from the public, legislators, and governors; 3) government-wide mandates; 4) growing commitment to customers; 5) leadership changes; and 6) funding and politics (3-5).
As far back as 1993, NCHRP Report 357 (6) sought to identify and define the key program performance measures and indicators of state highway and transportation departments for effective and efficient administration. This report provided information on the value of goal setting, the necessity of tailoring performance measurement systems to the special characteristics and transportation needs of each state, and the need for public accountability. However, the
report also noted that while several states had initiated programs to develop and use performance measurement tools, no state had comprehensive experience (6).
NCHRP Report 357 (6) reflects a model of the first-generation transportation agency, where measures were typically developed in response to internal initiatives or to specific legislative requirements. Performance measures were often robust and well developed, but they were usually not meaningfully linked to other agency processes. Second-generation frameworks, on the other hand, which emerged in the late 1990s, usually tied measurement to strategies for tracking business functions and planning goals (4). During this period, many states took significant steps to measure the performance of their programs and services, moving beyond traditional operation-level, system-oriented measures to monitoring inputs and immediate outputs. This generation of performance measurement also put greater emphasis on the customer's perspective. However, second-generation performance measures were often too complex, making results difficult to communicate, and agencies struggled to develop tools for reporting to stakeholders (3, 4).
In 2000, a guidebook was published linking performance measurement to transportation planning. It was intended to provide transportation organizations, planning practitioners, and decision makers with practical tools for considering system performance in the multimodal transportation planning and decision-making process. It also aimed to support the investment decisions needed in major transportation systems (7).
Subsequent publications have furthered these concepts and moved towards a third generation of performance measurement that uses dynamic approaches providing real time information. Third generation frameworks respond to the needs of agency leadership and the political context while placing high value on accountability (3). Performance measurement is also increasingly tied with strategic planning, asset management and other program areas. For instance, a handbook for CEOs and executives was developed on strategic planning that combined performance measurement and strategic management into a strategic performance measurement system. The report included detailed information about setting up and maintaining a strategic performance measurement system that can energize strategic management efforts, maintain focus, and enable organizational change, in addition to being able to track progress (8).
NCHRP Synthesis 326 examined the experience of state and provincial DOTs with strategic planning in 2004. It synthesized existing approaches to strategic planning and decision making, including performance measurement. Although many DOTs still struggled in 2004 with defining meaningful, reliable, accessible and cost-effective performance measures (9), they were placing a greater focus on customer satisfaction and feedback. Also, DOTs began using time-sensitive numerical targets around this time, and they began developing asset management programs within the frameworks of their strategic plans (9).
The importance of performance measurement and asset management is further explored in NCHRP Report 551 of 2006, which describes several principles to support asset management. The report determines that performance measures should be policy driven, strategic in perspective, considerate of tradeoffs and options, and should be implemented across organizational units and levels. In addition, performance decisions should be based on good information and should be evaluated and monitored through a feedback process (10).
Comparative performance measurement, also known as benchmarking, was recognized as important in the 2006 report Measuring Performance Among State DOTs (11). It found that many DOTs were still skeptical about benchmarking but were willing to try it. The report
summarized the basic elements for developing a comparative framework, including a multistate working group, adequate staff, identification of common strategic focuses, identification of templates for measures, data collection and analysis systems, and the sharing of information. A peer group study of several states tracked two performance measures, on-time performance and on-budget performance, and found that there was great variation among states (11).
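As a rough sketch of how such a comparative framework could compute the two peer-group measures, the Python fragment below derives on-time and on-budget delivery rates from project-level records. The state names and project records are hypothetical, not data from the report.

    # Hypothetical peer-group benchmarking sketch: compute on-time and
    # on-budget delivery rates per state from project-level records.
    # All data below are illustrative, not survey results.

    projects = {
        "State A": [  # (finished_on_time, finished_on_budget)
            (True, True), (True, False), (False, True), (True, True),
        ],
        "State B": [
            (False, False), (True, True), (False, True),
        ],
    }

    def delivery_rates(records):
        """Return (percent on time, percent on budget) for one state."""
        n = len(records)
        on_time = sum(1 for t, _ in records if t)
        on_budget = sum(1 for _, b in records if b)
        return 100.0 * on_time / n, 100.0 * on_budget / n

    for state, records in projects.items():
        on_time_pct, on_budget_pct = delivery_rates(records)
        print(f"{state}: {on_time_pct:.0f}% on time, {on_budget_pct:.0f}% on budget")

Comparing the printed rates across states is the essence of the peer-group exercise; the hard part in practice is agreeing on common definitions of "on time" and "on budget" so the rates are comparable.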
Learning from other countries can prove valuable. A 2004 scan of performance measurement systems in Australia, Canada, New Zealand and Japan showed that performance measures were used more extensively in those countries than in the US (12). These systems often emphasized safety; included output, outcome, customer satisfaction, and environmental indicators; integrated data collection; used before and after studies and benchmarks; and considered multimodal investment tradeoffs. Successful programs directly used performance measurement to influence programming decisions and budget allocation. The scan recommended, in particular, that safety and benchmarking should be emphasized more by the FHWA. Furthermore, the scan suggested that the US generate research, training, conference meetings, technical guidance and sustainability actions, using these international examples.
This review of the literature indicates that many states have committed to using performance measures, but the degree to which performance measurement systems are developed may differ widely among states. A list of attributes of good performance measurement, synthesized from the best practices found in the literature, has been generated as part of this study.
Review of Performance Targets
Little attention has been given to setting performance targets and to the role targets may play in transportation planning (13). NCHRP Report 551 (10) provides some guidance on setting performance targets. The report recommends that the setting of targets should consider financial, policy, technical and economic factors. In addition, it suggests that the establishment of long-term and short-term targets should follow seven logical steps (10):
1. Define contexts and time horizons,
2. Select scope of measures for targets,
3. Develop long-term goals,
4. Consider funding availability,
5. Analyze resource allocation scenarios and tradeoffs,
6. Consider policy and public input, and
7. Establish targets and track progress.
A 2006 examination of performance targets in the UK provides additional information on different methods for establishing performance targets, and the tradeoffs between them (14). It summarizes the motivations for developing targets as legal and contractual obligations, resource constraints, consumer orientation and political aspirations. Based on these motivations, three ways to set targets are discussed. Computer-based models examine how a given indicator varies under a range of scenarios; these are the most realistic methods. Where variables cannot be modeled, extrapolation and evidence-led judgment based on historical data can be used. The most subjective method is aspirational target setting, based on the desires of agency decision makers. While each method has positive and negative aspects, the best method is perhaps one that can establish targets that are tied back to the most
fundamental goals (14). The target setting procedure presented in NCHRP Report 551 appears to be a combination of the three methods.
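As an illustration of the extrapolation method described above, the sketch below fits a linear trend to historical values of a condition indicator and projects it to a target year. The years, condition values, and target year are assumed for illustration; a real process would temper the trend with policy and funding judgment.

    # Hypothetical sketch of extrapolation-based target setting: fit a
    # linear trend to historical data and project it to the target year.

    years = [2005, 2006, 2007, 2008, 2009]
    pct_good = [78.0, 77.5, 76.8, 76.0, 75.4]  # % lane-miles in good condition

    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(pct_good) / n
    slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, pct_good))
    slope_den = sum((x - mean_x) ** 2 for x in years)
    slope = slope_num / slope_den
    intercept = mean_y - slope * mean_x

    target_year = 2012
    trend_value = intercept + slope * target_year
    print(f"Trend-projected {target_year} value: {trend_value:.1f}% in good condition")

An agency might then adopt the trend value as a floor and set a more aspirational target above it, which is roughly the blend of methods that NCHRP Report 551 describes.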
The overseas literature on performance targets points to the need for the US to learn from its peers. A 2010 international scan, Linking Transportation Performance and Accountability (15), carried out in Australia, Great Britain, New Zealand and Sweden, studied how the transportation agencies of different countries use target setting to demonstrate accountability to elected officials and the public. This timely scan shed light on some important points about performance measurement and target setting in other countries:
1. A limited number of high-level national transportation policy goals that are linked to a clear set of measures and targets are used,
2. Intergovernmental agreements on how state, regional, and local agencies will achieve the national goals are negotiated while translating them into local context and priorities, and
3. The real value of performance management is the development of an improved decision making and investment process, not the achievement of many arbitrary short-term targets (15).
The scan is a step in the right direction to help the US develop better performance measurement systems for accountability. Further, a web tool called State Measures has been created that synthesizes documents such as state transportation statistical, annual, and performance reports (16). These recent developments show that challenges in the area of performance measurement are being actively addressed, perhaps in anticipation of the performance measurement requirements expected with the pending reauthorization of the surface transportation bill.
SURVEY ON PERFORMANCE MEASUREMENT AND TARGET SETTING IN STATE DOTS
Introduction
The goal of this comprehensive survey was to identify common approaches to selecting performance measures and targets in state transportation agencies. While other surveys have been carried out to understand performance measurement, no survey was found that looks at performance measures holistically from an agency's strategic planning perspective, and at whether agencies have systematic procedures for setting targets. This survey seeks to fill these knowledge gaps, in addition to providing information on the state of the practice in asset management at DOTs.
Survey Methodology
The survey took place from September 2009 to February 2010, and was conducted through telephone interviews and online questionnaires, consisting of eight survey questions. Mainly planning and performance measurement departments or divisions within the DOTs were contacted. Respondents were given a choice between being asked the questions on the phone, or
filling out the responses online. For the latter, respondents were further contacted to clarify responses if needed.
Survey Results
The survey achieved a good response rate: 39 state DOTs (or equivalents) out of the 50 states plus the District of Columbia responded, for a response rate of approximately 76%. Figure 1 below shows the geographic spread of the states that responded. The following sections present the survey results.
1. Organizational Strategic Goals and Objectives The purpose of this question is to find out whether an agency has a functional strategic plan on which performance measurement can be based. It also seeks to find how often the strategic plans are updated, how these plans are organized, and which specific goals are set.
Out of the 39 responses, 36 agencies responded "yes," indicating they have a strategic planning process, while 3 agencies responded "no," reflecting a high positive response rate of 92%. However, it should be noted that while most DOTs understood that strategic plans are different from Long Range Transportation Plans (LRTPs), certain DOTs gave objectives from their LRTPs.
The survey results show that most DOTs have a strategic plan that is updated annually, with some DOTs updating them biennially or in three- and four-year intervals. Plans updated less frequently than every five years are very rare. These results imply that most DOTs are proactive in responding to new planning imperatives. Short review intervals also provide feedback loops that can allow for faster improvements in performance.
There are different ways in which agency goals are organized. The most common is a one-tier arrangement. For instance, Virginia DOT lists six broadly defined goals addressing transportation issues such as safety, systems preservation, and mobility; outcomes such as economic vitality and quality of life; and organizational issues such as financial accountability and inter-agency collaboration.
The second way that goals can be presented is through a multi-tiered arrangement, where goals are broadly defined and more specific objectives are defined to clarify them. More intricate structures that are tied to a specific performance measurement framework, such as the balanced scorecard, are also used. For instance, New Hampshire DOT's goals are arranged according to a multi-tiered balanced scorecard structure, with four big-picture areas of performance, each with two to four specific goals.
The third way strategic goals can be arranged is in an area-specific manner, where different goals are listed for each division, and some agency-wide goals may overlap across divisions. The NYSDOT's strategic goals are organized according to seven specific program areas: highway and bridge infrastructure, public transportation system, statewide rail system, aviation system, multimodal transportation mobility, environmental sustainability, and multimodal transportation safety. Goals are described within each area, and in some cases organized into multiple subareas; for instance, different statewide rail system goals are specified for passenger and freight rail.
Naturally, there is no best way to arrange an agency's goals. So long as they are comprehensive and reflect agency and stakeholder priorities, they are potentially effective goals. Agencies range from having as few as four goals to having hundreds of goals arranged in several categories. However, most agencies have fewer than 10 goals. Also, the
survey has shown that most DOT goals fall into a few major categories. Table 1 below lists 29 categories which capture all of the goals used by survey respondents, sorted from the highest to the lowest number of occurrences. Although some of these categories are closely related, they have been formulated based on the wording of the various survey responses.
It can be seen from Table 1 that goals related to safety, systems preservation, and mobility are the most common of all strategic goals. The "transportation system safety and security" category relates to safe roadway designs and is represented in 67% of all survey responses. It is considered separately from the similarly worded "system preparedness, security" category, which relates to responsiveness in emergency situations; if the two were considered together, they would be represented in 76% of responses. The "asset management and systems preservation" category is especially important to note, in light of the recent and upcoming legislative focus on better infrastructure management. Its broad representation (56%) is in stark contrast with that of "public and alternative transportation expansion and improvement" and "highway expansion and capacity increase" goals. "Transportation system mobility" seems to be similar to "transportation system effectiveness and efficiency," which relate to such performance measures as travel time delay. Together, these mobility and efficiency goals are represented in 53% of the responses.
Compared with the goals mentioned above, which relate to the direct physical and functional aspects of the transportation system, outcome goals related to the economy, the environment, and society are less widely adopted. "Economic growth and vitality," which is a community-oriented outcome, is a goal area for 28% of respondents. Organization-oriented goals related to the economy, "agency conservation and business efficiency" along with "cost effective products," are also represented in 13% of responses. "Environmental quality and sensitivity" is specifically mentioned by 26% of the respondents, with others mentioning related ideas such as stewardship and sustainability.
"Customer satisfaction" is the most popular socially oriented goal area, appearing in 28% of responses. However, this relates more to agency image than to community outcomes. Other agency-oriented social goals are related to "employee innovation" and "agency leadership." Relatively few agencies set goals related to quality of life and accessibility, however, which are more community-oriented. Social equity was not mentioned explicitly by any of the respondents.
The concept of "sustainability," which was mentioned explicitly by two survey respondents, implies a commitment to improving economic, environmental, and social outcomes. Although the concept has become more widespread in recent years, the results of this survey show that sustainability is of less frequent concern to transportation agencies than are measures of effectiveness and efficiency. If agencies wish to improve their relative sustainability, they will need to incorporate human outcomes, related to the economy, the environment, and social equity, more explicitly into their strategic goals.
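The percentages quoted above follow directly from the Table 1 tallies divided by the 39 survey responses, as this short sketch reproduces for a few categories.

    # Reproduce the percentages in the text from the Table 1 tallies.
    responses = 39
    tallies = {
        "Transportation System Safety and Security": 26,   # -> 67%
        "Asset Management and Systems Preservation": 22,   # -> 56%
        "Economic Growth and Vitality": 11,                # -> 28%
        "Environmental Quality and Sensitivity": 10,       # -> 26%
    }

    for goal, tally in tallies.items():
        print(f"{goal}: {100 * tally / responses:.0f}% of responses")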
2. Strategic Planning and Performance Measures This question seeks to find out the extent to which DOTs use performance measures to monitor the progress of their strategic plans, and how the performance measures are structured. The aim is not to find out exactly what performance measures are used, but how they are tied to the overall strategic planning process. From the survey results, 23 out of the 39 DOTs indicated they have performance measures that are used to gauge success in achieving their strategic goals and objectives. While the rest do not have performance measures linked to the strategic plan, several DOTs are in the process of adopting such a system.
Most of the measures are organized in a multi-level structure where the highest level usually consists of the goals identified in the first question (also called key performance indicators), which shape the overall priorities of the organization. The second level contains more detailed objectives, and underneath these, specific strategies (action-level measures) are identified. This indicates that most DOTs align their measures with strategies to help achieve their objectives in an organized manner.
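A minimal sketch of this multi-level structure, with hypothetical goal, objective, and measure names, might look as follows.

    # Sketch of the goal -> objective -> action-level measure hierarchy
    # described above. All names and values are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Measure:
        name: str
        current: float
        target: float

    @dataclass
    class Objective:
        name: str
        measures: list = field(default_factory=list)

    @dataclass
    class Goal:  # highest level: a key performance indicator area
        name: str
        objectives: list = field(default_factory=list)

    safety = Goal("Improve transportation system safety", [
        Objective("Reduce roadway fatalities", [
            Measure("Fatalities per 100M VMT", current=1.4, target=1.0),
        ]),
    ])

    for obj in safety.objectives:
        for m in obj.measures:
            print(f"{safety.name} -> {obj.name} -> {m.name}: "
                  f"{m.current} (target {m.target})")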
The number of measures also varies greatly among DOTs. While some DOTs have only a few measures (e.g., Oklahoma DOT has 12 measures in 5 goal areas), others, such as Maryland, have over 400 measures across their different divisions. Several DOTs also follow a performance measurement framework that aids in measure formulation and better feedback. For instance, Florida DOT has long used a well-developed pyramid framework that sets goals and objectives from the policy level down to the project level. Interestingly, Florida DOT has also developed measures in a multi-perspective structure, in order to answer three separate questions (17):
- How we report on what we are accomplishing
- How we are being measured by others
- How we measure ourselves on an ongoing basis
These three questions are important because they distinguish performance measurement from benchmarking, where the latter can sometimes be more effective in improving the organization.
While measures are important in and of themselves, how well measures are tied to the overall planning process is perhaps more important. For instance, Caltrans provides a good framework in which the performance measurement system is directly linked to the operational plan and informs strategic planning through program evaluation (18). Another good example comes from Louisiana DOTD (19), which adopted a Performance Indicator Matrix that vertically integrates performance measures with objectives set at the program level. In this framework, each objective is clearly stated, and measures are divided into input, output, outcome, efficiency and quality categories. Also, Missouri DOT has a Tracker system built around 18 tangible results that correspond to over 100 performance measures. This system allows updates and tracking to be done easily.
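The sketch below shows one way a matrix of this kind could be represented, using the five measure categories from the Louisiana DOTD description above; the objective and measure names are invented for illustration.

    # Sketch of a performance indicator matrix in the spirit of the
    # Louisiana DOTD example: each program-level objective carries one
    # measure per category. Objective and measures are hypothetical.
    CATEGORIES = ("input", "output", "outcome", "efficiency", "quality")

    matrix = {
        "Preserve pavement condition on the state system": {
            "input": "Annual resurfacing budget ($M)",
            "output": "Lane-miles resurfaced",
            "outcome": "% lane-miles rated good or better",
            "efficiency": "Cost per lane-mile resurfaced",
            "quality": "% projects meeting smoothness spec",
        },
    }

    # Completeness check: every objective should define a measure in
    # each category before it is reported.
    for objective, measures in matrix.items():
        missing = [c for c in CATEGORIES if c not in measures]
        status = "complete" if not missing else "missing: " + ", ".join(missing)
        print(f"{objective}: {status}")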
3. Performance Measurement Review To carry the previous question further, this question attempts to find out how often the performance measures are reviewed. Out of the 23 DOTs that have a performance measurement system for strategic planning, 13 reported that they review their measures annually, four quarterly, three biennially and two semi-annually. The remaining agency reported that it reviews its measures when its plans are updated. The results indicate that most agencies that have performance measures in their existing strategic plan review them frequently, usually coinciding with how often the plans are updated.
4. Role of Performance Measures in Functional Divisions This question seeks to find out the extent to which performance measurement is used in each division of the DOT. For the 39 DOTs that responded, Table 2 lists the twelve most common functional divisions in which performance measurement is used. As can be seen from the results, performance measurement in planning and program development is considered important by
most DOTs, followed by operations and engineering. In order for performance measurement in these areas to be effective, agencies will need clear and comprehensive strategic plans that can guide operations, engineering and other action areas. Other divisions, which were listed by very few DOTs and are not shown in the table, include environmental divisions, multimodal divisions and public-private partnership initiatives. Also worth noting is that several DOTs report performance management within an operations division but not within maintenance, although these two activities (O&M) are often thought of as closely linked. Certainly, some agencies may deal with maintenance within operations. Nonetheless, more research could uncover whether performance management practices in maintenance might facilitate the shift to a system preservation focus.
According to the survey, there are two ways in which DOT functions are organized. The first consists of a one-tier structure, where the DOT functions are broken down into distinct divisions (usually more than 10) and each manages its functions independently. The second is a two-tier system, where the DOT is broken down into broad functional areas, such as engineering, headed by a director, and each area is further broken down into several divisions, such as maintenance, civil rights, and planning. Regardless of the organizational structure, functional divisions should reflect a comprehensive picture of the priorities the agencies represent.
Regarding the role of performance measures in each division, DOTs generally responded that performance measures are used for overall management and planning to advance projects and make business decisions. While several DOTs use performance measurement in each of their units, most DOTs only use it in certain business units for internal tracking. For certain DOTs, different performance measurement models are used by different divisions to track progress. Or, as in the case of NYSDOT, the same division may use a combination of multiple models. NYSDOT's Engineering Division utilizes a Performance Improvement Model (PIM), but the Office of Design, within the Engineering Division, has also incorporated a balanced scorecard approach and publishes its performance metrics and an overall index on the Department's internal website. A few other respondents also stated their use of a balanced scorecard system, and several DOTs have spearheaded such a process. However, the majority of DOTs could better use performance measurement in a manner that is both horizontally integrated across divisions and vertically integrated within a division, linking performance measurement more clearly to division and agency goals.
5. Performance Measures and External Stakeholders This question looks at the extent to which performance measures are used to engage external stakeholders. The 30 DOTs that engage with external stakeholders reported that their primary stakeholders are the public, the legislature, the governor and industry. Engaging with external stakeholders is important to ensure customer satisfaction, transparency and accountability, and to improve the organization through useful and unbiased feedback. The most common ways DOTs engage with external stakeholders include customer satisfaction surveys, focus groups, public meetings and public hearings. Websites also make information such as dashboards available to stakeholders. Simulation and trend analysis are used in public meetings to explain capital needs and budget impacts. Annual and quarterly reports are used to report progress to key stakeholders. Customer feedback can be used to improve performance. For instance, Missouri DOT's Tracker program includes measures tracking the number and satisfaction of customers involved in public planning processes.
6. Setting Performance Targets One of the observable gaps in the transportation performance management literature is the lack of guidance for setting performance targets, or standards for performance measurement. To fill this gap, the sixth question asks DOTs if they set target performance levels and how they go about setting their targets. Thirty-one out of the 39 DOTs responded that they do set targets. This response rate is higher than for performance measures because many DOTs do not directly tie targets to the strategic planning process or performance measurement. Based on the survey, Table 3 shows the most common ways performance targets are set. Some agencies use multiple methods, or multiple inputs, for setting targets.
It is clear from the results that the majority of DOTs do not follow a scientific process in setting targets. Rather, funding opportunities and constraints play significant roles in determining how ambitious targets will be. The results from this question also reveal that methods for setting targets vary depending on the type of targets being set. For instance, Maryland DOT's overall outcome targets are established by senior leadership, while output targets are determined by program managers based on funding levels. Furthermore, benchmarks have been used as a target setting tool by several DOTs. Missouri DOT, for instance, prefers benchmarking to traditional performance measurement because it has improved the agency's performance relative to other regions. This preference is shared by Texas DOT, which focuses on continuous improvement towards a goal, such as zero fatalities, rather than setting an absolute standard.
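The contrast between an absolute standard and continuous improvement can be made concrete with a small sketch; the fatality counts and reduction rate below are hypothetical, not Texas or Missouri figures.

    # Two target-setting philosophies: a fixed absolute standard versus
    # continuous improvement (a set percentage reduction each period).
    current_fatalities = 950
    absolute_standard = 800    # fixed target, the same every year
    improvement_rate = 0.05    # 5% reduction per year, trending toward zero

    def continuous_improvement_target(current, rate):
        """Next-period target under a percent-reduction policy."""
        return current * (1 - rate)

    next_target = continuous_improvement_target(current_fatalities, improvement_rate)
    print(f"Absolute standard target: {absolute_standard}")
    print(f"Continuous-improvement target: {next_target:.0f}")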
7. Top Management and Performance Information The review of performance data by top management is important to help keep an organization on track with respect to strategic goals and to reflect necessary policy and strategic changes in a timely manner. For instance, Missouri DOT indicated that strategies and actions to improve performance are worked on and implemented continually to show improved results in the next period. Thirty-two DOTs responded that top management does review performance data. For the overwhelming majority of these, data are reviewed on a quarterly basis in meetings. However, these meetings may merely include informal reviews of performance information, regardless of whether it is tied to a strategic framework.
Annual, semi-annual and continuous reviews are also carried out in several DOTs. For instance, in Minnesota DOT, top staff convenes once a year (during the first quarter) to review performance data across the functional areas and make decisions about results. To manage the capital budget, DOT and District top staff meet once a year (3rd Quarter) to review the actual and predicted results of their four- and 10-year program against statewide performance targets for safety, smooth pavements, bridge preservation, and travel speeds. Each prepares a performance-based scenario that identifies total resource needs to meet performance targets, and a fiscally constrained scenario that identifies projects to be built with available revenues. In addition, managers at the division level receive updates of the performance data quarterly.
8. Asset Management Asset Management is seen as an important program area for state DOTs, as demonstrated by their objectives. This may be due to the increasingly constrained funding situation in transportation, which requires better management of assets to reduce costs in the long run. Twenty-seven state DOTs responded that they have an asset management program in place, while the rest are in the process of developing one or did not respond. Most state DOTs use their asset management programs for monitoring and determining the conditions of highways and
bridges. Other areas where it can be used are maintenance, traffic level of service, and safety. While many DOTs have asset management programs, almost all indicate that their programs are not integrated across divisions. For instance, Colorado DOT employs different programs for three different asset classes (pavement, bridges, and maintenance) and uses different software for managing each. Top managers allocate resources among the three areas based on their needs relative to performance targets.
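One simple way such needs-based allocation could work is to split the budget in proportion to each area's gap between current and target condition, as sketched below; the budget, condition scores, and targets are hypothetical rather than Colorado DOT figures.

    # Hypothetical needs-based allocation: budget shares proportional to
    # each asset area's shortfall against its performance target.
    budget = 100.0  # $M available this cycle

    # asset area: (current condition score, target condition score)
    areas = {
        "pavement":    (68.0, 80.0),
        "bridges":     (74.0, 85.0),
        "maintenance": (90.0, 92.0),
    }

    gaps = {name: max(0.0, target - current)
            for name, (current, target) in areas.items()}
    total_gap = sum(gaps.values())

    for name, gap in gaps.items():
        share = budget * gap / total_gap if total_gap else budget / len(areas)
        print(f"{name}: gap {gap:.1f} points -> ${share:.1f}M")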
It is important to note that most DOTs have realized that an integrated and unified asset management program is beneficial, and many have started developing such a program. The Vermont Agency of Transportation (VTrans) is one example of an agency with a well-developed asset management and performance-based program that shifts the agency's focus to preservation and maintenance by emphasizing preservation of existing assets rather than the construction of new highways (20). In addition, New Hampshire DOT, together with Vermont and Maine, has a tri-state, collaborative asset management program, demonstrating mature inter-jurisdictional cooperation.
SUMMARY AND CONCLUSIONS
This paper has explored current practices in performance measurement through 1) review of the literature on performance measurement in state DOTs and abroad, and 2) a survey on the current use of performance measurement, on setting targets and on asset management in DOTs.
From the literature review, it can be seen that performance measurement has a long history of use in state DOTs. In the last two decades, however, significant development has occurred, with movement through first, second and third generations of performance measurement systems. Today, performance measurement systems in transportation agencies are increasingly strategically focused and tied to the long-term goals of the organization. Performance measurement is also used in various program areas, such as asset management, and it is being used in other ways such as benchmarking.
Articulation of the relationship between strategic plans, transportation system plans, and performance measurement systems in general is needed (5). Recent efforts to better understand performance targets, however, suggest a promising future of development in the area of performance measurement.
Current DOT practices largely coincide with what would be expected based on the literature review. The survey results show that performance measurement is widely used among DOTs, and many agencies have successfully integrated their performance measurement practice with strategic planning. Several methods of organizing the performance measurement program are used in the US, but the study does not suggest that any one of these methods is best. Furthermore, benchmarking is observed to be an important method for setting performance targets, although target setting is still an informal process for some DOTs. Finally, the survey has shown that asset management is viewed as an important area by most DOTs, although more integrated systems are needed.
These results signify that progress has been made in performance measurement for transportation in the US. However, some significant challenges remain. For instance, target setting practices are less mature in the US than in other countries such as the UK. The international scan of practices provides some useful guidelines in this area (15). As agencies seek continued improvement, they can develop more systematic, data-driven targets which also account for stakeholder and public priorities. They can ensure that targets and performance
measures are closely linked to their strategic planning processes, and that they are integrated horizontally and vertically throughout the organization. On a strategic level, these developments can help transportation agencies be better prepared for the reauthorization of the federal surface transportation legislation, and agencies will experience benefits such as increased public transparency and accountability as they improve their performance measurement practices.
In the future, studies will be needed to follow up on the progress of strategic planning, performance measurement, target setting, and asset management in state DOTs. As methods vary, specific future research could include surveys and case studies to identify best practices that maximize the benefits of performance measurement relative to strategic goals.
ACKNOWLEDGEMENT
This paper was supported in part by the Georgia Department of Transportation under the research project "Best Practices in Selecting Performance Measures and Standards for Effective Asset Management." The authors remain exclusively responsible for the contents of this paper.
REFERENCES
(1) US Department of Transportation. Department of Transportation Strategic Plan: New Ideas for a Nation on the Move. www.dot.gov/stratplan2011/dotstrategicplan.pdf. Accessed January 22, 2010.
(2) Amekudzi, A. A., Meyer, M. D., and Pei, Y. L. Best practices in selecting performance measures and standards: A literature review. Report prepared for the Georgia Department of Transportation, Atlanta, Georgia, 2009.
(3) Bremmer, D., Cotton, K. C., & Hamilton, B. Emerging performance measurement responses to changing political pressures at state departments of transportation: Practitioners' perspective. Transportation Research Record No.1924, TRB, National Research Council, Washington, D.C., 2005, pp. 175-183.
(4) Poister, T. H. Performance measurement in state departments of transportation (NCHRP Synthesis 238). Washington, D.C.: Transportation Research Board, National Research Council, 1997.
(5) Poister, T. H. Performance measurement in transportation agencies: State of the practice. Handbook of Transportation Policy and Administration, 2007, pp.485-504.
(6) Reed, M. F., R. A. Luettich, L. P. Lamm, and T. F. Humphrey. Measuring State Transportation Program Performance (NCHRP Report 357). Transportation Research Board, National Research Council, Washington, D.C., 1993.
(7) National Research Council (U.S.). A Guidebook for Performance-Based Transportation Planning (NCHRP Report 446). National Academy Press, Washington, D.C., 2000.
(8) National Cooperative Highway Research Program (NCHRP), TransTech Management, Inc, & American Association of State Highway and Transportation Officials. Strategic performance measures for state Departments of Transportation: A handbook for CEOs and executives. American Association of State Highway and Transportation Officials, Washington, D.C., 2003.
(9) Poister, T. H. Strategic planning and decision making in state departments of transportation: A synthesis of highway practice (NCHRP Synthesis 326). Transportation Research Board. Washington, D.C., 2004.
(10) Cambridge Systematics (CS). Performance measures and targets for transportation asset management (NCHRP Report 551). Transportation Research Board. Washington, D.C., 2006.
(11) American Association of State Highway and Transportation Officials (AASHTO). Measuring Performance Among State DOTs. American Association of State Highway and Transportation Officials, Washington, D.C., 2006.
(12) MacDonald, Douglas B. Transportation Performance Measures in Australia, Canada, Japan, and New Zealand. U.S. Dept. of Transportation, Federal Highway Administration, Office of International Programs, Washington, D.C., 2004.
(13) Schmitt, R. Research Problem Statement: Setting Effective Performance Targets for Transportation Programs, Plans and Policy. Challenges of Data for Performance Measures Workshop, Transportation Research Board, San Diego, 2007, pp. 106-108.
(14) Marsden, G., and Bonsall, P. Performance targets in transport policy. Transport Policy, Vol. 13, No. 3, 2006, pp. 191-203.
(15) Federal Highway Administration (FHWA). Linking Transportation Performance and Accountability: Executive Summary. www.international.fhwa.dot.gov/pubs/pl10009/pl10009.pdf. Accessed March 27, 2010.
(16) Midwest Transportation Knowledge Network (MTKN). DOT State Stats. members.mtkn.org/measures/. Accessed March 27, 2010.
(17) Florida Department of Transportation (Florida DOT). Transportation Performance Reporting in Florida. www.dot.state.fl.us/planning/performance/. Accessed March 27, 2010.
(18) California Department of Transportation (Caltrans). Strategic Plan 2007-2012. www.dot.ca.gov/docs/StrategicPlan2007-2012.pdf. Accessed March 12, 2010.
(19) Louisiana DOTD. Five-Year Strategic Plan. Louisiana DOTD, 2007. (Obtained by authors by email.)
(20) Vermont Agency of Transportation (VTrans). Asset Management at the Vermont Agency of Transportation. www.aot.state.vt.us/Planning/Documents/Planning/Asset%20Management%20in%20Vermont%20Jan%2010.pdf. Accessed March 13, 2010.
LIST OF TABLES AND FIGURES
FIGURE 1: State DOTs that responded to survey
TABLE 1: DOT Goals and Objectives
TABLE 2: Major Functional Divisions within State DOTs
TABLE 3: How Performance Targets are Developed in DOTs
FIGURE 1: State DOTs that responded to survey (Alaska and Hawaii did not respond to the survey).
TABLE 1: DOT Goals and Objectives

Goals                                                            Tally
Transportation System Safety and Security                          26
Asset Management and Systems Preservation                          22
Transportation System Mobility                                     14
Employee and Organizational Development                            11
Customer Satisfaction                                              11
Economic Growth and Vitality                                       11
Environmental Quality and Sensitivity                              10
Transportation System Effectiveness and Efficiency                  7
Integrated and Multimodal Transportation System                     7
Agency Program Service Delivery                                     7
Better Freight Movement                                             6
Stewardship                                                         4
Public and Alternative Transportation Expansion and Improvement     4
System Preparedness, Security                                       4
Quality of Life                                                     4
Agency Accountability and Transparency                              4
Stakeholder Communication and Cooperation                           4
Modal Shift and Auto Trip Reduction                                 3
Agency Conservation and Business Efficiency                         3
Highway Expansion and Capacity Increase                             2
Agency Program Funding                                              2
Employee Innovation                                                 2
Land Use and/or Economic Development Connection                     2
Congestion Reduction                                                2
Accessibility                                                       2
Sustainability                                                      2
Cost Effective Products                                             2
Agency Leadership                                                   1
Agency Needs vs. Community Wants                                    1
TABLE 2: Major Functional Divisions within State DOTs

Functional Division                 Tally
Planning/Programming/Development       28
Operations                             21
Design/Engineering                     18
Administration                         17
Maintenance                            14
Finance                                11
Construction                           10
Public Transportation                  10
Aeronautics                             7
Safety                                  5
Motor Vehicles                          5
Program Delivery                        4
TABLE 3: How Performance Targets are Developed in DOTs

How Targets are Developed             Tally
Upper Management                          7
Program Manager                           6
Funding Levels                            5
Benchmarking                              3
Stakeholder Input                         3
Consensus                                 3
Historic Data and/or Past Experience      2
Customer or Public Input                  2
Internal Discussion                       2
Engineering Judgment                      2
Expert Panel                              2
Resource Management                       1
Alignment with National Goals             1
Engineering Analysis                      1
Generally Accepted Standards              1
Best Practices: Selecting Performance Measures and Targets for Effective Asset Management
Survey Results (State DOTs)
Yi Lin Pei, Graduate Research Assistant Adjo Amekudzi, Ph.D., Principal Investigator Michael Meyer, Ph.D., P.E., Co-Principal Investigator
Georgia Institute of Technology
April 28, 2010 (Revised)
Survey Purpose
To identify common approaches for selecting performance measures and targets in state transportation agencies.
Survey Methodology
Survey was conducted by telephone interviews and online questionnaire.
Survey took place from September 2009 to February 2010.
Planning and performance measurement departments were contacted.
Survey Response Rate
39 of the 50 state DOTs plus Washington, DC responded (76%).
*No responses from Alaska and Hawaii
Survey Results
Organizational Strategic Goals/Objectives
This question sought to find out whether an agency has a functional strategic plan upon which performance measurement can be based.
36 agencies have a strategic plan in place.
These plans are usually updated annually or biennially.
Several ways agency goals are organized:
- One-tier arrangement
- Multi-tier arrangement (sometimes employing specific models, such as the balanced scorecard)
- Program area specific (sometimes overlapping)
Examples of PM Models (1)
One tier (Virginia DOT)
- Provide a safe, secure and integrated transportation system that reflects the needs throughout the Commonwealth.
- Preserve and manage the existing transportation system through technology and more efficient operations.
- Facilitate the efficient movement of people and goods, expand travel choices, and improve interconnectivity of all transportation modes.
- Improve Virginia's economic vitality, environmental quality, quality of life, and facilitate the coordination of transportation, land use and economic development planning activities.
- Ensure that VDOT is continuously improving its financial accountability, business practices and workforce.
- Strengthen the culture of preparedness across state agencies, their employees and customers.
Examples of PM Models (2)
Multi-tier Balanced Scorecard (NH DOT)
Area 1: Employee development
1. To hire efficiently.
2. To optimize employee health and safety.
3. To develop and retain employees.
Area 2: Performance
1. To identify, collaborate and communicate with partners.
2. To optimize transportation strategies.
Area 3: Resource management
1. To effectively manage financial resources.
2. To effectively manage workforce.
3. To protect and enhance the environment.
Area 4: Customer satisfaction
1. To increase customer satisfaction.
2. To improve asset conditions.
3. To increase mobility.
4. To improve system safety and security.
Examples of PM Models (3)
Program Area Specific (NYSDOT)
Area 1: Highway and Bridge Infrastructure
Extend the service life of all highway and bridge-related assets, with priority given to the facilities that are the most critical links in the transportation system serving economic and community needs, through the application of both maintenance and capital investments.
Area 2: Public Transportation System
Ensure the efficient, safe and reliable movement of public transportation users through investments in core public transportation infrastructure, equipment and services which improve connectivity, accessibility, livability, sustainability and modal choice.
Area 3: Statewide Rail System
Subarea 1: High-Speed Intercity Passenger Rail Service
Maintain and improve safe, efficient and reliable intercity passenger rail service through strategic investments in core system infrastructure, including track, train control signals and passenger stations. Facilitate increased service, frequency, reliability and expanded High-Speed Passenger Rail Service.
Subarea 2: Freight Rail and Upstate Ports System
Extend the service life of essential rail and port facilities through public investments that promote asset preservation and the attainment of a State Of Good Repair infrastructure condition. Promote intermodalism, accessibility and mobility and support initiatives to improve service reliability. Improve rail and ports systems' energy efficiency, environmental sustainability and economic competitiveness.
Area 4: Aviation System
Extend the service life of essential aviation facilities through public investments that promote asset preservation and the attainment of State Of Good Repair infrastructure condition and ensure secure facilities. Promote economic development of commercial and general aviation airports and improve the connectivity of the overall transportation network.
Area 5: Multimodal Transportation Mobility
Enhance the movement of people and goods through improvements in system reliability, cost-effective congestion mitigation, network connectivity, accessibility and modal choice.
Area 6: Environmental Sustainability
Support a sustainable environment through improved energy efficiency in the transportation system and the protection and improvement of air and water quality.
Area 7: Multimodal Transportation Safety
Improve safety in all transportation modes, regardless of jurisdiction, to save lives, to reduce the number and severity of personal injuries and to prevent crashes.
DOT Goals and Objectives (1)
Goals                                                            Tally
Transportation System Safety and Security                          26
Asset Management and Systems Preservation                          22
Transportation System Mobility                                     14
Employee and Organizational Development                            11
Customer Satisfaction                                              11
Economic Growth and Vitality                                       11
Environmental Quality and Sensitivity                              10
Transportation System Effectiveness and Efficiency                  7
Integrated and Multimodal Transportation System                     7
Agency Program Service Delivery                                     7
Better Freight Movement                                             6
Stewardship                                                         4
Public and Alternative Transportation Expansion and Improvement     4
System Preparedness, Security                                       4
DOT Goals and Objectives (2)
Quality of Life                                      4
Agency Accountability and Transparency               4
Stakeholder Communication and Cooperation            4
Modal Shift and Auto Trip Reduction                  3
Agency Conservation and Business Efficiency          3
Highway Expansion and Capacity Increase              2
Agency Program Funding                               2
Employee Innovation                                  2
Land Use and/or Economic Development Connection      2
Congestion Reduction                                 2
Accessibility                                        2
Sustainability                                       2
Cost Effective Products                              2
Agency Leadership                                    1
Agency Needs vs. Community Wants                     1
Strategic Planning and Performance Measures
This question sought to find out the extent to which DOTs are using performance measures to monitor the progress of their strategic plan, and how the performance measures are structured.
23 DOTs reported having performance measures tied to strategic goals and objectives.
Most measures are organized in a multi-level structure.
Measures can vary from a few to a few hundred.
Oklahoma DOT has 12 measures in 5 goal areas. Maryland DOT has over 400 measures that reflect about 75 objectives.
Benchmarking is seen as an important way of measuring performance; e.g., Florida DOT's performance measures answer:
- How we report on what we are accomplishing
- How we are being measured by others
- How we measure ourselves on an ongoing basis
Several DOTs have well-integrated measurement systems, e.g., the LA DOTD Performance Indicator Matrix and the Missouri DOT Tracker.
Louisiana DOTD Example: Performance Indicator Matrix for a Particular Objective
Caltrans Strategic Planning Process (Performance-Based Planning)
Performance Measurement Review
This question sought to find out how often the performance measures are reviewed.
Out of the 23 DOTs responding that they had performance measurement:
- 13 responded that they review their measures annually
- 4 responded that they review them quarterly
- 3 responded that they review them biennially
- 2 responded that they review them semi-annually
- 1 reviews them when plans are updated
Role of Performance Measures in Divisions
This question sought to find out the extent to which performance measures are used in each division of the DOT.
Planning and Programming are the most important functions for DOTs.
More DOTs indicated having an operations division than a maintenance division.
DOT functional units are organized in one-tier or two-tier systems.
Performance measures are mostly used in management and planning, and not in all DOT functions.
Different models are used for different divisions.
Major Functional Divisions within state DOTs.
Functional Division                 Tally
Planning/Programming/Development       28
Operations                             21
Design/Engineering                     18
Administration                         17
Maintenance                            14
Finance                                11
Construction                           10
Public Transportation                  10
Aeronautics                             7
Safety                                  5
Motor Vehicles                          5
Program Delivery                        4
Performance Measures and Stakeholders
This question sought to find out the extent to which performance measures are used to engage external stakeholders.
30 DOTs do use performance measures to engage with stakeholders.
Key stakeholders are the public, legislature, governor and industries.
Common ways DOTs engage with stakeholders include:
- Customer satisfaction surveys
- Focus groups
- Public meetings and hearings
- Websites
Setting Performance Targets
This question sought to find out how agencies go about setting performance targets if they do set target levels.
31 DOTs do set performance targets.
The method for target setting varies depending on what the targets are. For example:
Maryland DOT's overall outcome targets are established by senior leadership while output targets are determined by program managers based on funding levels.
Benchmarking with other states, rather than setting absolute values, is seen as a good way to set targets.
Ways in which Performance Targets are developed in DOTs
How Targets are Developed             Tally
Upper Management                          7
Program Manager                           6
Funding Levels                            5
Benchmarking                              3
Stakeholder Input                         3
Consensus                                 3
Historic Data and/or Past Experience      2
Customer or Public Input                  2
Internal Discussion                       2
Engineering Judgment                      2
Expert Panel                              2
Resource Management                       1
Alignment with National Goals             1
Engineering Analysis                      1
Generally Accepted Standards              1
Top Management Performance Information
This question sought to find out how often performance data is reviewed by top management.
32 DOTs responded that top management reviews performance information.
The majority of performance information is reviewed quarterly, followed by annual, semi-annual and continuous reviews.
Different performance information can be reviewed at different frequencies.
Asset Management
This question sought to find out whether an agency has an asset management program and if so, to what extent the program is integrated throughout the organization.
27 DOTs responded that they have an asset management program in place.
Most programs are used to monitor condition of highways and bridges.
In most agencies, the AM program is not integrated throughout the whole department. E.g., Colorado DOT employs different asset management programs for three different asset classes (pavement, bridges, maintenance), with different software used to manage each.
Few agencies have a well integrated system where their whole planning process is focused on asset management.
Vermont Agency of Transportation (VTrans) has an asset-management-based program.
Vermont Agency of Transportation Budget Development Process (Asset performance based strategic plan)
Summary (1)
The majority of DOTs have a strategic plan in place (36 out of 39 respondents), and more than half of the responding DOTs (23 out of 39) reported having performance measures tied to strategic goals and objectives.
DOTs reported that strategic objectives are largely related to transportation system safety, system preservation and mobility. Agencies also reported to a lesser extent that employee and organizational development, customer satisfaction, economic growth and vitality and environmental quality are included in strategic objectives.
Over 30% (13) of the responding DOTs reported that they review their performance measures annually.
About 70% (28) of the responding agencies reported that performance measures are mostly used in management and planning, and not in all DOT functions. About half (21) of the responding DOTs reported that they use performance measures in operations, and slightly under half (18) in design and engineering.
Summary (2)
Over 75% (30) of the responding DOTs reported that they use performance measures to engage stakeholders. About 80% (31) of the responding DOTs reported that they set performance targets, developed largely by upper management and program managers, and also by benchmarking and consensus, considering funding levels and stakeholder input. About 80% (31) of the responding agencies reported that top management reviews performance information. About 70% (27) of the responding agencies reported that they have an asset management program in place with most programs used to monitor the condition of highways and bridges.
Best Practices in Selecting Performance Measures and Standards

SUMMARY OF BASELINE INTERVIEWS

Georgia Department of Transportation

Georgia Institute of Technology
Adjo Amekudzi, Ph.D.
Michael Meyer, Ph.D., P.E.
Yi Lin Pei
J.P. O'Har

September 20, 2009
Best Practices in Selecting Performance Measures & Standards
Summary of Baseline Interviews
Dr. Michael Meyer conducted interviews on the status of Transportation Asset Management (TAM) at Georgia Department of Transportation (GDOT) in March 2009. The objective was to assess the status of TAM at GDOT and also determine what top management considered as key issues for advancing TAM at GDOT. Dr. Meyer interviewed selected officials including Mr. David Crim and Mr. Steve Henry. Below is a general summary of the results of the interviews.
In general, several officials felt that GDOT has a very good asset management program, although not one that meets a comprehensive definition of asset management. Their sense was that each office has good data and uses it to prioritize needs, but most of the asset management efforts are office-specific. For example, the maintenance office is responsible for the pavement management system, signs and markings, etc.; the bridge office is responsible for the bridge management system; and traffic operations is responsible for traffic signals.
The GDOT Brief Book was not really considered an internal document but rather something developed for outside stakeholders. There were no suggestions to improve the Book, nor suggestions of other performance measures that might be useful as part of the GDOT program.
GDOT's pavement management system was used to prioritize the pavement projects that were part of the economic stimulus package. The bridge management system was not used as much because of the need to have ready-to-go projects.
Those interviewed are generally interested in obtaining a 100% condition database as cost-effectively as possible. They mentioned the work that Dr. James Tsai is doing for them using video imagery for condition assessment. They also emphasized that it is important to take a ROW-to-ROW asset management perspective. Some members of management felt very strongly that good asset management can only be done with a full universe of data, not sample data.
Although there was an understanding of the potential role for asset management in GDOT, it was not clear to those interviewed what steps would be necessary to achieve a more comprehensive approach, if such an approach were desired.
In response to a question on the linkage between safety and other management systems, interviewees explained that crash statistics are used in combination with PMS and BMS information to prioritize projects.
There was interest in knowing what other states are doing in asset management, but also a feeling that what works in one state will not necessarily work in another.
GDOT's management felt that the most important thing in Georgia is to get funding flowing once again. Once funding is flowing, GDOT will be able to prioritize investments quite well.
An Inventory of Asset Management Tools at the Georgia Department of Transportation
Prepared for:
Georgia Department of Transportation
Prepared by: John Patrick O'Har, Graduate Research Assistant
Best Practices in Selecting Performance Measures and Standards for Effective Asset Management
Adjo Amekudzi, Ph.D. (PI)/Michael Meyer, Ph.D., P.E. (Co-PI) School of Civil & Environmental Engineering, Georgia Institute of Technology
June 15, 2009 (Revised)
Table of Contents
Introduction
AM Tools Currently Available at GDOT
New Directions for AM at GDOT
Introduction
1991 Congress passes the Intermodal Surface Transportation Efficiency Act (ISTEA)
ISTEA mandated state transportation agencies to establish six infrastructure management systems for:
Pavements, Bridges, Safety, Congestion, Public transportation, Intermodal facilities
Congress failed to provide funding for these mandated infrastructure management systems
Mandate repealed in 1995; some states had already begun developing the infrastructure management systems and continued to use them
Introduction (2)
1996 AASHTO and FHWA co-sponsor a workshop in D.C. "Advancing the State of the Art into the 21st Century Through Public-Private Dialogue"
Representatives from Chrysler, Wal-Mart, GTE, Conrail, and public utilities
Principles, practices, and tools of good AM that existed in private organizations could also apply to public organizations
1997 2nd workshop at Rensselaer Polytechnic Institute's Center for Infrastructure and Transportation Studies
Practices, processes, and tools of AM as they apply to state DOTs further examined
1999 During a reorganization effort, FHWA creates the Office of Asset Management
Introduction (3)
1999 Governmental Accounting Standards Board (GASB) issues Statement No. 34
GASB 34 requires government agencies to report their capital assets using a historical cost and depreciation approach OR using a modified approach
Modified approach requires government agencies to use some sort of asset management process
1999 National Conference on TAM in Scottsdale, Arizona
Peer exchange between state DOTs
2001 4th Conference in Madison, WI
"Taking the Next Step"
Introduction (4)
2003 5th Conference in Atlanta and Seattle
"Moving From Theory to Practice"
2005 6th Conference in Kansas City
"Making Asset Management Work in Your Organization"
2007 7th Conference in New Orleans
"New Directions in Asset Management and Economic Analysis"
2009 8th National Conference on TAM in Portland from October 19-21
"Putting the Asset Management Pieces Together"
Introduction (5)
AASHTO Standing Committee on Asset Management definition of TAM:
"A strategic and systematic process of operating, maintaining, upgrading, and expanding physical assets effectively throughout their lifecycle. It focuses on business and engineering practices for resource allocation and utilization, with the objective of better decision making based upon quality information and well defined objectives"
AM Tools at GDOT
Tool: Highway Maintenance Management System (HMMS)
What does the tool do? Allows GDOT to track the daily work of maintenance crews throughout the state; assimilates outstanding work on roads from inspections; allows the department to develop a work program for tracking equipment, labor, and material costs
What data does this tool use? Biannual drainage reports, condition assessments of pipe, locations of signs and pipes (coordinate info), and data from inspections (guardrail, pavement, vegetation, etc.; no coordinate info)
Which unit(s) use this tool? Maintenance managers throughout the area and district maintenance offices
How are the results used? To develop an annual needs-based budget and an annual work program; determine the condition of pipe systems; compare actual and estimated costs with budget office costs
AM Tools at GDOT (2)
Tool: Pavement Condition Evaluation System (PACES)
What does the tool do? A pavement condition assessment survey that rates every mile of every road each year
What data does this tool use? Condition evaluations of roadway (asphalt and concrete)
Which unit(s) use this tool? Area and district maintenance offices; Office of Materials and Research; data output from this tool feeds into the Georgia Pavement Management System (GPMS)
How are the results used? To determine the overall condition of roadway; determine what work needs to be done (i.e., crack sealing, resurfacing); predict the future condition of roadway (i.e., LOS of roadway) with available funds; determine the cost of the work that needs to be done
AM Tools at GDOT (3)
Tool: Pipe Inventory
What does the tool do? A module of the HMMS; provides a condition assessment of pipe
What data does this tool use? Data from physical inspections of pipe, tracked with a coordinate system
Which unit(s) use this tool? Area and district maintenance offices
How are the results used? To determine what work needs to be done on each line of pipe
AM Tools at GDOT (4)
Tool: Highway Performance Monitoring System (HPMS)
What does the tool do? Mandated by the FHWA to provide the department's road inventory data; sample-based system consisting of 98 data items; provides a variety of data (roughness data, traffic data, AADT, etc.)
What data does this tool use? Some of the data used include performance data, traffic counts, percent trucks, physical road data (i.e., number of lanes), etc.
Which unit(s) use this tool? Not used much within GDOT; the department has its own road inventory database
How are the results used? Used by the federal government in allocating funds; other data items from this tool are used within the department
AM Tools at GDOT (5)
Tool: Life Cycle Cost Analysis Tool (LCCA)
What does the tool do? Gives a comparison of life cycle costs for different pavement types
What data does this tool use? Quantities of materials, length of a project, unit costs, maintenance costs, time frames
Which unit(s) use this tool? Pavement management branch
How are the results used? Making decisions on pavement type; deciding between reconstruction and rehabilitation
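For illustration, the comparison the LCCA tool performs can be sketched as a simple present-worth calculation in Python. This is a minimal sketch only: the discount rate, costs, and maintenance schedules below are assumed values, not GDOT inputs.

# Minimal life-cycle cost comparison for two pavement alternatives.
# All costs, timings, and the discount rate are illustrative assumptions.

def present_worth(cost, year, discount_rate=0.04):
    """Discount a future cost to present value."""
    return cost / (1.0 + discount_rate) ** year

def life_cycle_cost(initial_cost, maintenance_schedule, discount_rate=0.04):
    """initial_cost in dollars; maintenance_schedule is a list of (year, cost) pairs."""
    return initial_cost + sum(
        present_worth(cost, year, discount_rate) for year, cost in maintenance_schedule
    )

# Hypothetical 40-year analysis for a one-mile project.
asphalt = life_cycle_cost(1_200_000, [(10, 250_000), (20, 250_000), (30, 250_000)])
concrete = life_cycle_cost(1_800_000, [(20, 150_000)])
print(f"Asphalt LCC:  ${asphalt:,.0f}")
print(f"Concrete LCC: ${concrete:,.0f}")

The alternative with the lower total present worth would be favored, all else being equal.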
AM Tools at GDOT (6)
Tool: Bridge Information Management System (BIMS)
What does the tool do? Collects input data from bridge inspections; allows the department to retrieve certain information without going through paperwork; separate from the federally required National Bridge Inventory (NBI); collects more data than the federal government requires
What data does this tool use? Bridge serial number, location (latitude and longitude), rating system (0 to 9), sufficiency data (federal requirement); bridge inspection data (bridges inspected every 2 years; data gets reviewed and entered into a master database; data from previous years archived)
Which unit(s) use this tool? Bridge maintenance unit, Office of Transportation Data, upper management (for planning)
How are the results used? Federal reporting requirements for the NBI; generating deficiency reports; input data for HMMS; determining necessary repairs; routing (vertical clearance and load requirements for oversize/overweight loads); budgeting and funding decisions
AM Tools at GDOT (7)
Tool: Benefit/Cost Tool (B/C)
What does the tool do? Part of the project prioritization process; assigns projects a score
What data does this tool use? Overall cost of a project (design, construction, etc.); benefits (time savings through a corridor, fuel cost); safety benefits; dollar values based on national average values (commercial vs. non-commercial)
Which unit(s) use this tool? Planning office, preconstruction office, and traffic operations office
How are the results used? One piece of the decision-making process; not everything is based on the B/C ratio
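The scoring step can likewise be sketched in Python. The benefit categories mirror those listed above (time savings, fuel, safety), but the function name, unit values, and project figures are illustrative assumptions, not GDOT's national-average inputs.

# Minimal benefit/cost scoring sketch; all dollar values are hypothetical.

def benefit_cost_ratio(project):
    """Sum annual monetized benefits and divide by annualized cost."""
    benefits = (
        project["annual_hours_saved"] * project["value_of_time"]         # time savings
        + project["annual_gallons_saved"] * project["fuel_price"]        # fuel cost
        + project["annual_crashes_avoided"] * project["cost_per_crash"]  # safety
    )
    return benefits / project["annualized_cost"]

corridor_project = {
    "annual_hours_saved": 180_000,
    "value_of_time": 18.0,           # $/hour, hypothetical average
    "annual_gallons_saved": 90_000,
    "fuel_price": 3.50,              # $/gallon, hypothetical
    "annual_crashes_avoided": 4,
    "cost_per_crash": 150_000,       # $, hypothetical
    "annualized_cost": 2_500_000,    # $/year
}
print(f"B/C score: {benefit_cost_ratio(corridor_project):.2f}")

As noted above, the resulting score is only one piece of the decision-making process.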
AM Tools at GDOT (8)
Signal System
An inventory of signals is maintained, but the current inventory is not very accurate
The IT Department had a program called Remedy, designed to advise the department about upgrades and provide a responsive and preventive maintenance program; the program was not completed
The department is in the process of upgrading the database of controllers to a new platform (Siemens 2070); 6,000 of 8,000 controllers have been upgraded
Signals are maintained by individual districts, many of which maintain individual databases
Databases are strictly route- and intersection-identifiable (no coordinate data) and only include signals on the state route system
AM Tools at GDOT (9)
Intermodal
No comprehensive tools or databases for intermodal assets; no financial resources available
Multimodal Transportation Planning Tool (MTPT)
Developed for the department at one point; currently not in use
Office of General Accounting
Some tools, primarily software, are used to meet the requirements of the modified approach of GASB 34
Currently a homegrown tool is used to manage infrastructure assets; the agency is in the process of implementing fixed asset management software (purchased the Asset 4000 Suite from RAMI)
The department has special needs: 1,000 active projects that are constantly growing and changing, a large volume of information, data integration issues, and software limitations
When to capitalize?
AM Tools at GDOT (10)
Enterprise GIS Database
The Enterprise GIS Manager is in the process of creating an enterprise GIS database
The enterprise GIS data architecture contains a new server and a new storage method; new hardware in place by end of June, then data will begin to move onto the servers
108 data sets in the database (AADT, crash locations, fatalities, traffic counts, etc.)
Many of the data sets are generated through scripts from the business databases
Current database uses Oracle software and an ESRI spatial database connector
200 users connecting on a regular basis; 17,000 users connecting on the web each month
In the future all GIS data could be published as a single KML file so it could be accessed by open-source software
Currently GDOT GIS data is accessible to the public through the TREX application (not showing all layers)
New technology in development with IT using an ArcGIS server; would allow someone with no GIS knowledge to mark up a map and export it as GIS data (i.e., inspection crews)
New Directions for AM at GDOT
During the inventory survey employees made suggestions/comments regarding future possibilities of AM at the agency
How to relate data from current inspections to the overall condition of the roads?
Establish performance criteria for acceptable road conditions
Maintain an accurate inventory of GDOT's roads
Data integration
Establish boundaries of an AM program
Need a champion
Disconnect between inventorying and condition rating of physical infrastructure assets and the GASB 34 standard
Asset Management Best Practices/Lessons Learned Utah/Indiana/Georgia Peer Exchange/Scan
FINAL REPORT
Submitted to: Georgia Department of Transportation Georgene Geary, georgene.geary@dot.ga.gov
Submitted by: Georgia Institute of Technology Adjo Amekudzi, Ph.D., adjo.amekudzi@ce.gatech.edu
October 2009
2009 Asset Management Peer Exchange: UDOT-INDOT-GDOT
DRAFT FINAL REPORT
INTRODUCTION
This report summarizes the highlights of the Asset Management Best Practices/Lessons Learned Utah-Indiana-Georgia Peer Exchange/Scan, held from August 24 to 26, 2009. The purpose of the Peer Exchange was to provide an opportunity for these states to share best practices and lessons learned from their respective efforts to institute working asset management programs, policies, and procedures. The objective was for each participating state to gain practical information to help it implement the next steps in a maturing Asset Management program. The Peer Exchange was facilitated by the Federal Highway Administration (FHWA), and officials from the Georgia Department of Transportation (GDOT) and the Indiana Department of Transportation (INDOT) were hosted by the Utah Department of Transportation (UDOT).
The Peer Exchange included the following participants:
UTAH DOT
Ahmed Jaber, Director of Systems Planning & Programming
Tim Rose, Director of Asset Management
Bill Lawrence, Director of Finance
Austin Baysinger, Asset Modeling Engineer
Gary Kuhl, Pavement Management Engineer
Kevin Nichol, Planning Statistics Engineer
Russ Scovil, Pavement Condition Engineer

INDIANA DOT
Brad Steckler, Director of Program Engineering
Dwane Myers, Greenfield District Planning Director

GEORGIA DOT
Georgene Geary, State Materials and Research Engineer
Jane Smith, State Transportation Data Administrator
Mike Clements, State Bridge Maintenance Engineer
Eric Pitts, Assistant State Maintenance Engineer

FHWA
Brian Cawley, Utah ADA
Paul Ziman, Utah Area Engineer
David Unkefer, Indiana Division Engineering Services Team Leader
Dan Keefer, Indiana Division Asset Management Program Manager
Dana Robbins, Georgia Division Technology Applications Team Leader
Francine Shaw-Whitson, Headquarters Asset Management Office, Evaluation and Economic Investment Team Leader

GEORGIA INSTITUTE OF TECHNOLOGY
Adjo Amekudzi, Associate Professor, Transportation Systems Program
"We get a lot of projects done. We spend a lot of money. But we are not sure we are getting the best value on the dollar." - State DOT Upper-level Manager
Status of Current Asset Management Programs and Next Steps for Deployment
INDOT, GDOT and FHWA Participants
FHWA Utah Office / 8-24-09 (1-2:30 PM)
UDOT
FHWA Utah participants gave an overview of UDOT's Asset Management (AM) program, highlighting UDOT's streamlined strategic goals and performance measures, and explaining that all work plans that funnel up through each department must align with one of these goals. The agency put in a lot of effort and time to simplify their original list of goals to four final goals. UDOT's final four goals are:
1. Take care of what we have 2. Make the system work better 3. Improve safety 4. Increase capacity (www.udot.utah.gov/main/)
FHWA explained that although the Utah Division Office has worked with UDOT to align their programs to allow them to qualify for FHWA funding, UDOT is the driving force behind their AM program. He pointed to a positive response from the Utah State Legislature indicating that UDOT receives $800-900 million per year from their Legislature for highway funding. He emphasized that the drive must come from the DOT leadership. The FHWA puts in about $200 million annually toward highway funding. UDOT has bonding authority to move projects forward. UDOT is currently doing a significant amount of capacity expansion using state funds. Federal funds are going largely toward preservation. The American Recovery and Reinvestment Act (ARRA) has moved forward a backlog of preservation projects. A lot of Utah State funds cannot be applied towards preservation projects. UDOT is working with the State Legislature to elevate the importance of preservation projects, particularly because there is a wave of bridges that are coming due for preservation. FHWA Headquarters explained that several DOT experiences indicate that State Legislatures are more sensitive to the needs and priorities of DOTs when they understand how their decisions affect the State DOT program. FHWA emphasized that it is in the best interest of DOTs to educate their Legislatures on Asset Management. UDOT does a lot of marketing to their state legislators through an annual report that is particularly tailored to these stakeholders. The agency is also transparent to the public, and makes most of their material, including change orders, freely available on the Web. UDOT goes through a project selection process based on engineering, environmental and socioeconomic criteria. The Transportation Commission approves the projects.
"Asset Management is a continuous journey. It does not end. It is always about improving what you have."
-UDOT Director of Systems Planning and Programming
INDOT
An INDOT official stated that many state DOTs have been struggling to find a better way to make investment decisions because resources are drying up and DOTs have to figure out a more efficient way of doing things. INDOT has developed a vision of where they want to be. They do Asset Management and want to take it to the next step by making it more structured and quantitative. INDOT uses HERS-ST to calculate the economic impacts of projects that have been selected, and would like to refine its use for project selection. They use several analytical tools that are not integrated. There is a lot of data collected, and there needs to be QC/QA (i.e., quality control/quality assurance) on the data, as well as better reliability in the use of the data. INDOT found it extremely helpful to go through the Transportation Asset Management Self-Assessment Survey with FHWA; therefore, they have a good idea of their status and where they want to be. INDOT acknowledged that one of the challenges to executing an organized Transportation Asset Management program is the organizational structure of the agency. INDOT has a strong bridge inspection program. Purdue University has been involved in developing code for the bridge management software, dTIMS (Deighton Total Infrastructure Management System), which is currently being tested. INDOT has an FWD (i.e., falling weight deflectometer) program, and pavement condition data (rutting, IRI) is collected by a contractor using video. INDOT started down the road of good asset management in 2005: they had a maintenance section and a pavement section, and that year they started a systems section. They emphasized that automation is important. They also emphasized the importance of getting a leader who will champion Transportation Asset Management in order for it to get established. At the same time, there must be a simultaneous building of the culture and structure that will continue to work beyond this champion. The staff needs to be able to demonstrate money savings and demonstrate that the system is getting better over time. FHWA Headquarters indicated that one way to sell Transportation Asset Management is to tie it to the pending Highway Bill: Asset Management will be required; it is just not clear what form it will take.
GDOT
GDOT discussed their pavement management system, the Georgia Pavement Management System (GPAMS), and their Bridge Information Management System (BIMS). They explained that they have several systems, though there is no integrated system for a comprehensive asset management process. Condition data is collected for every state road in Georgia by a team of maintenance engineers who rate the same roads every year. There are 18,100 centerline miles of state roads in GA. Three independent visual inspections are done for projects that are recommended for treatment. The visual inspections are heavily resource-intensive. There is a friction program. Cores are sometimes done after projects have been selected for preventative maintenance.
IRI is not used because it is not sensitive enough to pavement deterioration, especially where the pavement has a low IRI. All inspections are fed into the Maintenance Management System, which is used to build budgets for the work that needs to be done. There is currently a lot of deferred maintenance. GDOT aims to maintain 85% or more of all pavements at or above a PACES rating of 70.
Performance prediction models have been developed for each roadway segment. The system supports resource allocation decisions. GA is required by law to allocate resources equitably across its congressional districts. There are 13 congressional districts and seven geographic districts. GPAMS ranks projects based on several criteria and prioritizes across and within congressional and geographic districts. Other maintenance work is prioritized first by safety, and then by other criteria. BIMS is a good bridge inventory system, but there are no procedures for prioritizing bridge work. There is a GIS viewer (TREX) that displays project data. There is much data and a desire for more. However, there is duplication in data collection efforts, and definitions and terminology are not similar across the different departments. Data integration has yet to occur. Access to data in different departments can be difficult. There needs to be a business data plan, and it needs to be top-driven. There needs to be leadership in this area.
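A minimal sketch of this within- and across-district prioritization is shown below. The scores and the round-robin draw used to spread selections equitably across districts are illustrative assumptions, not GPAMS's actual criteria or allocation procedure.

# Sketch of ranking projects within districts, then drawing across
# districts so resources are spread equitably. Illustrative only.
from collections import defaultdict

projects = [
    {"id": "P1", "district": 1, "score": 82}, {"id": "P2", "district": 1, "score": 74},
    {"id": "P3", "district": 2, "score": 91}, {"id": "P4", "district": 2, "score": 60},
    {"id": "P5", "district": 3, "score": 77},
]

# Rank within each district first...
by_district = defaultdict(list)
for p in sorted(projects, key=lambda p: p["score"], reverse=True):
    by_district[p["district"]].append(p)

# ...then take one project per district per round.
program = []
while any(by_district.values()):
    for district in sorted(by_district):
        if by_district[district]:
            program.append(by_district[district].pop(0))

print([p["id"] for p in program])  # ['P1', 'P3', 'P5', 'P2', 'P4']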
FHWA Headquarters pointed out that most states have pavement management systems and bridge management systems, but are not using them for resource allocation.
Funding, Budgeting and Finance Issues Meeting (Highlights)
Bill Lawrence, UDOT Director of Finance
UDOT / 8-24-09 (3-4 PM)
UDOT explained their budgeting and financing process. Budgeting is done annually. Budget allocations for the current year were done by matching percentages to the previous year's budget.
UDOT has a maintenance program (orange book program), rehabilitation program (purple book program) and reconstruction program (blue book program).
UDOT uses dTIMS as a program development tool and goes in to the Legislature with a defensible budget.
The original Design-Build contract on I-15 had an asset management element in it, which was key to moving Asset Management forward at UDOT.
The report "Good Roads Cost Less" helped to transition UDOT culture from worst-first to preservation strategy.
UDOT's resource allocation occurs within 9 operations and safety programs and not across the programs. The asset management program is largely focused on the pavement preservation program. A bridge preservation program will be added this year.
"The Self Assessment Tool was tailored to be more applicable to UDOT." -UDOT Director of Systems Planning and Programming
Leadership, Political, Organizational and Institutional Issues (Highlights)
Mr. Ahmad Jaber, UDOT Director of Systems Planning & Programming
UDOT / 8-25-09 (8-9 AM)
Asset Management is a continuous journey. It does not end. It is always about improving what you have.
In the mid-90s, new leadership began to look at changing the culture in the Department.
o They looked at project management.
o They looked at moving from a heavily centralized to a decentralized operation. (There were areas where it was felt that decentralization would not be efficient, e.g., ROW and structures, which have too few people.)
o The I-15 Design-Build project was seen as an opportunity to implement asset management. There was interest in changing the way business was managed.
o In ~2001, senior leadership decided to have a workshop on Asset Management. They had a 2-day workshop on the current status of Asset Management in the agency and where to go. They had an opportunity to review the TAM Guide and fill out the TAM Self-Assessment Tool to determine where they were and where they would like to go. The Self-Assessment Tool was tailored to be more applicable to the Department. A consultant was hired to facilitate the workshop. Asset Management helped the organization to understand where they were (at the time) and where they needed to go.
o The strategic plan was created prior to Asset Management at UDOT, and Asset Management became the tool to implement the strategic plan.
The Strategic Plan is shared with the Legislature every year. UDOT presents the Strategic Plan to the Legislature (transportation interim committee) every year.
UDOT educates the Legislature and staff on various issues using Asset Management. For example, with the prevailing budget crunch, UDOT educated both the Legislature and staff on the potential impacts of the budget shortfall. Level 2 roads were not programmed for improvements because of lack of funds. (Level 1 roads have AADT > 2,000 and/or truck AADT > 500, while Level 2 roads have AADT < 2,000 and truck AADT < 500.) UDOT chose to concentrate their resources on 96% of the VMT. Asset Management is used to educate the Transportation Commission. UDOT officials make recommendations, and the Transportation Commission then decides on the policy for the available funding.
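The road-level classification in the parenthetical above reduces to a simple threshold test, sketched below; the function and argument names are illustrative, not UDOT's.

# Road-level classification per the AADT thresholds described above.

def road_level(aadt: int, truck_aadt: int) -> int:
    """Level 1: AADT > 2,000 and/or truck AADT > 500; otherwise Level 2."""
    return 1 if aadt > 2000 or truck_aadt > 500 else 2

assert road_level(aadt=5600, truck_aadt=300) == 1  # high-volume road
assert road_level(aadt=1200, truck_aadt=700) == 1  # heavy truck traffic
assert road_level(aadt=900, truck_aadt=150) == 2   # low-volume road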
UDOT has developed an Asset Management Implementation Plan, a roadmap for implementing Asset Management. The implementation plan was developed by the executive leaders of the Headquarters and all four regions.
UDOT has performance measures on the Web.
UDOT uses the dTIMS Asset Management model to develop their pavement and bridge preservation plans.
The tension between DOT Headquarters and districts is reduced by having staff who have worked in both places.
Last year, UDOT decided to hire a new vendor (Fugro-Roadware) to collect some of their pavement management data because they were having problems with data quality with the old vendor.
UDOT Asset Management Overview (Highlights)
Tim Rose, UDOT Director of Asset Management
8-26-09 (9-10 AM)
UDOT has integrated all their systems except the Maintenance Management System, as shown in Figure 1. As they retired their legacy systems, they made sure that their new systems fit into a common framework.
The integrated system dTIMS gives the following:
o A bridge preservation plan
o A system (i.e., pavement) preservation plan
o A system-wide preservation plan for pavements and bridges (statewide prioritized 20-year plan)
UDOT is in the process of determining deterioration curves for culverts.
Data collection is part in-house, part contracted out.
UDOT is working to add a structural number to their pavement model.
Systems Planning and Programming: There has not been as strong a push to get cross-asset tradeoff analysis going because most of the money is allocated to various programs, e.g., preservation versus capacity.
UDOT tries to make everything transparent.
The executive director has been in the position for 7 years.
Figure 1: Integration of UDOT's Infrastructure Management Systems (Courtesy of Utah DOT). [Diagram: within the Division of Asset Management, data sources (skid, FWD, and profile data from vendor data collection; bridge condition; maintenance section info; functional class and AADT; UHP accident reports) feed the pavement condition database (PCS), bridge condition database (PONTIS), plan-for-every-section database (PFES), HPMS database, and safety management system database (SMS). These feed the asset management system database (dTIMS), which produces the bridge, pavement, and system preservation/rehabilitation plans, reports condition indices, AADT, skid, and functional class by maintenance section, and links to the maintenance stations' operation management systems (OMS).]
Asset Management is a business decision championed by top leadership. It deals with questions such as the following: Is it the right business approach? Does it serve us as an agency? Is this the best decision for the agency? Does it help us to serve customer needs at lower costs?
The "Good Roads Cost Less," philosophy, developed in the 70s, still drives most of what UDOT does.
To implement Asset Management, you have to get buy-in from the bottom and the top, and from the regions or districts. One cannot successfully implement Asset Management without understanding and addressing how financial decisions are made and the control that different individuals have. Ultimately there must be a clear command structure for decisions to be made effectively.
UDOT started using dTIMS in 2002-2003 and only started seeing the benefits really about two years ago.
UDOT Changes
o Some painful changes were necessary.
o These changes included the combination of construction and maintenance staff into one business unit.
o The EPM (i.e., Electronic Program Management) System began in 1989. The first several years were difficult.
o Asset Management is marketed to the public using the strategic plan and executive dashboard systems.
o The key is to start with what you have.
o The UDOT mission is "connected communities."
Debriefing
UDOT, GDOT, INDOT, FHWA
August 26, 2009
GDOT Research Data
GDOT keeps records on every public road in Georgia (117,237 centerline miles). The basic road data feeds every application. There are 17,240 locations around Georgia for traffic counts. The intent is to bring in bridges and railroad crossing data. Every road in Georgia has an associated AADT (i.e., Average Annual Daily Traffic). GeorgiaSTARS is a traffic program produced for GA which makes traffic count data available to everyone.
In the past two years, GDOT has been able to implement a traffic polling system obtained from FDOT. South Carolina is using the same polling system.
A QC/QA program for traffic has been instituted using FHWA's 10 rules.
Work is being done to get all the Road Characteristics (RC) data onto relatable linear referencing systems. GPAMS gets a once-a-year dump from the RC file.
Lessons Learned
Working with other states tends to be people-dependent. When people leave, you have to start over.
There is duplication of efforts in collecting and maintaining data. Steps are being taken by individual offices to eliminate some of the duplication, but there is no Department-wide plan.
There are multiple opportunities for IT to work better with various divisions within the department. IT applications should be driven by business functions instead of technology.
INDOT
INDOT has a partnership with Purdue University, which it pays to do research on various topics. Professor Labi and a couple of students were commissioned to do research on cross-asset tradeoffs. The research is about prioritizing projects once they are selected, in order to get the greatest benefits. The research developed a menu of ways in which INDOT can maximize benefits. Every project has to be converted into a common measure. Each project has eight attributes that can be weighted in importance. The research generates an ordered list of projects. The research report is available on the Purdue University transportation research website.
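The conversion of each project to a common measure can be sketched as a weighted sum over normalized attributes, as below. The attribute names and weights are hypothetical stand-ins: the Purdue research used eight weighted attributes, which are not enumerated here.

# Weighted multi-attribute ordering sketch; attributes and weights are
# hypothetical stand-ins for the eight used in the Purdue research.

def common_measure(project, weights):
    """Collapse attribute values (already normalized to 0-1) into one score."""
    return sum(weights[name] * value for name, value in project["attributes"].items())

weights = {"condition": 0.30, "safety": 0.25, "mobility": 0.20,
           "cost_effectiveness": 0.15, "economic_impact": 0.10}

projects = [
    {"name": "Bridge rehabilitation", "attributes": {"condition": 0.9, "safety": 0.6,
        "mobility": 0.3, "cost_effectiveness": 0.7, "economic_impact": 0.4}},
    {"name": "Interchange upgrade", "attributes": {"condition": 0.4, "safety": 0.8,
        "mobility": 0.9, "cost_effectiveness": 0.5, "economic_impact": 0.8}},
]

# Generate the ordered list of projects, highest common measure first.
for p in sorted(projects, key=lambda p: common_measure(p, weights), reverse=True):
    print(f"{common_measure(p, weights):.2f}  {p['name']}")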
Asset Management involves systematically identifying and prioritizing the best opportunities for improving agency practice, and implementing these improvements.
FHWA (Resources Available)
Engineering-Economic Analysis Tools available through FHWA:
HERS-ST Workshop (can include an Executive Overview); New Mexico is using HERS-ST in their long-range planning; Oregon is using HERS-ST in their long-range planning and TIP design
REALCOST (Life Cycle Cost Analysis Software): project-based analysis; LCC for 8 alternatives of a project
BCA.NET: web-based tool that allows one to look at different benefits and costs of projects
Economic Analysis for Decision Makers: helps to identify where one can apply economic analysis in one's planning and programming processes
(www.fhwa.dot.gov/infrastructure/asstmgmt/invest.cfm)
Summary Remarks
UDOT, GDOT, INDOT, FHWA
8/26/09
Strategic
Identify preservation categories. For example, UDOT has identified Interstate, Level 1, and Level 2 roads to prioritize investments based on AADT, truck traffic, and VMT.
Simple strategic goals are easier to remember and apply throughout the agency.
There is a need for champions to change the culture, and these changes must be institutionalized so that as people move on the system can continue.
Using a third party to facilitate change has been found effective.
Holding people accountable for measures is good practice.
Asset Management is a journey: decide what you want to accomplish, looking at both technology and organization.
The philosophy "Good roads cost less" has been found to be an effective one for building a culture of asset management.
Developing trust with regions and districts is going to be central to implementing an effective asset management system.
Have performance-based systems in place to meet any funding requirements that come with reauthorization.
Having formalized processes in writing is necessary to maintain continuity as people move on to other positions and agencies.
With an effective asset management system, you can demonstrate cost-effectiveness.
Transparency of processes is important. However, it is also important to let the public know that plans are fluid documents.
The Transportation Asset Management (TAM) system must be integrated across all classes of work.
Identify what the opportunity costs are: the dollars saved by doing something and the dollars saved by not doing something. Measure your savings and use them as a guide in the progressive implementation of TAM.
Get public/customer input into plans, ideas, etc.
To avoid perverse incentives and negative outcomes, include a measure for cost-effectiveness when distributing resources among different districts, and reward cost-effectiveness.
Tactical
Be realistic about what you can deliver.
Models are only one part of a TAM program. They are important, but only one part.
Asset Management tools are not black boxes. It is important to document, keep track of, and be able to explain what you are doing with your tools.
Ensure that confidence levels in data and models are good.
In models that identify a menu of treatments, classes or categories of treatments may work better than detailed treatments because of the type of data being fed into the models.
Georgia
Bridges are data-rich; however, there is a need to identify procedures to program bridge work.
It is important to look at data to understand ways in which duplication can be eliminated. There is also a need to identify areas where efficiencies can be gained, for example, by coordinating efforts among different business units.
Complete the self-assessment survey (tailor the self-assessment tool to GDOT). Make sure to document what occurs during the self-assessment. It is helpful to have everyone in the same room. INDOT had over 100 people do the self-assessment.
Conclusions: Advancing Asset Management Practice at GDOT
The discussions held during the UDOT/INDOT/GDOT Best Practices/Peer Exchange indicate that GDOT would likely benefit from two main steps to advance Transportation Asset Management in the agency:
1. Conduct a self assessment exercise.
UDOT tailored the self assessment tool to suit their particular needs, opportunities and constraints. Thus, the information obtained from the self assessment exercise was very valuable for developing an Asset Management Implementation Plan well suited to their needs. It would be worthwhile for GDOT to tailor the self assessment tool to their needs. The purpose of the self assessment would be to gather information for an asset management implementation plan, i.e., a plan that identifies the best opportunities for GDOT to make changes to achieve higher levels of cost effectiveness.
2. Develop an Asset Management Implementation Plan.
Based on the discussions held during the Best Practices/Peer Exchange, the Implementation Plan could include some or all of the following:
o Streamline strategic goals
o Develop performance measures that align strategic goals with work at all levels of the agency
o Develop analytical procedures for the bridge database
o Integrate data
o Integrate analysis tools
Effects of Performance Uncertainty on Bridge Project Ranking in Transportation Asset Management
John Patrick O'Har, E.I.T. Graduate Research Assistant School of Civil and Environmental Engineering Georgia Institute of Technology
Atlanta, GA 30332-0355 Tel: 404-894-0418 Fax: 404-894-2278
Email: OHar@gatech.edu
Adjo Amekudzi, Ph.D. Associate Professor
School of Civil and Environmental Engineering Georgia Institute of Technology Atlanta, GA 30332-0355 Tel: 404-894-0404 Fax: 404-894-2278
Email: adjo.amekudzi@ce.gatech.edu
Michael Meyer, Ph.D., P.E. Frederick R. Dickerson Chair and Professor School of Civil and Environmental Engineering
Georgia Institute of Technology Atlanta, GA 30332-0355 Tel: 404-358-2466 Fax: 404-894-2278
Email: michael.meyer@ce.gatech.edu
Paper submitted for review for presentation at the 2012 Transportation Research Board Conference
ABSTRACT
Understanding the dominant factors of uncertainty and sensitivity in project prioritization can help refine investment priorities to address high risk and benefits. It can also be used in developing procedures for setting performance standards that are data-driven and transparent. This study reviews risk applications in Transportation Asset Management as they apply to project prioritization, and develops a case study to demonstrate the importance of addressing uncertainty in bridge project ranking procedures. The study uses data from the National Bridge Inventory and applies Multiple Attribute Decision Making (MADM) principles to address performance uncertainty and prioritize bridges for investment. Scenarios with and without uncertainty are compared to demonstrate the impact of incorporating performance uncertainty on project ranking outcomes. The study also demonstrates the impacts of data disaggregation on project ranking outcomes. The results show the importance of considering the effects of performance uncertainty and data aggregation in project ranking.
Keywords: Bridge ranking, performance, uncertainty, risk
INTRODUCTION
Several agencies are incorporating uncertainty in Transportation Asset Management (TAM) (1;2;3;4;5;6) in order to include risk as part of their decision-making criteria. The Office of Infrastructure of the City of Edmonton in Canada, for example, uses risk as a basis for their infrastructure strategy. In addressing their infrastructure gap (i.e., the difference between capital requirements and available funding), Edmonton has developed a risk assessment methodology to help quantify the risk of asset failure and relate this to investment levels (7). Main Roads in Queensland, Australia incorporates risk in bridge maintenance decision making, and England's Department for Transport (DfT) has incorporated risk assessment methods into its project prioritization process (1;2). Using risk in decision making helps agencies to prioritize the highest risks and benefits for investment.

Project prioritization, a key function of Transportation Asset Management or Infrastructure Management Systems, makes use of various project programming approaches. The most basic of these approaches is simple subjective ranking based on engineering judgment. More complex project programming processes use mathematical models to perform more comprehensive analyses, taking into account various factors influencing project selection. Although these models are more complex, and more difficult to develop and interpret, they provide a better solution than more basic subjective project rankings (8). However, because of data limitations, several agencies use subjective or objective ranking methods for project prioritization, coupled with expert engineering judgment. Objective ranking methods may apply Multiple Attribute Decision Making (MADM) methods to capture decision criteria such as asset condition, demand, and consequences of failure to prioritize projects for investment. The availability of historic performance data opens the door to addressing performance uncertainty and refining the results of project ranking to identify the highest-risk assets.

This paper presents a case study to highlight the importance of considering performance risk and using disaggregate data in project ranking, where data is available. First, the paper reviews basic concepts of uncertainty and risk, and discusses several examples of project prioritization applications that address uncertainty. Using data from the U.S. National Bridge Inventory (NBI), a scenario analysis is conducted to examine the effects of performance risk and data disaggregation/aggregation on project ranking outcomes, and the implications for investment decision making.
UNCERTAINTY, RISK AND TRANSPORTATION ASSET MANAGEMENT
Uncertainty is an inherent element of the decision-making process when choices are made based on incomplete knowledge (6) or when there is inherent randomness in the system under consideration (9). Subjective uncertainty is a function of the analyst's limited knowledge, whereas objective uncertainty comes about from inherent randomness in a system. Subjective uncertainty is reducible with the acquisition of more knowledge, while objective uncertainty is irreducible (9;10). Uncertainties that can be quantified in terms of their probabilities and severity (or magnitude) of occurrence are referred to as risks.

Risk assessment and risk management are often considered interchangeable, but they are distinct. Risk assessment refers to the process of measuring risks in a quantitative and empirical manner (11;6). Risk management, which usually follows risk assessment, is a qualitative process that involves judging the acceptability of risks (11) within any applicable legal, political, social, economic, environmental, and engineering constructs (6). Risk assessment and risk management are important components of any asset management process (8). Risk is inherent to the transportation planning and development process. Transportation plans include political risks, such as the adverse impacts of a transportation project on a local community, and funding risks, i.e., the availability of funds. Risk can be considered in any part of the TAM process or during any portion of the life cycle of an asset. Many times it is best to consider risk throughout the entire transportation planning and development process, but other times it is more appropriate to consider risk during the latter stages of the process (8).

Transportation asset management has been defined as a strategic resource allocation framework that allows transportation organizations to manage the condition and performance of transportation infrastructure cost effectively (12). Nearly all transportation agencies practice some degree of TAM. However, not all agencies use the term asset management, and there is no universally adopted structure to asset management. Even so, the FHWA has identified key elements of transportation asset management processes, including: goals and policies, asset inventory, condition assessment and performance monitoring, alternatives analysis and program optimization, short and long range plans, program implementation, and performance monitoring (13). Asset management systems provide an effective platform for monitoring the condition, or performance, of infrastructure assets throughout their life cycle. As such, these TAM systems are an effective platform for incorporating the risks that are associated with transportation infrastructure.
EXAMPLES OF RISK APPLICATIONS IN BRIDGE MANAGEMENT
In light of the collapse of the I-35W bridge in Minneapolis, Minnesota (14), there has been growing interest in incorporating risk into bridge management systems. Cambridge Systematics, in collaboration with Lloyd's Register, a firm that specializes in risk management in the marine, oil, gas, and transportation sectors, developed a highway bridge risk model for 472,350 U.S. highway bridges, based on National Bridge Inventory (NBI) data. The model used Lloyd's Register's Knowledge Based Asset Integrity (KBAI™) methodology, implemented on Lloyd's Register's asset management platform, Arivu™ (15). Risk is defined as the product of the probability of failure and the consequence of failure. However, failure is not defined as catastrophic failure, but rather as performance failure, such as bridge service interruption, which includes emergency maintenance or repair, or some form of bridge use restriction. The model then predicts the mean time until a service interruption occurs. A highway bridge risk universe, as shown in Figure 1, can be visualized using the Arivu™ platform (15).
Probability of service interruption is calculated based on three risk units: deck, superstructure, and substructure. The probability that each one of these units would cause a service interruption is calculated; probabilities are then added together to determine the overall probability that a bridge will experience a service interruption in the next year. Consequences of service interruption are determined using a number of bridge characteristics, such as ADT, percentage of trucks, detour distance, public perception, and facility served, that indicate the relative importance of the bridge to the network and users of the system. The consequence of service interruption is dimensionless, which allows the user to define the characteristics used to determine the relative importance of the bridge (15). This model can be used to prioritize bridge investments, minimize risk, and prioritize bridge inspections.
FIGURE 1 Highway bridge risk universe. Source: (15)
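The calculation described above (unit-level probabilities summed, then multiplied by a dimensionless consequence) can be sketched as follows. All numeric values are illustrative, and the cap at 1.0 is an added safeguard rather than a documented feature of the published model.

# Sketch of the bridge risk calculation: summed unit-level probabilities
# of service interruption times a dimensionless consequence score.
# All numbers are illustrative, not from the Lloyd's Register model.

def interruption_probability(p_deck, p_superstructure, p_substructure):
    """One-year probability of a service interruption, combining the three
    risk units additively as described in the text (capped at 1.0)."""
    return min(p_deck + p_superstructure + p_substructure, 1.0)

def consequence_score(weights, characteristics):
    """Dimensionless consequence from user-chosen bridge characteristics
    normalized to 0-1."""
    return sum(weights[k] * characteristics[k] for k in weights)

p = interruption_probability(0.02, 0.05, 0.03)
c = consequence_score(
    weights={"adt": 0.4, "pct_trucks": 0.2, "detour_distance": 0.4},
    characteristics={"adt": 0.8, "pct_trucks": 0.5, "detour_distance": 0.3},
)
risk = p * c  # risk = probability of failure x consequence of failure
print(f"P(interruption) = {p:.2f}, consequence = {c:.2f}, risk = {risk:.3f}")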
In another study, an analysis of past NBI ratings to predict bridge system preservation needs was done for the Louisiana Department of Transportation and Development (LaDOTD) (16). At the time, the LaDOTD was transitioning to AASHTO's PONTIS bridge management software. PONTIS requires detailed element-level bridge inspection data known as Commonly Recognized (CoRe) elements. Collecting element-level bridge inspection data takes years, so an innovative approach was developed using readily available historic NBI data. Deterioration processes of three NBI elements were studied to develop element deterioration models. Bridge preservation plans and cost scenarios were developed using readily available NBI data along with current LaDOTD practice and information (16). This study illustrated the use of NBI data to evaluate long-term performance of bridges under various budget scenarios.
Dabous and Alkass (17) developed a method to rank bridge projects based on MAUT (multi-attribute utility theory). For capital budgeting needs, decision makers often use rankings to prioritize investment in transportation projects. Several different methods can be used to prioritize bridge projects, including benefit-cost ratio (BCR) analysis, the California Department of Transportation's Health Index (18), or the FHWA's Sufficiency Rating (SR) formula (19). Based on interviews with bridge engineers and transportation decision makers, MAUT was selected as a prioritization methodology since it allowed decision makers to include multiple and conflicting objectives, incorporating both qualitative and quantitative measurements. Utility functions were developed using the Analytical Hierarchy Process (AHP) and the Eigenvector approach. A case study was developed to demonstrate the potential application of this method (17).
RESEARCH METHODOLOGY
A case study was developed based on NBI data for selected bridges in Georgia. The case study demonstrated the importance of incorporating uncertainty, and of using disaggregate versus aggregate data in prioritization where disaggregate data is available. Furthermore, this case study illustrated the impacts of data quality on investment prioritization, which highlights the importance of investing in high-quality data collection techniques.
The NBI data was obtained from the FHWA website in American Standard Code for Information Interchange (ASCII) format; the NBI data was from 1992 through 2009 (20). Using the record format, also available on the FHWA website (20), and the Recording and Coding Guide for the Structure Inventory and Appraisal of the Nation's Bridges (19), this ASCII data was converted into Excel format using a script in the SPSS statistical analysis software.
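An equivalent conversion can be sketched in Python. The byte positions below are hypothetical placeholders; the actual field positions must be taken from the NBI record format documentation cited above.

# Sketch of parsing fixed-width NBI ASCII records into a spreadsheet.
# Column positions are hypothetical placeholders, not the real layout.
import pandas as pd

colspecs = {                      # (start, end) byte ranges -- illustrative only
    "structure_number": (0, 15),
    "adt_item_29": (100, 106),
    "deck_item_58": (200, 201),
    "superstructure_item_59": (201, 202),
    "substructure_item_60": (202, 203),
}

nbi = pd.read_fwf(
    "GA09.txt",                            # one state-year NBI file
    colspecs=list(colspecs.values()),
    names=list(colspecs.keys()),
    dtype=str,                             # NBI items are coded, not numeric
)
nbi.to_excel("GA09.xlsx", index=False)     # Excel output, as in the study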
The Georgia Department of Transportation (GDOT) uses an internally developed bridge prioritization formula as one of the inputs for allocating funds for bridge investment (21). This bridge prioritization formula is multi-criteria in nature and takes into account a range of factors of bridge condition and performance, as shown in Table 1. GDOT assigns each bridge an overall score based on this formula and using engineering expert judgment. GDOT maintains a proprietary Bridge Information Management System (BIMS) that contains data elements for each state or locally owned bridge in Georgia. The data elements contained in the BIMS are identical to or based on the data elements in the NBI.
TABLE 1 GDOT Bridge Prioritization Formula Parameter Descriptions and Point Values

Variable | Description | Point Values
HS | Inventory Rating | 0, 13, 25, 35
ADT | Average Daily Traffic | 1, 3, 6, 10, 15, 21, 27, 35
BYPASS | Bypass/detour length (also accounts for posting, ADT, and % trucks) | 0, 10, 18, 25
BRCOND | Bridge condition based on condition of deck, superstructure, and substructure | 0, 10, 15, 20, 25, 30, 35, 40
Factor | Weighting factor based upon functional classification, i.e., interstate, defense, NHS | 1.0, 1.3, 1.5, 1.8
TimbSUB | Timber Substructure | 0, 2, 5 (state owned)
TimSUP | Timber Superstructure | 0 or 2
TimbDECK | Timber Deck | 0 or 2
POST | Bridge Posting | 0 to 5
TEMP | Temporary Structure Designation | 0 or 2
UND | Underclearance | 0, 1, 2, 3, 4, 5, 6
FC | Fracture Critical | 0 or 15
SC | Scour Critical | 0, 1, 2, 3, 4, 5, 6
HMOD | Inventory Rating less than 15 tons for HMOD truck | 0 or 5
Narrow | Based on number of travel lanes, shoulder width, length, and ADT | 0 or 30

Source: Adapted from (21)
GDOT is in the process of collecting more detailed element-level CoRe data (21). Without more detailed element-level data, it is difficult to develop bridge deterioration models, especially at the project level. Sun et al. (16) developed deterioration matrices and used Markov chains to model bridge deterioration. Although this approach is feasible, it is more applicable at the network level. In their analysis, bridges were grouped into four major categories: concrete, steel, pre-stressed concrete, and timber; deterioration matrices were then developed for each group. Since individual bridges are being ranked using the NBI data, rather than groups of bridges, it was deemed more appropriate to use a methodology that applies Multiple Criteria Decision Making (MCDM) principles, similar to that applied by Dabous and Alkass (17).
In GDOT's bridge prioritization formula (Equation 1), certain variables or attributes are scored and weighted based upon their relative levels of importance. Four attributes in the formula are weighted. This indicates that these attributes, HS, ADT, BYPASS, and BRCOND, are likely considered more important to decision makers at GDOT than the rest of the attributes.
Score = Factor × (HS + ADT + BYPASS + BRCOND) + TimbSUB + TimSUP + TimbDECK + POST + TEMP + UND + FC + SC + HMOD + Narrow   (Equation 1)
Table 2 shows the attributes used in the prioritization scenarios and their associated NBI data items. Seven bridges were randomly selected for analysis for the case study. The attributes in Table 2 were selected for analysis since the other attributes are relatively much less important for the seven bridges, i.e., these attributes do not contribute to the scoring of a bridge.
TABLE 2 Attributes and Associated NBI Items

Attribute | NBI Data Item(s)
HS | 66
ADT | 29
BYPASS | 19
BRCOND | 58 (Deck), 59 (Superstructure), 60 (Substructure)
HISTORIC | Based on: 58, 59, 60
POST | 70
TEMP | 103
FC | 92A
SC | 113
Narrow | Based on: 28A (# of lanes), 29 (ADT), 49 (length), 51 (width)
HISTORIC is based on past bridge condition data (NBI items 58, 59, and 60). Although 18 years of historic NBI bridge condition data are not enough to develop a detailed deterioration model, they are sufficient to identify bridges that are deteriorating more rapidly than others. The slopes of the historic bridge condition data were calculated in Microsoft Excel from the linear regression lines for the deck, superstructure, and substructure condition ratings plotted versus time. The average slope is simply the mean of these three slopes. Only bridges with negative average slopes, i.e., bridges whose condition ratings worsened over time, received an attribute value; that value is the absolute value of the slope. The normalized attribute value is based on the largest negative slope among the deterioration gradients. Scenarios that used aggregate HISTORIC data averaged the slopes of the condition ratings for deck, superstructure, and substructure; scenarios that used disaggregate condition rating data did not.
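A minimal sketch of this calculation is shown below, assuming annual condition ratings are available as arrays; numpy's polyfit reproduces the slope of the least-squares regression line that the Excel analysis used. The example ratings are hypothetical.

```python
import numpy as np

def average_deterioration_slope(years, deck, superstructure, substructure):
    """Average of the linear-regression slopes of the three NBI condition
    ratings plotted against time, mirroring the Excel calculation described
    above. A negative value indicates a worsening bridge."""
    slopes = [np.polyfit(years, ratings, 1)[0]
              for ratings in (deck, superstructure, substructure)]
    return float(np.mean(slopes))

# Hypothetical 18 years of condition ratings for one bridge.
years = np.arange(1992, 2010)
deck = np.linspace(8, 6, len(years))    # deck drops from 8 to 6
sup  = np.full(len(years), 7.0)         # superstructure holds steady
sub  = np.linspace(7, 5.5, len(years))  # substructure drops from 7 to 5.5

slope = average_deterioration_slope(years, deck, sup, sub)
# Only bridges with negative average slopes receive a HISTORIC attribute
# value, equal to the absolute value of the slope.
historic = abs(slope) if slope < 0 else 0.0
print(round(historic, 4))
```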
`Narrow' is based on the number of travel lanes on the bridge (NBI item 28A), the bridge's ADT (NBI item 29), the bridge's length (NBI item 49), and the bridge's width (NBI item 51). The bridge's length and width are reported to the nearest tenth of a meter and were converted to feet (19). A bridge is considered narrow if its shoulders are less than 3 feet wide (assuming 12-foot lanes), its total length is greater than 400 feet, and its ADT is greater than 2,000 (21), as sketched in the code below.
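The following hedged illustration encodes that rule; deriving shoulder width from the deck width left over after the travel lanes is an assumption of this sketch, not a documented GDOT procedure.

```python
def narrow_points(lanes, deck_width_ft, length_ft, adt):
    """Return the Narrow attribute (0 or 30) using the rule described above:
    shoulders under 3 ft (assuming 12-ft lanes), total length over 400 ft,
    and ADT over 2,000. The per-side shoulder width is approximated as the
    deck width remaining after the travel lanes (an assumption)."""
    shoulder_ft = (deck_width_ft - 12 * lanes) / 2.0
    is_narrow = shoulder_ft < 3 and length_ft > 400 and adt > 2000
    return 30 if is_narrow else 0

# NBI items 28A (lanes), 51 (width), 49 (length), and 29 (ADT) feed this
# check; widths and lengths must first be converted from meters to feet (19).
print(narrow_points(lanes=2, deck_width_ft=28.5, length_ft=520, adt=3400))  # 30
```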
FC (NBI item 92A) is coded Y in the first digit if critical features, whose failure would likely cause the bridge or a portion of the bridge to collapse, need special inspections or special emphasis during inspections (19). SC (NBI item 113) identifies the current status of the bridge with respect to its vulnerability to scour. This item is coded 0 to 9, T, U, or N; however, only codes 0 to 4 indicate scour criticality, with 0 being the most severe, i.e., the bridge is scour critical and has failed (19).
Ranking Method
Similar to the method developed by Dabous and Alkass (17), the ranking method developed for this case study is based on four tiers of elements. The first level consists of the overall goal of cost-effective resource allocation. The second level consists of the objectives required to achieve that goal:
- Maximize condition preservation
- Minimize extent of disruption
- Minimize critical failures
- Minimize restrictions
The third level consists of the criteria or attributes used to evaluate the objectives:
- BRCOND
- HS
- ADT
- BYPASS
- FC
- SC
- TEMP
- Narrow
- POST
The last level consists of the alternatives, or utilities, for each bridge. Figure 2 shows the structure of the tiered approach used in this case study. Through an MCDM scoring method based on simple additive weighting (SAW), each attribute was assigned a weight and a score, each varying between 0 and 1; this is achieved by normalizing all scores and weights. The scoring method used for each attribute depends on whether the attribute is a benefit attribute, i.e., higher is better, or a cost attribute, i.e., lower is better. Table 3 identifies each attribute as a cost or a benefit attribute, and a sketch of the SAW calculation follows below.
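The Python sketch below illustrates the SAW calculation. The normalization conventions used here (benefit: value divided by the column maximum; cost: smallest nonzero value divided by the value) are inferred from the normalized values in Table 7 and are assumptions of the sketch; the study's exact handling of zero-valued and binary attributes such as TEMP and Narrow differs slightly.

```python
import numpy as np

def saw_utility(values, weights, is_benefit):
    """Simple additive weighting: normalize each attribute column to [0, 1]
    and take the weighted sum for each alternative (bridge)."""
    values = np.asarray(values, dtype=float)
    norm = np.empty_like(values)
    for j in range(values.shape[1]):
        col = values[:, j]
        if is_benefit[j]:
            norm[:, j] = col / col.max()          # benefit: value / max
        else:
            lo = col[col > 0].min() if (col > 0).any() else 1.0
            norm[:, j] = np.where(col > 0, lo / col, 1.0)  # cost: min / value
    return norm @ np.asarray(weights)

# Toy example with three bridges and three attributes (HS, ADT, BRCOND).
values = [[12.90,  5200, 5.000],
          [18.85, 15960, 5.667],
          [21.83, 44430, 6.000]]
utilities = saw_utility(values, weights=[0.4, 0.3, 0.3],
                        is_benefit=[True, False, True])
print(np.round(utilities, 3))
```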
[Figure 2 depicts a four-level hierarchy: Level 1, the overall goal (cost-effective resource allocation); Level 2, the four objectives (maximize condition preservation, minimize extent of disruption, minimize critical failures, minimize restrictions); Level 3, the attributes (BRCOND, HS, ADT, BYPASS, FC, SC, TEMP, Narrow, POST), each contributing a normalized attribute value; and Level 4, the utility of bridge i.]

FIGURE 2 Structure of the hierarchy process used in the prioritization method.
Source: Adapted from (17)
Three prioritization scenarios, plus a baseline, are presented in this case study. The baseline scenario, i.e., scenario 0, incorporates aggregate bridge condition data. The first scenario incorporates disaggregate condition data without past bridge condition data. The second and third scenarios both incorporate uncertainty and performance risk by including past bridge condition: scenario 2 incorporates aggregate past bridge condition in addition to aggregate snapshot, or current, bridge condition, while scenario 3 incorporates disaggregate snapshot bridge condition and disaggregate past bridge condition.
TABLE 3 Attribute Identification: Cost or Benefit

Attribute | Cost or Benefit
HS        | Benefit
ADT       | Cost
BYPASS    | Cost
BRCOND    | Benefit
HISTORIC  | Cost
POST      | Benefit
TEMP      | Cost
FC        | Cost
SC        | Benefit
Narrow    | Cost
Analysis
The weights assigned to each bridge in the ranking method depend upon the "Factor" assigned to each bridge in GDOT's formula (21). There are four possible factors: 1.0, 1.3, 1.5, or 1.8. Table 4 shows how the weighting factor is determined for each bridge. Based on these factors, normalized attribute weights on a scale of 0 to 1 were calculated for each scenario.
TABLE 4 Weighting Factor Descriptions

Factor | Description
1.8    | Interstate routes
1.5    | National Highway System and Defense Highway routes
1.3    | Routes with ADT > 10,000
1.0    | Routes not in the preceding three categories (factors of 1.8, 1.5, or 1.3)
The baseline scenario utilized aggregate data, estimated by averaging the condition ratings of the deck, superstructure, and substructure. Scenario 1 utilized disaggregate bridge condition data, i.e., condition ratings for the deck, superstructure, and substructure were used individually. Instead of one attribute for bridge condition rating, there are now three, which altered the weights used in scenario 1. The weights used in the baseline scenario and scenario 1 are shown in Table 5.
TABLE 5 Attribute Weights for Baseline Scenario and Scenario 1

Baseline Scenario
Factor | HS   | ADT  | BYPASS | BRCOND | POST | TEMP | FC   | SC   | Narrow
1.8    | 0.15 | 0.15 | 0.15   | 0.15   | 0.08 | 0.08 | 0.08 | 0.08 | 0.08
1.5    | 0.14 | 0.14 | 0.14   | 0.14   | 0.09 | 0.09 | 0.09 | 0.09 | 0.09
1.3    | 0.13 | 0.13 | 0.13   | 0.13   | 0.10 | 0.10 | 0.10 | 0.10 | 0.10
1.0    | 0.11 | 0.11 | 0.11   | 0.11   | 0.11 | 0.11 | 0.11 | 0.11 | 0.11

Scenario 1 (BRCOND split into Deck, Sup, Sub)
Factor | HS   | ADT  | BYPASS | Deck | Sup  | Sub  | POST | TEMP | FC   | SC   | Narrow
1.8    | 0.15 | 0.15 | 0.15   | 0.03 | 0.06 | 0.06 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08
1.5    | 0.14 | 0.14 | 0.14   | 0.03 | 0.05 | 0.05 | 0.09 | 0.09 | 0.09 | 0.09 | 0.09
1.3    | 0.13 | 0.13 | 0.13   | 0.03 | 0.05 | 0.05 | 0.10 | 0.10 | 0.10 | 0.10 | 0.10
1.0    | 0.11 | 0.11 | 0.11   | 0.02 | 0.04 | 0.04 | 0.11 | 0.11 | 0.11 | 0.11 | 0.11
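The weights in Table 5 appear consistent with a simple normalization rule: each Factor-weighted attribute carries a raw weight equal to the Factor, each remaining attribute carries a raw weight of 1, and all weights are scaled to sum to one. The sketch below, an inference from the tabulated values rather than a documented GDOT formula, reproduces them.

```python
def normalized_weights(factor, n_weighted=4, n_unweighted=5):
    """Weight per Factor-weighted attribute and per remaining attribute,
    assuming raw weights of `factor` and 1 normalized to sum to one."""
    total = n_weighted * factor + n_unweighted
    return factor / total, 1.0 / total

for f in (1.8, 1.5, 1.3, 1.0):
    w, u = normalized_weights(f)
    print(f"Factor {f}: weighted {w:.2f}, unweighted {u:.2f}")
# Factor 1.8: weighted 0.15, unweighted 0.08 -- matches Table 5 (baseline).
# With HISTORIC added as a fifth weighted attribute,
# normalized_weights(1.8, n_weighted=5, n_unweighted=5) gives 0.13 and 0.07,
# matching the scenario 2 weights in Table 6.
```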
The second scenario incorporated uncertainty; performance risk is included as an additional attribute, HISTORIC, that accounts for past bridge condition. The inclusion of an additional attribute altered the weights used. As described above, only bridges that worsened in condition rating over the period of record, i.e., bridges with negative average slopes, received an attribute value for past bridge condition, and the normalized attribute value is based on the largest negative slope among the deterioration gradients. For scenario 2, the average slope values, i.e., aggregate data, were used to determine the attribute values. Scenario 3 utilized disaggregate data for both the snapshot (current) bridge condition rating and the past bridge condition rating; that is, instead of the average of the deck, superstructure, and substructure values, individual attributes were used for each. This altered the weights used in scenario 3, and the individual deck, superstructure, and substructure slope values were used to determine the attribute values. Table 6 shows the weights used in scenarios 2 and 3.
TABLE 6 Attribute Weights for Scenarios 2 and 3

Scenario 2
Factor | HS   | ADT  | BYP  | BRCOND | HISTORIC | POST | TEMP | FC   | SC   | Narrow
1.8    | 0.13 | 0.13 | 0.13 | 0.13   | 0.13     | 0.07 | 0.07 | 0.07 | 0.07 | 0.07
1.5    | 0.12 | 0.12 | 0.12 | 0.12   | 0.12     | 0.08 | 0.08 | 0.08 | 0.08 | 0.08
1.3    | 0.11 | 0.11 | 0.11 | 0.11   | 0.11     | 0.09 | 0.09 | 0.09 | 0.09 | 0.09
1.0    | 0.10 | 0.10 | 0.10 | 0.10   | 0.10     | 0.10 | 0.10 | 0.10 | 0.10 | 0.10

Scenario 3 (BRCOND and HISTORIC each split into Deck, Sup, Sub)
Factor | HS   | ADT  | BYP  | BRCOND (Deck, Sup, Sub) | HISTORIC (Deck, Sup, Sub) | POST | TEMP | FC   | SC   | Narrow
1.8    | 0.13 | 0.13 | 0.13 | 0.03, 0.05, 0.05        | 0.03, 0.05, 0.05          | 0.07 | 0.07 | 0.07 | 0.07 | 0.07
1.5    | 0.12 | 0.12 | 0.12 | 0.02, 0.05, 0.05        | 0.02, 0.05, 0.05          | 0.08 | 0.08 | 0.08 | 0.08 | 0.08
1.3    | 0.11 | 0.11 | 0.11 | 0.02, 0.05, 0.05        | 0.02, 0.05, 0.05          | 0.09 | 0.09 | 0.09 | 0.09 | 0.09
1.0    | 0.10 | 0.10 | 0.10 | 0.02, 0.04, 0.04        | 0.02, 0.04, 0.04          | 0.10 | 0.10 | 0.10 | 0.10 | 0.10
RESULTS
As mentioned previously, GDOT uses an internally developed prioritization formula as one of the inputs for ranking bridges for investment (21). This formula assigns a score to each bridge that the Department uses, together with engineering expert opinion and other decision support elements, to allocate investments. While the Department's rankings are developed from point scores, the rankings developed for this case study utilized actual data from the NBI, with the exception of the TEMP and Narrow attributes, which are binary, i.e., the aforementioned conditions either exist or do not. Because actual data are used in the ranking criteria, bridges with lower utility values rank higher; this is the opposite of scoring with points, in which bridges with larger point values receive higher overall scores and priority.
The baseline scenario incorporates aggregate snapshot bridge condition data. For illustrative purposes, Table 7 shows the attribute values, their respective normalized values, and each bridge's overall utility for the baseline scenario. The results of the rankings developed in the baseline scenario are shown in Table 8. Scenario 1 incorporates disaggregate bridge condition data, i.e., bridge condition data for deck, superstructure, and substructure. Table 8 also shows the results of the rankings developed in the first scenario. There are no differences in the utility values or rankings between scenarios 0 and 1: even though scenario 1 incorporates disaggregate (deck, superstructure, and substructure) data, the overall weight assigned to the three bridge condition attributes is the same as the weight assigned to the single aggregate condition attribute in the baseline scenario (see Table 5).
TABLE 7 Baseline Scenario Attributes, Normalized Attribute Values, and Bridge Utilities
(each cell shows the attribute value, with the normalized value in parentheses)

Criteria | 251-0026-0     | 117-0019-0     | 269-0020-0      | 255-0017-0     | 185-0010-0     | 021-0123-0      | 021-0124-0
HS       | 12.90 (0.5909) | 18.85 (0.8636) | 12.90 (0.5909)  | 12.90 (0.5909) | 12.90 (0.5909) | 21.83 (1.0000)  | 21.83 (1.0000)
ADT      | 5200 (0.4000)  | 15960 (0.1303) | 2080 (1.0000)   | 6590 (0.3156)  | 3170 (0.6562)  | 44430 (0.0468)  | 44430 (0.0468)
BYPASS   | 13.67 (0.0455) | 6.835 (0.0909) | 22.99 (0.0270)  | 9.942 (0.0625) | 16.78 (0.0370) | 0.6214 (1.0000) | 0.6214 (1.0000)
BRCOND   | 5 (0.6845)     | 5.667 (0.7738) | 4.667 (0.6310)  | 7 (0.9524)     | 5.667 (0.7738) | 6 (0.8214)      | 6.3333 (0.8690)
POST     | 3 (0.6000)     | 4 (0.8000)     | 3 (0.6000)      | 3 (0.6000)     | 3 (0.6000)     | 5 (1.0000)      | 5 (1.0000)
TEMP     | 2 (0.0000)     | 0 (1.0000)     | 2 (0.0000)      | 2 (0.0000)     | 2 (0.0000)     | 0 (1.0000)      | 0 (1.0000)
FC       | 0 (1.0000)     | 0 (1.0000)     | 0 (1.0000)      | 0 (1.0000)     | 15 (0.0000)    | 0 (1.0000)      | 0 (1.0000)
SC       | 5 (0.5556)     | 9 (1.0000)     | 9 (1.0000)      | 9 (1.0000)     | 9 (1.0000)     | 3 (0.3333)      | 3 (0.3333)
Narrow   | 0 (1.0000)     | 30 (0.0000)    | 30 (0.0000)     | 0 (1.0000)     | 30 (0.0000)    | 30 (0.0000)     | 30 (0.0000)
Utility  | 0.52           | 0.61           | 0.54            | 0.59           | 0.41           | 0.70            | 0.70
TABLE 8 Baseline Scenario (Scenario 0) Rankings Compared to Scenario 1 Rankings

Bridge ID  | Factor Used | Scenario 0 Utility | Scenario 0 Ranking | Scenario 1 Utility | Scenario 1 Ranking
185-0010-0 | 1.0         | 0.41               | 1                  | 0.41               | 1
251-0026-0 | 1.5         | 0.52               | 2                  | 0.52               | 2
269-0020-0 | 1.0         | 0.54               | 3                  | 0.54               | 3
255-0017-0 | 1.5         | 0.59               | 4                  | 0.59               | 4
117-0019-0 | 1.3         | 0.61               | 5                  | 0.61               | 5
021-0123-0 | 1.8         | 0.70               | 6                  | 0.70               | 6
021-0124-0 | 1.8         | 0.70               | 6                  | 0.70               | 6
The second scenario is the first of two scenarios that incorporate uncertainty and performance risk by accounting for past bridge condition. An additional attribute, HISTORIC, was included in scenario 2. Although this changed the weights assigned to each attribute (see Table 6), the factors used, i.e., the relative importance of each attribute, did not change; past bridge condition was assumed to be as important as the HS, ADT, BYPASS, and BRCOND attributes. The rankings developed in scenarios 2 and 3 are shown in Table 9. All of the utilities, and all but one of the rankings, differ between scenario 2 and scenario 1 (which has the same rankings as the baseline scenario). These rankings demonstrate that incorporating past bridge condition, i.e., the rate of bridge deterioration, can change the utility of a bridge and therefore change the prioritization.
Scenario 3 also incorporated uncertainty and performance risk through past bridge condition. Unlike scenario 2, however, scenario 3 incorporated disaggregate snapshot (current) bridge condition as well as disaggregate past bridge condition. Although the weights for the attributes in scenario 3 differ from those in scenario 2 (see Table 6), the overall weights assigned to the snapshot bridge condition attributes and the past bridge condition attributes are the same as in scenario 2, so that meaningful comparisons can be made between the two scenarios.
TABLE 9 Scenario 2 and Scenario 3 Rankings Compared to Baseline Rankings

Bridge ID  | Scenario 0 Ranking | Factor Used | Scenario 2 Utility | Scenario 2 Ranking | Scenario 3 Utility | Scenario 3 Ranking
185-0010-0 | 1                  | 1.0         | 0.47               | 1                  | 0.47               | 1
251-0026-0 | 2                  | 1.5         | 0.47               | 1                  | 0.51               | 3
269-0020-0 | 3                  | 1.0         | 0.49               | 2                  | 0.50               | 2
255-0017-0 | 4                  | 1.5         | 0.64               | 5                  | 0.64               | 5
117-0019-0 | 5                  | 1.3         | 0.56               | 3                  | 0.61               | 4
021-0123-0 | 6                  | 1.8         | 0.63               | 4                  | 0.69               | 6
021-0124-0 | 6                  | 1.8         | 0.64               | 5                  | 0.70               | 7
Disaggregation of both the snapshot and past bridge condition data notably impacts the results: all but one of the utilities, and all but one of the rankings, differ between scenarios 2 and 3. This highlights the importance of incorporating disaggregate data when available. In addition, data disaggregation between scenarios 2 and 3 has a more significant impact than data disaggregation between the baseline scenario and scenario 1, where there was no difference in utilities or rankings. This demonstrates the significance of incorporating both uncertainty in terms of bridge deterioration (versus deterministic, i.e., snapshot, condition data) and disaggregate data.
Table 9 shows that accounting for uncertainty by incorporating bridge deterioration, rather than treating bridge condition deterministically, notably changed the utilities and rankings for the case study bridges. It is likely that incorporating this uncertainty in the overall bridge prioritization would likewise produce a different outcome. The prioritization results are only as good as the input data used for the exercise; given that past condition data are readily obtainable, they can be incorporated into the prioritization exercise to refine the results.
CONCLUSIONS
This paper reviewed risk applications in TAM systems as they apply to project prioritization, and developed a case study to prioritize selected bridges using the Multi-Attribute Utility Theory (MAUT) technique. Using data from the NBI, a baseline and three prioritization scenarios were developed for seven bridges in Georgia.
GDOT's internally developed bridge prioritization formula (21) utilized aggregate data in terms of bridge condition. The scenarios developed in this case study, specifically scenario 3, demonstrate the importance of incorporating disaggregate data where it is available. Data disaggregation can impact the utilities and hence the rankings of bridges, and it can change the overall bridge prioritization as well. This being the case, where it is available, disaggregate bridge condition data, i.e., data for deck, superstructure, and substructure, should be used in prioritization efforts.
Scenarios 2 and 3 incorporated uncertainty by including past condition data, whereas the original GDOT formula does not (21). As opposed to incorporating bridge condition deterministically, i.e., only including current (snapshot) bridge condition data, scenarios 2 and 3 account for performance risk by including attributes based on the slopes, from linear regression, of bridge condition data over time. Incorporating uncertainty in scenarios 2 and 3 significantly altered the utilities and rankings of the selected case study bridges. In scenario 3, when disaggregate snapshot condition data were used in combination with disaggregate past condition data, the impacts on the utilities and rankings were particularly noteworthy.
An important component of the MAUT prioritization methodology is decision-maker input. Decision-makers determine the relative importance of certain attributes, influencing the weights of these attributes (see Table 5 and Table 6). A change in the relative importance of certain attributes, the "Factor" used in this case study, results in a change in weight of these attributes. The number of attributes used also influences the weight since all attributes are weighted on a 0 to 1 scale. Although this appears to be subjective, it allows decision-makers flexibility in determining which attributes are more important than others. Given that the goals, objectives, and the criteria used to meet these goals and objectives vary from one transportation agency to another, giving the decision-maker the ability to adjust attribute weights in this type of prioritization effort is one of the strengths of this methodology.
Only seven bridges were selected for the case study developed in this paper, out of more than 17,000 bridges in the NBI database for Georgia (20). Without applying the methodology to all of the bridges (or a representative sample), it is difficult to determine the impact of the approaches used in the three scenarios on the overall bridge prioritization. The intent of the study, however, was to examine the potential effects of incorporating performance uncertainty and disaggregate data on project prioritization in a way that is generally applicable to bridge ranking by various agencies. The fact that there were notable changes in the rankings in multiple scenarios, particularly scenario 3, indicates that performance uncertainty and data disaggregation are worth considering when prioritizing projects.
The past condition data used in this analysis were past NBI condition ratings, and the deterioration trends developed here were based on linear regression. Past element-level bridge inspection data would allow for the development of more accurate deterioration models; however, many DOTs do not yet have the resources to collect the element-level CoRe data that is necessary for more advanced deterioration and forecasting models such as AASHTO's PONTIS. Even so, NBI condition rating data is reported to the FHWA by DOTs on an annual basis, along with other useful data items such as ADT, bypass length, and inventory rating. Since these NBI data items are readily available to many transportation agencies, they can be used to develop prioritization frameworks. In addition, the results of any risk-oriented prioritization framework can be used to allocate funds and set performance standards. For example, bridges with an overall utility value of 0.5 or less, including performance risk, may be considered as the standard trigger for investment.
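A minimal sketch of that investment-trigger idea, using the scenario 3 utilities from Table 9, is shown below; the 0.5 threshold is the illustrative standard suggested above, not an adopted GDOT policy.

```python
# Flag any bridge whose risk-adjusted utility falls at or below the trigger.
TRIGGER = 0.5

scenario3_utilities = {  # utilities from Table 9
    "185-0010-0": 0.47, "251-0026-0": 0.51, "269-0020-0": 0.50,
    "255-0017-0": 0.64, "117-0019-0": 0.61, "021-0123-0": 0.69,
    "021-0124-0": 0.70,
}

flagged = sorted(bid for bid, u in scenario3_utilities.items() if u <= TRIGGER)
print(flagged)  # ['185-0010-0', '269-0020-0']
```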
ACKNOWLEDGEMENT
This study was jointly funded by the Georgia Department of Transportation and the United States Department of Transportation through the Georgia Tech University Transportation Center project: "Best Practices in Selecting Performance Measures and Targets for Effective Asset Management." Additionally, this material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. DGE-0644493. The authors remain exclusively responsible for the material presented in this paper.
REFERENCES
1. Federal Highway Administration. Transportation Asset Management in Australia, Canada, England, and New Zealand. Prepared for NCHRP Panel 20-36. Washington, D.C.: Transportation Research Board, 2005.
2. FHWA. Transportation Performance Measures in Australia, Canada, Japan and New Zealand. Washington, D.C.: AASHTO, 2004.
3. American Association of State Highway and Transportation Officials. Transportation Asset Management Guide, Volume 2: A Focus on Implementation. Transportation Research Board, Washington, D.C., January 2011.
4. Amekudzi, Adjo. Asset Management. Institute of Transportation Engineers, Transportation Planning Handbook, 3rd Edition, Chapter 8. Washington, D.C., 2009.
5. Aktan, A.E. and Moon, F.L. "Mitigating Infrastructure Performance Failures Through Risk-based Asset Management." Drexel Intelligent Infrastructure Institute, Drexel University. Philadelphia, PA, 2009.
6. Piyatrapoomi, N., Kumar, A. and Setunge, S. "Framework for Investment Decision-Making Under Risk and Uncertainty for Infrastructure Asset Management." Research in Transportation Economics, Vol. 8, pp. 199-214. 2004.
7. City of Edmonton, Infrastructure Strategy, <http://www.edmonton.ca/city_government/city_wide_initiatives/infrastructure-strategy.aspx>
8. Amekudzi, Adjo. Asset Management. Institute of Transportation Engineers, Transportation Planning Handbook, 3rd Edition, Chapter 8. Washington, D.C., 2009.
9. Helton, J.C. and Burmaster, D.E. "Treatment of Aleatory and Epistemic Uncertainty in the Performance of Complex Systems." Reliability Engineering and System Safety, Vol. 54, pp. 91-94. 1996.
10. Winkler, R.L. "Uncertainty in Probabilistic Risk Assessment." Reliability Engineering and System Safety, Vol. 54, pp. 91-94. 1996.
11. Haimes, Yacov Y. Risk Modeling, Assessment, and Management. 2nd Edition. Hoboken, New Jersey: John Wiley & Sons, Inc., 2004.
12. AASHTO. Motion to Amend the Definition to Advocate the Principles of Transportation Asset Management. [Online] 2006. http://www.transportation.org/sites/scoh/docs/Motion_Trans_Asset_Management.doc.
13. Federal Highway Administration. Asset Management Primer. Office of Asset Management. Washington, D.C., 1999.
14. National Transportation Safety Board. Collapse of I-35W Highway Bridge: Minneapolis, Minnesota; August 1, 2007. Washington, D.C., 2008.
15. Maconochie, J.A. "U.S. Highway Bridge Risk Model - Development, Summary Results, and Applications for Federal and State Transportation Agencies." Transportation Research Board 89th Annual Meeting, Washington, D.C., 2010.
16. Sun, X., Z. Zhang, R. Wang, X. Wang, and J. Chapman. "Analysis of Past National Bridge Inventory Ratings for Predicting Bridge System Preservation Needs." Transportation Research Record, Vol. 1866, pp. 36-43. 2004.
17. Dabous, S.A., and S. Alkass. "A Multi-attribute Method for Bridge Management." Engineering, Construction, and Architectural Management, Vol. 17, pp. 282-291. 2010.
18. Johnson, M.B. and Shepard, R.W. California Bridge Health Index. 8th International Bridge Management Conference, TRB, National Research Council, Washington, D.C., 1999.
19. Federal Highway Administration. Recording and Coding Guide for the Structure Inventory and Appraisal of the Nation's Bridges. Washington, D.C., 1995.
20. FHWA. NBI ASCII Files. USDOT FHWA Bridge Programs. [Online] April 6, 2011. [Cited: September 21, 2010.] http://www.fhwa.dot.gov/bridge/nbi/ascii.cfm.
21. Georgia Department of Transportation. Bridge Prioritization Formula. January 13, 2011.
TRANSPORTATION PERFORMANCE MANAGEMENT: A RESOURCE CATALOG
SUBMITTED TO GEORGIA DEPARTMENT OF TRANSPORTATION
SUBMITTED BY GEORGIA INSTITUTE OF TECHNOLOGY
Jamie Montague Fischer, Graduate Research Assistant Adjo Amekudzi, Ph.D.
Michael Meyer, Ph.D., P.E.
2011
EXECUTIVE SUMMARY
The field of performance management in transportation is rapidly evolving and many-faceted. Guidance, case studies and tools representing the state of the field are abundant but also spread out across a wide literature. This Performance Management Resource Catalog (the Catalog) compiles and categorizes the various resources available to help State Departments of Transportation develop and improve their performance management programs.
CATALOG ORGANIZATION
The Catalog is organized as a collection of seven color-coded sections, each grouping and tabulating resources according to a common theme of performance management. Each section further categorizes resources by topic within its theme, and provides separate sub-sections for guidance, case studies, and tools according to topic. For each topic, resources are presented in a tabular format, including information in four columns: whether the resource offers guidance, case studies, or tools; the topic within the theme; the document where relevant information is found; and the relevant page numbers within that document. This format is summarized in the accompanying figure.

[Figure: thematic section organization. Within each thematic section, the Guidance, Cases, and Tools sub-sections each list the topic (with an overview and details), the reference (most recent first, then others), and the relevant pages or volume and page numbers.]
A given topic may have many relevant resources listed, in which case the most recent or most relevant resource is listed first. Also, the same resource may appear several times in the Catalog, if it is relevant to multiple topics. This method is used so that practitioners can easily search for resources by topic. Transportation agencies will be able to use the Catalog as a basis for accessing the appropriate resources as they refine their performance management programs.
THEMATIC SECTIONS:
1. STRATEGIC PLANNING
Strong performance management programs are linked to strong strategic plans. Specifically, performance measures and targets are the tools with which an agency can track progress toward its strategic goals and objectives. This section lists resources for creating focused strategic plans. Its topics include definitions for performance-based planning, visioning, and how to set goals and objectives.
2. PERFORMANCE MEASURES
Appropriate performance measurement will help an agency focus its data collection efforts on the information that is most relevant to tracking progress toward strategic goals. This section lists resources for the design of simple, measurable, and actionable performance measures. Its topics include how to select and organize measures step by step; specific measure formulations for outputs, such as infrastructure condition and system efficiency, and outcomes, such as accessibility and environmental, economic, and community impacts; and how to deal with "attribution issues," that is, the question of how much of a measured outcome can be attributed to agency actions. This is the largest section of the Catalog.
3. PERFORMANCE TARGETS
Performance targets provide short-term mile-markers along the road to achieving strategic goals. This section lists resources for setting targets that are both achievable and ambitious, thus helping an agency to make visible progress within a constrained budget.
4. FUNDS ALLOCATION AND PROGRAMMING
Performance-based resource allocation makes targets achievable; it lends consistency and accountability to agency processes. This section lists resources to help an agency make efficient use of a constrained budget. Topics include innovative funding sources and how to set priorities for project selection.
5. ORGANIZATIONAL STRUCTURE
The success and longevity of a performance management program depends on an organizational context that supports and sustains it. This section provides resources for creating such a context, dealing with topics of both intra-agency structure and inter-organizational cooperation.
6. DATA
High quality performance measures can only be effective with high quality data. This section provides resources for developing robust data collection, analysis and management processes. Topics include how to structure data collection responsibilities, what types of data are needed for different types of measures, and how to link condition data to performance information.
7. COMMUNICATING WITH STAKEHOLDERS
A successful performance management program will gradually increase the transparency and accountability of transportation decision making. This is accomplished primarily through the various means of communication with both internal and external stakeholders. Topics in this section include how to build relationships with legislators, how to strengthen trust with customers (system users), and how to increase employee buy-in to the performance management program.
THEME 1: STRATEGIC PLANNING

Guidance
- What is Performance-based Planning | Hendren and Meyer 2006 | pp. 1-2
- How to Address Federal Planning Regulations | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 1-3
- How to Identify Visions and Goals | NCHRP Report 618 (Cambridge Systematics, Inc. et al. 2008) | pp. 60-61
- How to Link Planning and Performance Measurement | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 7-8
- Long-term Performance Goals | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 86-87
- Performance measures to support short- and long-term plans | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 67-69
- Using Performance Measurement to Inform Policy Development | NCHRP Report 551 (Cambridge Systematics 2006) | p. 62
- Linking Planning and Operations at a State DOT | NCHRP Report 664 (Cambridge Systematics 2010) | pp. 28-31
THEME 2: PERFORMANCE MEASURES

Guidance
How to select and organize measures, step by step:
1. Evaluate existing measures and identify gaps | NCHRP Report 551 (Cambridge Systematics 2006) | Vol. II: 7-8, 11-14
2. Set selection criteria | Strategic Performance Measures for State Departments of Transportation: A Handbook for CEOs and Executives (TransTech Management 2003) | p. 8
  - What makes a good measure | NCHRP Report 551 (Cambridge Systematics 2006) | Vol. I: 25-27; Vol. II: 14-16, 44-52; Operations-Oriented Performance Measures for Freeway Management Systems (Brydia et al. 2007) | pp. 40-41; TransTech Management 2003 | p. 8
  - How many measures are needed | TransTech Management 2003 | p. 9
3. Formulate candidate measures and measure categories:
  - What types of measures are needed for different decisions? | NCHRP Report 664 (Cambridge Systematics et al. 2010) | pp. 2-6
  - Asset Management | NCHRP Report 551 (Cambridge Systematics 2006) | Vol. I: vi, 52-58, 68-69, 74-79; Vol. II: 16-18, 52-58
  - System Operations | NCHRP Report 618 (Cambridge Systematics et al. 2008) | pp. 11-12; Operations-Oriented Performance Measures for Freeway Management Systems (Brydia et al. 2007) | pp. 35-37; NCHRP Report 551 (Cambridge Systematics 2006) | p. 12
  - Agency Processes | (AASHTO 2006) | p. 38
  - Environmental Impact: Energy and Resource Conservation | Hendren and Meyer 2006 | pp. 1-6; Air Quality | (Brydia et al. 2007) | pp. 64-73
  - Community Impact: Impacts of Air Quality on Health | Community Impact Assessment Quick Reference (FHWA 2008-2011) | Chapter 5; (Brydia et al. 2007) | pp. 56-64
  - How to align various measurement efforts within and outside of the agency | NCHRP Report 664 (Cambridge Systematics 2010) | pp. 20-31; NCHRP Report 551 (Cambridge Systematics 2006) | Vol. II: 8-9, 63-67
4. Assess and select measures | Measuring Performance Among State DOTs (AASHTO 2006) | p. 34; NCHRP Report 551 (Cambridge Systematics 2006) | Vol. I: 18-20, 44-54, 74-79
Cases
- Physical Condition | Infrastructure Reporting and Asset Management: Best Practices and Opportunities (Amekudzi and McNeil 2008):
  - Pavement condition in Minnesota: compatibility with planning functions | Xie and Levinson, "The Use of Road Infrastructure Data for Urban Transportation Planning: Issues and Opportunities" | pp. 94-95
  - Bridge health: visual inspection vs. structural monitoring | Vanderzee and Wingate, "Structural Health Monitoring for Bridges" | pp. 178-182
  - Ohio DOT: pavement and bridge measures to reduce network deficiencies over time | Evans, L., "Performance Driven Asset Management at a State DOT" | pp. 154-156
- System Efficiency: Measuring network-wide performance, 21 case studies | NCHRP Report 664 (Cambridge Systematics 2010) | Appendix B: 40-75
- Agency Processes: a prototype `Project Delivery' comparative performance measure for Delaware, Florida, Missouri, New Mexico, Ohio, Virginia, and Washington State | (AASHTO 2006) | pp. 41-52
- Environmental Quality: CalTrans | Hendren and Meyer 2006 | pp. 2-3
Tools

Physical Condition
- Pavements:
  - International Roughness Index | On the Calculation of International Roughness Index from Longitudinal Road Profile (Sayers 1995) | pp. 1-12; AASHTO PP 37-04 | pp. 1-5
  - Present Serviceability Rating | HPMS Field Manual (FHWA 2010) | p. 4-83
- Bridges (CoRe Elements; Bridge Health Index) | AASHTO Commonly-Recognized Bridge Elements (Thompson 2000) | pp. 1-13
- Other Assets | HPMS Field Manual (FHWA 2010) | pp. 4-84, 4-85

System Efficiency
- Recommended minimum freeway performance measures for Traffic Management Center operations | Operations-Oriented Performance Measures for Freeway Management Systems (Brydia et al. 2007) | pp. 48-49
- Quick reference guide to selected mobility and reliability measures | NCHRP Report 618 (Cambridge Systematics et al. 2008) | p. 14
- Delay:
  - Incident Duration | Brydia et al. 2007 | pp. 51-52
  - Recurring and non-recurring delay | Ibid. | p. 52
  - Delay per Traveler (annual hours) | NCHRP Report 618 | pp. 14, 15, 18, 22, 23, 60
  - Total Delay (person-minutes) | Ibid. | pp. 14, 17, 18, 21, 22, 23, 34, 60
  - Misery Index (MI) | Ibid. | pp. 18, 50, 60
- Mobility:
  - Speed | Brydia et al. 2007 | p. 52
  - Throughput (person or vehicle) | Ibid. | pp. 52-53
  - Travel Time: link, reliability, trip | Ibid. | p. 53
  - Travel Time Index (TTI, unitless) | NCHRP Report 618 | pp. 14, 15, 16, 18, 20, 22, 23, 60, 62
  - Travel Rate Index (TRI) | Ibid. | pp. 15, 60
  - Freight Mobility | Hendren and Meyer 2006 | pp. 1-6
- Reliability:
  - Reliability | NCHRP Report 618 | pp. 11-13
  - Buffer Index (BI, %) | Ibid. | pp. 14, 16, 50, 57, 60
  - Planning Time Index | Ibid. | pp. 14, 16, 60, 62
  - Percent Variation | Ibid. | pp. 16, 18, 50, 57, 60
- Congestion:
  - Congested travel (vehicle-miles or percent) | NCHRP Report 618 | pp. 14, 17, 18, 22, 23, 60
  - Congested roadway (miles or percent) | Ibid. | pp. 14, 17, 18, 22, 23, 60
- Security | Hendren and Meyer 2006 | pp. 1-6
- Accessibility:
  - As related to quality of life, livability, and security | NCHRP Report 551 (Cambridge Systematics 2006) | p. 12; Hendren and Meyer 2006 | pp. 1-6
  - Locational mobility/reliability related to equity | NCHRP Report 618 | pp. 12, 13
  - Accessibility measure (opportunities within acceptable travel time) | NCHRP Report 618 | pp. 14, 17, 18, 22, 23, 60

Environmental Measures
- Emissions | (Brydia et al. 2007) | pp. 80-85
- Land Use | NCHRP Report 664 (Cambridge Systematics 2010) | p. 38

Community Impacts | Community Impact Assessment Quick Reference (FHWA 2008-2011) | Chapter 6
System and Network Measures | NCHRP Report 664 (Cambridge Systematics 2010) | p. 38
Safety | (AASHTO 2006) | p. 39
Customer Satisfaction | NCHRP Report 660 (Cambridge Systematics 2010) | p. 34; (Brydia et al. 2007) | pp. 45, 47, 50-51; Hendren and Meyer 2006 | pp. 1-5
- "Road Rallies" | TransTech Management (2003) | pp. 5-6
THEME 3: SETTING TARGETS

Guidance
- Step-by-step process | NCHRP Report 666 (Cambridge Systematics 2010) | pp. I-22 to I-95; Volume II: 29-36
- Attribution issues (the extent to which system performance can be attributed to agency actions) | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 69-71
- Structuring tradeoffs | NCHRP Report 551 (Cambridge Systematics 2006) | p. 81
- Setting targets based on available funding | Ibid. | pp. 81-82
- Addressing GASB requirements | Ibid. | pp. 82-85
- Aligning with customer expectations | NCHRP Report 660 (Cambridge Systematics 2010) | pp. 34-35
- Travel-time and mobility thresholds and targets | NCHRP Report 618 (Cambridge Systematics et al. 2008) | pp. 19-20
- Forecasting future mobility and reliability performance | NCHRP Report 618 (Cambridge Systematics et al. 2008) | pp. 41-50
- Alternatives analysis for reducing travel time and delay, and for improving reliability | NCHRP Report 618 (Cambridge Systematics et al. 2008) | pp. 51-54
- Using travel time information in decision making | NCHRP Report 618 (Cambridge Systematics et al. 2008) | pp. 55-68
THEME 4: FUNDS ALLOCATION AND PROGRAMMING

Guidance
- How to implement programs and projects | Urban Transportation Planning (Meyer and Miller 2001) | pp. 565-619
- What are the characteristics of a programming process | Ibid. | pp. 570-587
- How to set priorities for project selection | Ibid.:
  - Goals achievement | pp. 571-572
  - Numerical ratings (benefit/cost, net present worth, etc.) | p. 572
  - Priority indexes | pp. 573-578
  - Programming evaluation matrix | pp. 578-584
  - Systems analysis techniques (multiobjective, multicriteria optimization) | pp. 585-586
- What are some innovative financing/nontraditional funding sources | Urban Transportation Planning (Meyer and Miller 2001) | pp. 597-602
- What does a framework for performance-based resource allocation look like | NCHRP Report 666 (Cambridge Systematics 2010) | pp. I-1 to I-3, 49-51
- What are Public/Private Partnerships | Urban Transportation Planning (Meyer and Miller 2001)
- How to relate planning to the programming and budgeting process | Urban Transportation Planning (Meyer and Miller 2007) | p. 73
- How to link resource allocation to policy objectives | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 61-62
- How to evaluate cost-effectiveness | Urban Transportation Planning (Meyer and Miller 2007) | pp. 505-508
- How to determine the timing and amount of future funding | Urban Transportation Planning (Meyer and Miller 2007) | pp. 587-597
- What are some funding considerations for target setting | NCHRP Report 551 (Cambridge Systematics 2006) | Volume 2, pp. 32-33
- How to use performance measures for multimodal and multi-strategy investment prioritization | NCHRP Report 664 (Cambridge Systematics 2010) | pp. 13-19
- How to evolve financial structure for transportation projects | Urban Transportation Planning (Meyer and Miller 2007) | pp. 48-51
- What are some key characteristics of an evaluation in a decision-oriented planning process | Urban Transportation Planning (Meyer and Miller 2007) | pp. 486-488
- How to account for uncertainty in evaluation | Urban Transportation Planning (Meyer and Miller 2007) | pp. 519-523

Cases
- Political linkage: Maryland General Assembly transportation revenue program | NCHRP Report 551 (Cambridge Systematics 2006) | p. 62
- Portland, Oregon: linking asset decisions to community values to address a funding gap | Bugas-Schramm, P. (in Amekudzi and McNeil 2008) | pp. 56-63
- Ohio DOT: reducing pavement and bridge deficiencies over time | Evans, L., "Performance Driven Asset Management at a State DOT" (in Amekudzi and McNeil 2008) | pp. 154-156
- City of Edmonton, Alberta, Canada: using degradation modeling to analyze alternative funding scenarios | Haas, Tighe, Falls and Jeffray, "Long Term Performance Modeling Life Cycle Analysis and Investment Planning for Sidewalk Networks" (in Amekudzi and McNeil 2008) | pp. 165-171
- Albany, NY, New Visions Planning: public participation in evaluation | Urban Transportation Planning (Meyer and Miller 2007) | pp. 530-538
- Urban corridor analysis in Salt Lake City: multimodal transportation study | Ibid. | pp. 538-546
- Benefit/cost analysis of light rail in Portland, OR | Ibid. | pp. 546-550
- Evaluation of implemented programs and projects (ex post evaluation) | Ibid. | pp. 550-556
- Optimization for project programming, incorporating public input (case study from Seattle, local transportation tax programming) | Using optimization to program projects in the era of communicative rationality (Lowry 2010) | pp. 91-100

Tools
- Allocating resources for asset management | NCHRP Report 545 (Cambridge Systematics 2005) | pp. 4-6, 14-21
- Evaluating investment levels and trade-offs | Ibid. | pp. 24-25
- Identifying needs and solutions | Ibid. | pp. 28-29
- Evaluating and comparing options | Ibid. | pp. 36-42
- AssetManager NT | Ibid. | testing, pp. 49-54; improvements, pp. 55-59
- AssetManager PT | Ibid. | pp. 42-48; testing, pp. 49-54; improvements, pp. 55-59
- Structural monitoring for bridges can save money over visual inspection | Vanderzee and Wingate, "Structural Health Monitoring for Bridges" (in Amekudzi and McNeil 2008) | pp. 178-182
- What are the characteristics of a comprehensive cost-benefit analysis | Urban Transportation Planning (Meyer and Miller 2007) | pp. 488-501
- What are single-objective comparative assessment methods | Ibid. | pp. 512-516
- What are multi-objective comparative assessment methods | Ibid. | pp. 516-519
- What are some concepts of transportation planning economics | Ibid. | pp. 508-511
THEME 5: ORGANIZATIONAL ISSUES

Guidance

Internal Organization and Human Resources
- How to obtain executive and senior-level leadership | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-2 to 2-3; NCHRP Report 660 (Cambridge Systematics 2010) | pp. 28-29, 38
- What are some examples of consolidated and decentralized performance management activities | NCHRP Report 660 (Cambridge Systematics 2010) | p. 39
- What are the staffing needs for performance management activities | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-3, 2-5, 2-7, 2-10
- How to obtain employee buy-in | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-4, 2-7; (Bremmer et al. 2005) | p. 180; NCHRP Report 660 (Cambridge Systematics 2010) | pp. 30-32, 39
- How to ensure employee accountability | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-16 to 2-17; (Bremmer et al. 2005) | p. 181; NCHRP Report 660 (Cambridge Systematics 2010) | pp. 29-30, 39
- How to maintain program continuity | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-13 to 2-14
- How to align performance measures across the agency | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-15 to 2-16; NCHRP Report 551 (Cambridge Systematics 2006) | pp. 63-66
- How to link overall agency goals and performance measures to staff performance | A CFO's Handbook on Performance Management (Cambridge Systematics 2010) | pp. 13-15

Inter-Organizational Issues
- How to create peer groups among DOTs for comparative performance measurement | (AASHTO 2006) | pp. 4, 26, 30-33
- How to align with other jurisdictions on performance measurement | NCHRP Report 551 (Cambridge Systematics 2006) | p. 66
- How to use performance management to build bridges with state legislators | NCHRP Report 660 (Cambridge Systematics 2010) | pp. 39-40
- How to engage the public in performance measurement | NCHRP Report 660 (Cambridge Systematics 2010) | pp. 40-41

Cases
- Organization of performance measurement programs within thirteen agencies | NCHRP 8-36, Task 47 (Padgette 2006) | pp. B-1 to B-32
- Linking planning and operations at a State DOT: Oregon Transportation Plan, Washington State Gray Notebook | NCHRP Report 664 (Cambridge Systematics 2010) | Chapter 7
- A prototype `Project Delivery' comparative performance measure for Delaware, Florida, Missouri, New Mexico, Ohio, Virginia, and Washington State | (AASHTO 2006) | pp. 41-52
- Performance-based contracts for road maintenance: lessons from New Zealand | Tighe, Manion, Yeaman, Rickards and Haas, "Using Performance Specified Maintenance Contracts: Buyer/Seller Beware" (in Amekudzi and McNeil 2008) | pp. 108-114
- Peer-to-peer scenario, multistate partnership for system operations: Mid-Atlantic Operations Study, I-95 Vehicle Probe Study | NCHRP Report 664 (Cambridge Systematics 2010) | Chapter 5
- Megaregional partnership: San Joaquin Valley Regional Blueprint | NCHRP Report 664 (Cambridge Systematics 2010) | Chapter 6
- Interagency development of performance standards for managing materials, wastes, and contamination under Oregon's bridge program | Armstrong & Levine 2006, Journal of the Transportation Research Board | pp. 176-177

Tools
- How to design training programs | NCHRP Synthesis 362 (RandolphMorgan Consulting LLC 2006) | pp. 29-31
- How to write a Memorandum of Understanding | (Homeland Security SAFECOM) | pp. 3-8
- A guide to best practices for contract administration | (Office of Federal Procurement Policy 1994) | pp. 2-18
THEME 6: DATA

Guidance

Data Collection
- How to structure data collection responsibilities | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-7 to 2-9
- What to measure: outputs vs. outcomes | Performance Measures for Complete, Green Streets: A Proposal for Urban Arterials in California (MacDonald, Sanders, Anderson 2010) | pp. 37-38
- When to measure | Ibid. | p. 40
- What types of data are needed:
  - System preservation | NCHRP Report 551 (Cambridge Systematics 2006) | p. 31
  - Operations and management | Ibid. | p. 31
  - Capacity expansion | Ibid. | p. 32
  - Air quality monitoring and measurement | Operations-Oriented Performance Measures for Freeway Management Systems (Brydia et al. 2007) | pp. 73-80
  - Travel time, mobility and reliability measures | NCHRP Report 618 (Cambridge Systematics et al. 2008): potential sources of travel time, delay, and reliability data, p. 27; travel-time data collection methods by required investment, pp. 28-29
  - Customer-related | NCHRP Report 487 (Stein and Sloan 2003): measuring customer needs with objective data, pp. 13-14; with subjective data, pp. 14-15; customer grouping and segmentation, pp. 17-20; survey techniques, pp. 21-28
  - Non-traditional performance measures:
    - Context-sensitive performance measures | NCHRP W69, Performance Measures for Context Sensitive Solutions: A Guidebook for State DOTs | pp. 7-8
    - Pedestrian safety and walkability | Performance Measures for Complete, Green Streets (MacDonald, Sanders, Anderson 2010) | pp. 22-23, 25-26
    - Bicyclist safety and bikability | Ibid. | pp. 24, 26-27
    - Psychological well-being | Ibid. | p. 28
    - Economic vitality | Ibid. | pp. 28-29
    - Environmental benefits | Ibid. | pp. 29-32
    - System flexibility | Modeling Capacity Flexibility of Transportation Networks (Chen & Kasikitwiwat 2011) | pp. 107-109
- How to forecast future performance: mobility and reliability | NCHRP Report 618 (Cambridge Systematics et al. 2008) | pp. 41-50

Data Management
- How to structure data management responsibilities | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-7 to 2-9
- How to store and manage data | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-9 to 2-10

Data Analysis
- How to structure data analysis responsibilities | NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-7 to 2-9
- Linking condition data and performance information | Little, R., "A Clinical Approach to Infrastructure Asset Management" (in Amekudzi and McNeil 2008) | pp. 120-122
- Air quality monitoring and measurement | (Brydia et al. 2007) | pp. 73-80
- Assigning values | Performance Measures for Complete, Green Streets (MacDonald, Sanders, Anderson 2010) | pp. 41-44
- Importance of information and analytic tools | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 43-44
Cases
- Integrated Corridor Management (ICM) projects: Maryland I-270, Minnesota I-394 | NCHRP Report 664 (Cambridge Systematics 2010) | pp. 49-53
- Identifying performance data to support strategic goals: Florida DOT Strategic Intermodal System | NCHRP Report 664 (Cambridge Systematics 2010) | pp. 59-60
- Combining subjective and objective data: Portland, Oregon; Florida DOT | NCHRP Report 487 (Stein and Sloan 2003) | pp. 44-45, 59-61
- Data business plans in Florida | Llort and Golden in Transportation Research Circular Number E-C115 (Hall 2007) | pp. 19-21
- Caltrans performance measures framework for complete, green urban arterials | Performance Measures for Complete, Green Streets: A Proposal for Urban Arterials in California (MacDonald, Sanders, Anderson 2010) | pp. 79-81
- Non-motorized modes: Oregon DOT Complete Streets | Ibid. | pp. 45-47
- Vermont Pedestrian and Bicycle Policy Plan | Ibid. | pp. 47-50
- The Florida Reliability Method | Ibid. | pp. 51-52
- City of Edmonton, Alberta, Canada: linking condition data to performance models | Haas, Tighe, Falls and Jeffray, "Long Term Performance Modeling Life Cycle Analysis and Investment Planning for Sidewalk Networks" (in Amekudzi and McNeil 2008) | pp. 165-171
- The role of senior management in performance measurement (CalTrans data management for the GoCalifornia plan) | Iwasaki in Transportation Research Circular Number E-C115 (Hall 2007) | pp. 22-24
Tools
- RAILER, a member of the Engineered Management System (EMS) family of products, for condition reporting and maintenance planning on short-line railroads | Grussing and Uzarski, "Framework for Short-Line Railroad Track Asset Management and Condition Reporting" (in Amekudzi and McNeil 2008) | pp. 172-176
- Pavement condition assessment | A Study of Manual vs. Automated Pavement Condition Surveys (Timm & McQueen 2004) | pp. 9-27
- Automated sensing: Automatic Road Analyzer (ARAN) | TransView (2010), "ARAN Automated Road Analyzer," <http://www.transview.org/aran/> (accessed April 2, 2011)
- Image pattern recognition | Using Image Pattern Recognition Algorithms for Processing Video Log Images to Enhance Roadway Infrastructure Data Collection (Tsai 2009) | pp. 7-13, 18-24
- Bridge health: visual inspection vs. structural monitoring | Vanderzee and Wingate, "Structural Health Monitoring for Bridges" (in Amekudzi and McNeil 2008) | pp. 178-182
- Customer surveys | NCHRP Report 487 (Stein and Sloan 2003) | pp. 21-28
- Microsimulation modeling
- Data warehouses/data marts | Transportation Research Circular Number E-C115 (Hall 2007) | VDOT case, p. 64; WSDOT case, p. 68
- Bi-level network capacity models | Modeling Capacity Flexibility of Transportation Networks (Chen & Kasikitwiwat 2011) | pp. 109-116
- Methodology for measuring service quality using objective and subjective indicators | A methodology for evaluating transit service quality based on subjective and objective measures from the passenger's point of view (Eboli & Mazzulla 2011) | pp. 174-176
- Geographic Information Systems (GIS): Alaska DOT HASGIS Interface project | Transportation Research Circular Number E-C115 (Hall 2007) | pp. 31-32
- Spatial analysis: Minnesota DOT | p. 52
THEME 7: COMMUNICATING WITH STAKEHOLDERS

Guidance

Stakeholder Engagement
- Building relationships with legislators | NCHRP Report 660 (Cambridge Systematics 2010) | pp. 39-40
- Visibility and credibility to the public | NCHRP Report 660 (Cambridge Systematics 2010) | pp. 35-36, 40-41; NCHRP 8-36, Task 47 (Padgette 2006) | pp. 2-13 to 2-14
- Strengthening trust with stakeholders and customers | TransTech Management (2003) | p. 4
- Using customer needs to drive transportation decision making: Chapter 8, guidelines for practitioners | NCHRP Report 487 (Stein and Sloan 2003) | pp. 85-101
- Using customer opinions to shape strategic management direction | TransTech Management (2003) | p. 5
- Interests of different stakeholders | NCHRP Report 551 (Cambridge Systematics 2006) | Volume 2, p. 8, Figure 2
- External and internal buy-in | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 87-89

Reporting
- Challenges with reporting | (Bremmer et al. 2005) | pp. 179-180
- Attribution issues (the extent to which system performance can be attributed to agency actions) | NCHRP Report 551 (Cambridge Systematics 2006) | pp. 69-71
- Steps to keeping customers informed | NCHRP Report 487 (Stein and Sloan 2003) | pp. 98-101

Cases
- Community engagement in Portland, Oregon: linking asset decisions to community values | Bugas-Schramm, P. (in Amekudzi and McNeil 2008) | pp. 56-63
- Case studies in customer analysis in agency work, including methods of outreach and application of customer data to performance measures for several DOTs | NCHRP Report 487 (Stein and Sloan 2003) | pp. 65-80
- An internet portal for large-group participation in transportation programming decisions (case study from Seattle, local transportation tax programming) | Lowry 2008 | pp. 156-165
- Iowa DOT: reporting to agency decision makers | Smadi, O., "Communicating the Results of Integrated Asset Management: Iowa DOT Case Study" (in Amekudzi and McNeil 2008) | pp. 86-92
- Example reporting methods for eight DOTs | NCHRP 8-36, Task 47 (Padgette 2006) | p. 2-12 (Table 2.3)

Tools
- Customer surveys | NCHRP Report 487 (Stein and Sloan 2003) | pp. 21-28
- Dashboards, agency report cards, websites, reports | (Bremmer et al. 2005) | pp. 179-180
- Reporting on customer preferences, Florida DOT | NCHRP Report 487 (Stein and Sloan 2003) | pp. 59-61
REFERENCES
1. Cambridge Systematics, Inc., Boston Strategies International, Inc., Gordon Proctor and Associates, and Michael J. Markow (2010) NCHRP Report 666: Target-Setting Methods and Data Management to Support Performance-Based Resource Allocation by Transportation Agencies
2. Cambridge Systematics, Inc. and University of Maryland Center for Advanced Transportation Technology (2010) NCHRP Report 664: Measuring Transportation Network Performance. Transportation Research Board of the National Academies, Washington DC. 84p.
3. Cambridge Systematics, Inc. and High Street Consulting Group (2010) Transportation Performance Management: Insight from Practitioners NCHRP Report 660
4. Grant, M., J. Bauer, T. Plaskon, and J. Mason. (2010) Advancing Metropolitan Planning for Operations: An Objectives-Driven Performance-Based Approach. United States Department of Transportation, Federal Highway Administration.
5. Lowry, M. B. (2010). Using optimization to program projects in the era of communicative rationality. Transport Policy, 17(2), 94-101.
6. Federal Highway Administration (FHWA). (2010) Highway Performance Monitoring System Field Manual. http://www.fhwa.dot.gov/policy/ohpi/hpms/fieldmanual/
7. Federal Highway Administration (FHWA). (2011) Community Impact Assessment: A Quick Reference for Transportation. http://www.ciatrans.net/CIA_Quick_Reference/Purpose.html
8. Shelton, J., and Medina, M. (2010). Prioritizing Transportation Projects using an Integrated Multiple Criteria Decision-Making Method. In TRB 89th Annual Meeting: Compendium of Papers DVD. Washington DC: Transportation Research Board.
9. Amekudzi, A. and McNeil, S. eds. (2008) Infrastructure Reporting and Asset Management: Best Practices and Opportunities. American Society of Civil Engineers. Reston, VA. 56-62.
10. Cambridge Systematics, Inc., Dowling Associates, Inc., System Metrics Group, Inc., and Texas Transportation Institute. (2008) NCHRP Report 618: Cost-Effective Performance Measures for Travel Time Delay, Variation, and Reliability. Transportation Research Board of the National Academies, Washington D.C.
11. Trigueros, M. A. (2008). An Analysis of Project Prioritization Methods at the Regional Level in the Seventy-five largest metropolitan areas in the United States of America. Civil and Environmental Engineering. Georgia Institute of Technology.
12. Lowry, M., Nyerges, T., Rutherford, G., 2008. An internet portal for large-group participation in transportation programming decisions. Transportation Research Record 2077, 156-165.
13. Brydia, R. E., Schneider IV, W. H., Mattingly, S. P., Sattler, M. L., and Upayokin, A. (2007). Operations-Oriented Performance Measures for Freeway Management Systems: Year 1 Report (No. FHWA/TX-07/ 0-5292-1). Texas Transportation Institute; Texas Department of Transportation, Austin, Texas; Federal Highway Administration
14. Poister, T. H. (2007). Performance measurement in transportation agencies: State of the practice. Handbook of Transportation Policy and Administration, pp. 485-504.
15. Cambridge Systematics. (2007). NCHRP Research Results Digest 312: Guide to Effective Freeway Performance Measurement. Transportation Research Board, Washington, D.C.
16. Schmitt, R. (2007). Research Problem Statement: Setting Effective Performance Targets for Transportation Programs, Plans and Policy. Challenges of Data for Performance Measures Workshop (pp. 106-108). San Diego : Transportation Research Board.
17. Transportation Association of Canada (2006). Performance Measures for Road Networks: A Survey of Canadian Use. Transport Canada.
18. Hendren, P. and Meyer, M. (2006). NCHRP Project 08-36, Task 53 (02): Peer Exchange Series on State and Metropolitan Transportation Planning Issues. Meeting 2: Nontraditional Performance Measures. Transportation Research Board, Washington, D.C.
19. American Association of State Highway and Transportation Officials. (2006). Measuring performance among state DOTs. Washington, D.C.: American Association of State Highway and Transportation Officials.
20. Cambridge Systematics, Inc. (2006). Performance measures and targets for transportation asset management (NCHRP Report 551). Washington, D.C.: Transportation Research Board.
21. Padgette, R., Cambridge Systematics, Inc. (2006). Effective Organization for Performance Measurement. Transportation Research Board of the National Academies, Washington, D.C. http://www.transportation.org/sites/planning/docs/NCHRP%208-36%2847%29%20Final%20Report.doc
22. International Infrastructure Management Manual, Version 3.0 (IIMM). Association of Local Government Engineering New Zealand (INGENIUM), Institute of Public Works Engineering of Australia (IPWEA), 2006.
23. Marsden, G., & Bonsall, P., (2006). Performance targets in transport policy. Transport Policy, 13 (3), 191-203.
24. Hall, J., ed. (2007). Transportation Research Circular Number E-C115: Challenges of Data for Performance Measures - A Workshop. Transportation Research Board, Washington, D.C.
25. Adams, L. H., F. D. Harrison, and A. Vandervalk (2005). Issues and Challenges in Using Existing Data and Tools for Performance Measurement. In TRB Conference Proceedings 36: Performance Measures to Improve Transportation Systems. Transportation Research Board, Washington, D.C.
26. Bremmer, D., Cotton, K. C., and Hamilton, B. (2005). Emerging Performance Measurement Responses to Changing Political Pressures at State Departments of Transportation: Practitioners' Perspective. Transportation Research Record, No. 1924, pp. 175-183.
27. Cambridge Systematics. (2005). NCHRP 7-15, Task 1.3: Cost-Effective Measures and Planning Procedures for Travel Time, Delay, and Reliability. Transportation Research Board, Washington, D.C.
28. Government Accountability Office (2005). Managing for Results: Enhancing Agency Use of Performance Information for Management Decisions. Washington, D.C. http://www.gao.gov/new.items/d05927.pdf
29. Larson, M. C. (2005). Organizing for Performance Management. Report of a Conference, Irvine, California, August 22-24, 2004. Second National Conference on Performance Measures, Conference Proceedings 36, pp. 99-120. Available at http://onlinepubs.trb.org/onlinepubs/conf/CP36.pdf.
30. Federal Highway Administration, Office of International Programs (2004). Transportation performance measures in Australia, Canada, Japan, and New Zealand. (Scanning Tour Report), Washington, D.C.
31. Stein, K. and Sloan, R. (2003). NCHRP Report 487: Using Customer Needs to Drive Transportation Decision Making. Transportation Research Board, Washington, D.C.
32. TransTech Management, Inc. (2003). Strategic performance measures for state Departments of Transportation: A handbook for CEOs and executives. Washington, D.C.: American Association of State Highway and Transportation Officials.
33. Shaw, T. (2003). Performance Measures of Operational Effectiveness for Highway Segments and Systems. National Cooperative Highway Research Program Synthesis 311. Transportation Research Board, Washington, D.C.
34. Meyer, M. D. and E. J. Miller (2001). Urban Transportation Planning: A Decision-Oriented Approach, 2nd ed. McGraw-Hill, Boston.
35. National Research Council (U.S.) (2000). NCHRP Report 446: A Guidebook for Performance-Based Transportation Planning. National Academy Press, Washington, D.C.
36. Poister, T. H. (2004). Strategic planning and decision making in state departments of transportation: A synthesis of highway practice (NCHRP Synthesis 326). Washington, D.C.: Transportation Research Board.
37. Poister, T., D. Margolis, and D. Zimmerman (2004). Strategic Management at the Pennsylvania Department of Transportation: A Results-Driven Approach. Transportation Research Record 1885, Transportation Research Board of the National Academies, Washington, D.C. http://trb.metapress.com/content/k0v0nn40x7x3l371/
38. Cambridge Systematics, Inc., and David Evans and Associates (2001). ODOT Operations Program Performance Measures. Draft final report for Oregon DOT, June 2001.
39. Florida DOT (1997). Measures for Performance-Based Program Budgeting as Stated in the General Appropriations Act for FY 1997-1998. October 1997.
40. Gore, A. (1997). Balancing Measures: Best Practices in Performance Measurement. Washington, D.C.: The Partnership.
41. Hudson, W. R., R. Haas, and W. Uddin (1997). Infrastructure Management. McGraw-Hill, New York.
42. Sayers, M. (1995). "On the Calculation of International Roughness Index from Longitudinal Road Profile." Transportation Research Record, No. 1501.