
GEORGIA DOT RESEARCH PROJECT NUMBER RP 16-22
FINAL REPORT
MATERIALS QUALITY MANAGEMENT FOR ALTERNATIVE PROJECT DELIVERY
OFFICE OF PERFORMANCE-BASED MANAGEMENT AND RESEARCH
15 KENNEDY DRIVE
FOREST PARK, GA 30297-2534

1. Report No.: FHWA-GA-2018-1622

2. Government Accession No.:

3. Recipient's Catalog No.:

4. Title and Subtitle: Materials Quality Management for Alternative Project Delivery

5. Report Date: May 2018

6. Performing Organization Code:

7. Author(s): Baabak Ashuri, Ph.D., DBIA, CCP, DRMP; Yashovardhan Jallan; Jung Hyun Lee

8. Performing Organization Report No.:

9. Performing Organization Name and Address: Economics of the Sustainable Built Environment (ESBE) Lab, Georgia Institute of Technology, 280 Ferst Drive, Atlanta, GA 30332-0680

10. Work Unit No.:

11. Contract or Grant No.: P.I. No. 0015164

12. Sponsoring Agency Name and Address: Georgia Department of Transportation, Office of Performance-Based Management and Research, 15 Kennedy Drive, Forest Park, Georgia 30297-2599

13. Type of Report and Period Covered: Final; July 2016 to May 2018

14. Sponsoring Agency Code:

15. Supplementary Notes:

16. Abstract: This research provides a synthesis of state-of-the-art practices in quality management for highway construction projects delivered by alternative delivery methods, especially design-build. A design-build project delivery system is a significant change from the traditional design-bid-build system: a large number of roles and responsibilities are transferred by the state DOT to the design-builder, the size of the project is usually much larger, the cost and funding mechanisms are much more elaborate, the personnel qualifications and requirements have to be carefully reviewed at each stage, and several stakeholders are involved in the project. In light of these changes, several challenges were identified, along with best practices for handling them when implementing a quality management plan in the alternative delivery environment. Important areas that can be considered for enhancing the state of the practice for quality management in the alternative delivery environment are: (1) organizational structure; (2) acceptance approaches; (3) selection criteria and quality management plan; (4) working relationships with the FHWA; (5) independent assurance methods; (6) budgeting and cost mechanisms; (7) quality assurance software programs; (8) pay factor adjustment; (9) non-conformance reports; (10) responsible charge; (11) risk-based approach; and (12) independent engineer.

17. Key Words: Quality assurance, Quality assurance program, Quality management, Quality management plan, Quality assurance/quality control (QA/QC), Innovative project delivery, Design-build

18. Distribution Statement:

19. Security Classification (of this report): Unclassified

20. Security Classification (of this page): Unclassified

21. Number of Pages: 90

22. Price:


GDOT Research Project No. 16-22
Final Report
MATERIALS QUALITY MANAGEMENT FOR ALTERNATIVE PROJECT DELIVERY
Prepared by:
Baabak Ashuri, Ph.D., DBIA, CCP, DRMP
Yashovardhan Jallan
Jung Hyun Lee
Georgia Institute of Technology
Contract with Georgia Department of Transportation
In cooperation with U.S. Department of Transportation Federal Highway Administration
May 2018
The contents of this report reflect the views of the authors, who are responsible for the facts and the accuracy of the data presented herein. The contents do not necessarily reflect the official views or policies of the Georgia Department of Transportation or the Federal Highway Administration. This report does not constitute a standard, specification, or regulation.

TABLE OF CONTENTS

List of Tables .......... vii
List of Figures .......... viii
Executive Summary .......... x
Acknowledgments .......... xiv
Chapter 1 Introduction and Literature Review .......... 1
  1.1 Introduction .......... 1
  1.2 23 CFR 637 and FHWA Techbrief .......... 4
  1.3 Literature Review .......... 5
Chapter 2 Research Methodology .......... 8
  2.1 Overview .......... 8
  2.2 Discussion of Research Methodology Steps .......... 9
Chapter 3 Challenges in Developing and Implementing Quality Management Programs .......... 15
  3.1 Overview .......... 15
  3.2 Reluctance to Shift the Responsibility of Quality Assurance to the Design-Build Team .......... 16
  3.3 Contractor's Reluctance to Accept the New Role of QA in the DB Environment .......... 17
  3.4 Difficulty in Developing Appropriate Quality Management for the Alternative Delivery when Detailed Design and Actual Quantities Are Not Available .......... 18
  3.5 Difficulties in Developing an Adequate and Reliable Budget for Quality Management Tasks and Conducting Cost Control .......... 18
  3.6 Differences in Terminology Used by State DOTs for Quality Management in the Design-Build Environment .......... 20
  3.7 Lack of a Unified and Consistent Guidebook for Quality Management in the State DOT .......... 21
  3.8 Differences in Organizational Structure for Quality Management .......... 22
  3.9 Understanding New Roles and Responsibilities in Design-Build Projects .......... 22
  3.10 Independence of Quality Management Firms from the Design-Build Team .......... 23
  3.11 Need for Specialized Training: Requirements for the New Set of Skills and Qualifications for Working in the DB Environment .......... 24
  3.12 Need for an Appropriate Evaluation System to Evaluate the Qualifications of the Design-Build Team and Its Approach Toward Quality Management in the Procurement Phase .......... 25
  3.13 Lack of Familiarity with How to Use the Contractor's Samples in the Acceptance Procedure .......... 25
  3.14 Establishing and Maintaining Good Relationships with the FHWA to Ensure that State DOTs and the FHWA Are on the Same Page When It Comes to Evaluating the Project Quality .......... 26
  3.15 Lack of Flexibility and Scalability of Existing Quality Management Software Programs Mainly Designed for the DBB Environment .......... 26
Chapter 4 Strategies to Enhance Quality Assurance Programs .......... 28
  4.1 Organizational Structure for Quality Management in the Design-Build Environment .......... 28
    4.1.1 Georgia DOT (GDOT) .......... 30
    4.1.2 Virginia DOT (VDOT) .......... 31
    4.1.3 Texas DOT (TxDOT) .......... 32
    4.1.4 Minnesota DOT (MnDOT) .......... 33
  4.2 Acceptance Approaches .......... 34
    4.2.1 Approaches for Quality Acceptance .......... 35
    4.2.2 Decision Factors .......... 38
    4.2.3 Examples Describing the Choice of Acceptance Approaches by Different DOTs .......... 40
  4.3 Selection Criteria .......... 43
    4.3.1 RFQ Phase .......... 43
    4.3.2 RFP Phase .......... 44
    4.3.3 Quality Management Plan .......... 45
  4.4 Establishing and Maintaining Exemplary Working Relationships and Collaborations with the FHWA .......... 47
  4.5 Independent Assurance .......... 50
    4.5.1 Project Approach .......... 51
    4.5.2 System Approach .......... 51
    4.5.3 Mixed Approach .......... 51
    4.5.4 Summary of the Interview Findings about the State of the Practice in IA Approaches .......... 52
  4.6 Cost Mechanisms .......... 54
    4.6.1 Summary of the Interview Findings about the State of the Practice in Budgeting and Cost Control for Quality Management Tasks in the Design-Build Environment .......... 56
  4.7 Quality Assurance Software .......... 58
    4.7.1 Differences between DB and DBB Projects in Terms of QA Software Requirements .......... 58
    4.7.2 Desired Functionalities for QA Software Programs .......... 59
  4.8 Pay Factors .......... 64
    4.8.1 Overview .......... 64
    4.8.2 Summary of the Interview Findings about the State of the Practice in Pay Factor Adjustment for Quality in the Design-Build Environment .......... 64
  4.9 Non-Conformance Reports (NCRs) .......... 69
    4.9.1 Overview .......... 69
    4.9.2 Summary of the Interview Findings about the State of the Practice in the NCR Process .......... 72
  4.10 Responsible Charge .......... 77
  4.11 Risk-Based Approach .......... 78
  4.12 Independent Engineer (IE) .......... 81
Chapter 5 Conclusions .......... 84
References .......... 87

LIST OF TABLES
Table 1. States with Percentage-based Mechanism for Budgeting QA Services in Design-Build Projects .......... 57
Table 2. Line Items for Pay Factors .......... 66
Table 3. Material Tests for Pay Factor .......... 67

LIST OF FIGURES
Figure 1. Proper use of term "Quality Assurance" .......... 3
Figure 2. Quality assurance elements (FHWA 2008) .......... 3
Figure 3. Six core elements of QA .......... 5
Figure 4. Research methodology .......... 9
Figure 5. Overview of email interview .......... 11
Figure 6. An example of the recommended organizational chart of quality control system (recreated from NETTCP 2014b, Chapter 4 Quality Assurance for Design-Build Projects) .......... 30
Figure 7. Components and relationship in the CQAP (Georgia Department of Transportation 2014) .......... 31
Figure 8. Virginia DOT's Quality Assurance Management organizational chart (Virginia Department of Transportation 2012) .......... 32
Figure 9. Texas DOT's Quality Assurance organizational chart (recreated from Texas Department of Transportation 2016) .......... 33
Figure 10. Minnesota DOT's quality organization chart (recreated from Minnesota Department of Transportation 2014) .......... 34
Figure 11. Three acceptance approaches .......... 36
Figure 12. Typical organizational chart for a DB project involving a mixed approach (recreated from the GDOT Quality Assurance Plan) .......... 37
Figure 13. Distribution of state DOTs based on their acceptance approaches .......... 39
Figure 14. Consistency of acceptance approaches .......... 39
Figure 15. Decision factors of acceptance approaches .......... 40
Figure 16. Shortlisting criteria in RFQ phase .......... 44
Figure 17. Proposal evaluation in RFP phase .......... 45
Figure 18. Distribution of state DOTs based on their IA approaches .......... 52
Figure 19. Consistency of IA approach .......... 53
Figure 20. Mechanism for budgeting QA services in design-build projects .......... 57
Figure 21. Caltrans Construction Quality Assurance database architecture (adopted from California Department of Transportation 2015 and FHWA 2006) .......... 63
Figure 22. Implementation of pay factors .......... 65
Figure 23. Non-validation flowchart (Arizona Department of Transportation 2016) .......... 70
Figure 24. CDOT assessment report workflow (Colorado Department of Transportation 2017) .......... 71
Figure 25. Implementation of NCRs .......... 72
Figure 26. Audit Profile separated into Risk Quartiles - FDOT project .......... 80
Figure 27. TxDOT's owner verification levels for material testing validation .......... 81

EXECUTIVE SUMMARY
This research report on "Materials Quality Management for Alternative Project Delivery" explores the state-of-the-art practices in quality management for highway construction projects delivered by alternative delivery methods, especially design-build. The main objective of the research was to study the existing systems and procedures across various state Departments of Transportation (DOTs) and identify important trends, best practices, and recommendations. The research began with an in-depth analysis of the current literature: published academic papers; federal and state reports published by organizations such as the Federal Highway Administration (FHWA), the National Cooperative Highway Research Program (NCHRP), and the Northeast Transportation Training and Certification Program (NETTCP); and numerous state DOTs' presentations and other published records. Following the literature review stage, the research methodology continued with interviews and question-and-answer sessions with subject-matter experts across the country. These included email interviews, in-person site visits and discussions, telephone conversations and meetings, and presentations during relevant conferences, such as those of the Transportation Research Board (TRB) and the Design-Build Institute of America (DBIA).

The first half of the research aimed to identify the existing challenges in quality management procedures in the alternative delivery environment, with a strong focus on design-build (DB) projects. A design-build project delivery system is a significant change from the traditional design-bid-build (DBB) system. Among other differences, a large number of roles and responsibilities are transferred by the state DOT to the design-builder, the size of the project is usually much larger, the cost and funding mechanisms are much more elaborate, the personnel qualifications and requirements have to be carefully reviewed at each stage, and several stakeholders are involved in the project. In light of these developments, several challenges were identified:
- Reluctance of state DOTs to shift the responsibility of quality assurance (QA) to the design-build team
- Contractor reluctance to accept the new role of QA in the DB environment
- Difficulty in developing an appropriate quality management approach for alternative delivery when detailed design and actual quantities are not available
- Difficulty in developing an adequate and reliable budget for quality management tasks and conducting cost control
- Differences in terminology used by state DOTs for quality management in the design-build environment
- Lack of a unified and consistent guidebook for quality management in the state DOT
- Differences in organizational structure for quality management
- Understanding new roles and responsibilities in design-build projects
- Independence of quality management firms from the design-build team
- Need for specialized training: requirements for the new set of skills and qualifications for working in the DB environment
- Need for an appropriate evaluation system to evaluate the qualifications of the design-build team and its approach toward quality management in the procurement phase
- Lack of familiarity with how to use the contractor's samples in the acceptance procedure
- Establishing and maintaining good relationships with the FHWA to ensure that state DOTs and the FHWA are on the same page when it comes to evaluating project quality
- Lack of flexibility and scalability of existing quality management software programs mainly designed for the DBB environment

The second half of the research effort examined the state of the practice in state DOTs in order to identify best practices in handling the identified challenges when implementing a quality management plan for the alternative delivery environment. The results of the email interview process, together with a review of state DOTs' quality management manuals and design-build requests for qualifications (RFQs) and requests for proposals (RFPs), helped identify several important areas that can be considered for enhancing the state of the practice for quality management in the alternative delivery environment:

- Organizational structure for quality management in the design-build environment
- Acceptance approaches and decision factors in choosing the most appropriate acceptance approach for the design-build project
- Selection criteria and quality management plan
- Establishing and maintaining exemplary working relationships and collaborations with the FHWA
- Independent assurance methods (i.e., project approach, system approach, and mixed approach)
- Budgeting and cost control for quality management tasks in the design-build environment
- Quality assurance software programs
- Pay factor adjustment for quality in the design-build environment
- Non-conformance reports (NCRs)
- Responsible charge
- Risk-based approach
- Independent engineer (IE)

ACKNOWLEDGMENTS
The research reported herein was sponsored by the Georgia Department of Transportation through Research Project Number 16-22. The authors acknowledge and appreciate the help of Mr. Darryl VanMeter, GDOT Assistant P3 Division Director/State Innovative Delivery Administrator; Ms. Monica Flournoy, GDOT Office of Materials Administrator; and Mrs. Supriya Kamatkar, GDOT Research Program Manager.

Chapter 1 Introduction and Literature Review
1.1 Introduction
One of the fastest growing alternative project delivery methods used in the United States is design-build (DB). A study conducted by the Design-Build Institute of America (DBIA) indicates that since 2002, the number of transportation projects procured with DB in the United States has increased by 600%. In 2017, the DBIA announced that DB had been fully authorized in 27 states and the District of Columbia; only four state DOTs lack the authority to use DB in highway project delivery. DB contracting is becoming popular because of time and cost savings compared to the traditional design-bid-build (DBB) method. With traditional DBB, departments of transportation (DOTs) must handle separate contracts with a designer and a contractor, which limits the flexibility of executing construction work before the design is complete. In DB projects, by contrast, the owner signs a contract with a single entity acting as both designer and contractor. Coordinating the project schedule with this single team allows the design-builder to initiate construction work before the design phase is complete, which saves costs and reduces time. Despite these advantages, DOTs have faced several challenges in implementing DB contracting. More roles and responsibilities have shifted from DOTs to design-builders in this alternative method. In addition, since the engineer of record (EOR) works for the design-build team, the design-builder assumes the liability of performance in DB projects (a distinction from traditional DBB projects).
Quality is one of the basic criteria for project success; together with time and cost, it is referred to as the "iron triangle" (Atkinson 1999). Numerous articles, such as those of Munns and Bjeirmi (1996), Chan et al. (2002), Chan et al. (2004), and Songer et al. (2015), have identified and discussed this iron triangle of project success criteria. Quality assurance (QA) management is an essential area of concern for DOTs. If the material and workmanship fail to comply with specifications and contract requirements, early failure of the highway component can result (Hughes 2005). DOTs have realized that without well-established QA programs, projects will fail to comply with either material or construction specifications (Hughes 2005). Gransberg and Molenaar (2008) examined DB procurement packages and found that 23 of 60 requests for proposals (RFPs) did not clearly define roles and responsibilities. To obtain a quality product, DOTs need to clearly state quality-related roles and responsibilities in the contract documents (Gransberg et al. 2008a).
The federal code that governs QA procedures for construction (23 Code of Federal Regulations Part 637 Subpart B) requires each state highway agency (SHA) to develop a QA program for the National Highway System. This ensures that the materials and workmanship incorporated in every federally funded highway construction project conform to the approved plans and specifications of the project. However, QA practices vary from state to state, and the practice of one DOT may not be acceptable to other DOTs (Scott and Molenaar 2017). To reduce this inconsistency and clarify quality management, the Federal Highway Administration published quality management guidelines.
The transportation industry has moved away from the term "quality control/quality assurance (QC/QA)" (or "QA/QC") when referring to a quality assurance program. Some transportation agencies have historically used QC/QA to indicate that QC is a contractor's responsibility and QA is an agency's responsibility (see Figure 1). However, quality control is not a function separate from quality assurance; rather, QC is one of the core elements of a quality assurance program, and QA refers to the overall system for assuring project quality. In response to these changes, and to help clarify roles, responsibilities, and quality-related activities when DOTs use DB contracting, the Federal Highway Administration (FHWA) (2012) published a Techbrief titled "Construction Quality Assurance for Design-Build Highway Projects." The Techbrief recommends that DOTs use synthesized quality management programs that implement quality assurance as an umbrella term with six core elements (see Figure 2): (1) quality control (QC), (2) agency acceptance, (3) independent assurance (IA), (4) personnel qualification, (5) laboratory accreditation, and (6) dispute resolution.
Figure 1. Proper use of term "Quality Assurance"
Figure 2. Quality assurance elements (FHWA 2008)

1.2 23 CFR 637 and FHWA Techbrief
In 1995, Title 23, Code of Federal Regulations, Part 637 (23 CFR 637) allowed state highway agencies (SHAs) to use testing results from contractors (design-builders) in acceptance decisions if those results are verified by the agencies or their designated representatives (FHWA 2004). In 2004, the FHWA published a technical advisory that "provide[s] guidance and recommendations for the use and validation of contractor's test results for acceptance." In 2008, the FHWA published the Techbrief describing a QA program consisting of six core elements that contribute to the achievement of quality (FHWA 2008). Among the six core elements, QC, acceptance, and IA are the primary activities (see Figure 3). QC activities, including sampling, testing, and inspection, are performed by design-builders. Acceptance is defined as "all factors used by the agency (i.e., sampling, testing, and inspection) to evaluate the degree of compliance with contract requirements" (FHWA 2008, p. 3) and is the responsibility of the DOTs or their designated agents. IA provides an independent assessment of QC and acceptance activities, ensuring that all factors are accurate and that the testing equipment used in the program is functioning and remains calibrated (FHWA 2012). The remaining three elements support the QA program: to ensure the achievement of quality, qualified personnel should perform testing and inspection in a capable laboratory, and a dispute resolution system provides resolution of possible discrepancies between the QC data of design-builders and the acceptance data of highway agencies.
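The division of responsibility among the six core elements can be restated compactly in code. The sketch below is a minimal illustration, not an FHWA artifact; the class and field names are hypothetical, and the `Party` assignments simply encode the responsibilities described in this section for a typical DB project.

```python
from dataclasses import dataclass
from enum import Enum

class Party(Enum):
    DESIGN_BUILDER = "design-builder"
    AGENCY = "agency (DOT) or designated agent"
    PROGRAM_SUPPORT = "program-level support function"

@dataclass(frozen=True)
class CoreElement:
    name: str
    primary: bool          # QC, acceptance, and IA are the primary activities
    performed_by: Party

# The six core elements of a QA program per the FHWA Techbrief.
QA_PROGRAM = [
    CoreElement("quality control", True, Party.DESIGN_BUILDER),
    CoreElement("acceptance", True, Party.AGENCY),
    CoreElement("independent assurance", True, Party.AGENCY),
    CoreElement("personnel qualification", False, Party.PROGRAM_SUPPORT),
    CoreElement("laboratory accreditation", False, Party.PROGRAM_SUPPORT),
    CoreElement("dispute resolution", False, Party.PROGRAM_SUPPORT),
]

for e in QA_PROGRAM:
    role = "primary" if e.primary else "support"
    print(f"{e.name:26s} {role:8s} {e.performed_by.value}")
```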

Figure 3. Six core elements of QA
1.3 Literature Review
Although several DOTs implement DB contracting, they still maintain the traditional DBB contracting method. Traditional DBB and DB projects differ in several ways. The main difference lies in who holds the responsibility (Gransberg et al. 2008b). On DBB projects, DOT staff are mainly responsible for inspection, QA verification and acceptance activities, and independent assurance, whereas the contractor is usually responsible only for quality control. Compared to the contractor on typical DBB projects, contractors on DB projects are responsible for a larger scope of quality-related activities. DB teams (or their third parties) have the primary responsibility for quality management, including design and construction, while the DOTs perform QA oversight, verification testing, and independent assurance (Loulakis et al. 2015). Another difference between DBB and DB lies in the procurement process: contractors on DBB projects submit proposals after the design and all specifications are complete, whereas contractors on DB projects submit proposals before the design is complete. Therefore, DOTs must define and assign roles, responsibilities, and activities in the contract documents; however, when RFPs and contract documents for DB projects inadequately define the responsibilities for each activity, contractors submit proposals that do not meet the DOT's requirements.
The literature in the field of quality assurance contains numerous studies. Harman and Sillars (2013) applied a case-study approach to 10 transportation projects and provided insight into QA systems. They determined that innovative quality assurance methods, mostly used for alternative project delivery methods such as DB, may be a key factor for project managers to consider when they develop a whole quality assurance system. Kraft and Molenaar (2013) developed a quality assurance organization (QAO) process based on project specifications and identified five fundamental QAOs in the industry (Kraft et al. 2014). They continued their work in another paper (Kraft and Molenaar 2015) in which they identified 10 factors that influence the QAO selection process, including project size, project complexity, and schedule. Because of the scope of their decision-making process, the limited availability of project data, and the complexity of the topic, they used structured interviews and the Delphi method to explore the selection factors. Gransberg and Molenaar (2004) conducted a content analysis of 78 RFPs for public DB projects with an aggregate contract value of over $3.0 billion between 1997 and 2002, and identified six owner approaches for articulating owner requirements in the RFPs. Gad et al. (2015) continued this work and compared RFPs issued between 1997 and 2002 with those issued between 2009 and 2013. They found that state DOTs had become more cognizant of the importance of quality management in their DB contracting process. Gransberg et al. (2008a) compiled National Cooperative Highway Research Program (NCHRP) research report 376, highlighting quality assurance practices, approaches, and models. In NCHRP research report 838, Scott and Molenaar (2017) produced guidelines consisting of a framework that state DOTs can use when allocating QA resources.

With transportation project delivery practices continually evolving, the involvement of consultants and third-party firms has grown, and the roles and responsibilities for QA have diversified. The research team expects to determine gaps in practices between DBB and DB. Thus, the primary objective of this study was to identify current challenges associated with QA for DB highway projects and provide a synthesis of the current state of QA practices of state DOTs. The results indicate that responsibility for quality assurance is being transferred to design-build teams. Although the QA programs of the DOTs include the six core elements of QA suggested by the FHWA, each DOT treats those elements differently in its QA program. In addition, quality-related software and cost mechanisms vary from state to state. Some DOTs use a consistent approach to quality assurance management, while other DOTs change their approach depending on project size, staff availability, agency experience, and so forth. This research attempts to identify key differences in quality assurance practices and the critical drivers for selecting a quality management approach for DB highway projects.

Chapter 2 Research Methodology
2.1 Overview
Because of the nature of this topic, the researchers used a combination of methods. An overview of the research methodology is presented in Figure 4. The overarching objectives of this research were to: (a) identify the key challenges faced by state DOTs in developing and implementing an effective quality management approach in design-build projects, and (b) determine appropriate strategies to enhance quality management in the design-build environment. To achieve these objectives, the researchers took the following steps:
1. Conduct an extensive review of the academic and professional literature related to quality management for alternative project delivery.
2. Create open-ended questions for distribution via an initial emailed questionnaire.
3. Refine the questions by conducting a dry-run interview with selected subject-matter experts to ensure that the questions are clearly crafted and the anticipated responses reflect the intent of the research.
4. Distribute the questionnaire to subject-matter experts in state DOTs and follow up with them to receive as high a response rate as possible.
5. Determine the areas to prepare questions for follow-up phone interviews and/or emails.
6. Follow up with agencies that best responded to the initial questionnaire to conduct multiple rounds of structured interviews and/or emails.
7. Collect documents from state DOTs following the interviews/emails (e.g., design-build and public-private partnership [P3] manuals, state DOTs' quality management plans for design-build and P3 projects, requests for qualifications (RFQs) and requests for proposals (RFPs) of past and current design-build and P3 projects, and master contracts and related task orders with the owner's consulting firms offering quality management and construction engineering and inspection [CEI] services), and analyze the contents of these documents in several areas of interest, such as common practices in quality management organization and the quality assurance process.
8. Summarize and present in this research report the findings from all the information collected through emails, structured interviews, and content analysis.
Figure 4. Research methodology
2.2 Discussion of Research Methodology Steps
1. Conduct an extensive review of the academic and professional literature related to quality management for alternative project delivery: The main focus of the literature review task was to examine the current state of the practice in quality management among state DOTs that are actively using DB and DBB projects, and to identify key differences in quality management practices in DBB versus DB projects.

2. Create open-ended questions for distribution via an initial emailed questionnaire: The research team developed a set of initial questions as the first step toward better understanding the practice of quality management among state DOTs in the alternative delivery environment. The areas of focus for the initial questions were:
a. the main issues for successful execution of quality management in the alternative delivery environment,
b. the availability of quality management manuals for DB and P3 delivery systems, and
c. an overview of quality management organization for DB and P3 delivery systems.

3. Refine the questions through dry-run interviews with selected subject-matter experts to ensure that the questions are clearly crafted and the anticipated responses reflect the intent of the research: The researchers sent the questions to several innovative delivery subject-matter experts, such as the heads of the offices of innovative delivery programs in several state DOTs across the nation, in order to validate and refine the questions and decide on the best questions to use in the initial emailed questionnaire. The research team then used the refined set of questions to collect information about current practices of quality management in the alternative delivery environment.

4. Distribute the questionnaire to subject-matter experts in state DOTs and follow up with them to receive as high a response rate as possible: The email survey was sent to 40 state DOTs in the United States with active design-build programs, of which 27 state DOTs provided answers, a response rate of about 68% (see Figure 5).
Figure 5. Overview of email interview

5. Determine the areas to prepare questions for follow-up phone interviews and/or emails: The research team used more detailed questions in the follow-up interview/email phase to better understand the practice of quality management among state DOTs in the alternative delivery environment. The areas of focus for the follow-up questions were:
a. the relative significance of challenges for executing quality management in the alternative delivery environment,
b. further description of QA organizational models and the new roles and responsibilities required for QA in the DB and P3 delivery systems,
c. handling of quality management issues during the shortlisting and proposal evaluation phases,
d. approaches used by agencies for independent assurance and quality acceptance,
e. budgeting and cost control methods for quality management services,
f. methods to resolve conflicts related to quality issues and non-conforming products, and
g. functionality requirements for quality management software programs and database systems in the alternative delivery environment.
The researchers refined the follow-up interview/email questions through dry-run interviews with a few subject-matter experts in design-build organizations, including the abovementioned state DOTs, to ensure that the questions would help collect the information intended to be retrieved from the state DOT officials.
6. Follow up with agencies that best responded to the initial questionnaire to conduct multiple rounds of structured interviews and/or emails: Following the analysis of the initial emailed questions, the researchers identified the following 19 state DOTs for follow-up interviews: Arizona, Colorado, Connecticut, Florida, Georgia, Maine, Maryland, Massachusetts, Michigan, Minnesota, Missouri, Montana, Ohio, Oregon, South Carolina, Texas, Utah, Virginia, and Washington State. The selection was based on the quality and depth of answers to the survey questions, as well as the respondents' expressed interest in participating in the following research steps.
7. Collect documents from state DOTs following the interviews/emails (e.g., design-build and P3 manuals, quality management plans for design-build and P3 projects, RFQs and RFPs of past and current design-build and P3 projects, and master contracts and related task orders with the owner's consulting firms offering quality management and CEI services) and analyze the contents of these documents in several areas of interest, such as common practices in quality management organization and the quality assurance process: Participants in the follow-up interviews/emails provided several internal documents containing valuable information regarding the quality management plans of their alternative delivery programs. They also shared copies of their contracts with the owner's consulting firms that were assisting them in preparing quality management plans and conducting CEI services. These documents explain how the state DOT handles various aspects of quality assurance/quality control for design-build and P3 projects.
Content analysis was performed on the resources provided to: (a) understand state DOTs' main issues in handling quality management in the alternative delivery environment, and (b) identify and characterize different state DOTs' practices in developing and implementing quality assurance plans for design-build and P3 projects.
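As a simplified illustration of this kind of content analysis, the sketch below tallies occurrences of quality-related terms across a set of plain-text documents. It is a hypothetical aid, not the research team's actual instrument; the keyword list and the file names in the usage comment are invented for the example.

```python
import re
from collections import Counter
from pathlib import Path

# Hypothetical keyword list reflecting the six QA core elements.
KEYWORDS = [
    "quality control", "acceptance", "independent assurance",
    "personnel qualification", "laboratory accreditation", "dispute resolution",
]

def tally_terms(doc_paths):
    """Count keyword occurrences per document (case-insensitive)."""
    counts = {}
    for path in doc_paths:
        text = Path(path).read_text(errors="ignore").lower()
        counts[Path(path).name] = Counter(
            {kw: len(re.findall(re.escape(kw), text)) for kw in KEYWORDS}
        )
    return counts

# Example usage with hypothetical file names:
# print(tally_terms(["vdot_qa_manual.txt", "txdot_db_rfp.txt"]))
```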
8. Summarize and present in the research report the findings of all the information collected through emails, structured interviews, and content analysis: In the final step of the research methodology, the research team assembled all the work done in the earlier stages into a synthesis of the findings. The process ran from conducting an extensive literature review to find gaps in existing research and develop questions for subject-matter experts, to distributing the questionnaires over email and following up with these contacts over a protracted period on several pertinent issues, to performing content analysis on all the responses and documents shared by the interviewees; it was essential to compile this entire process and document the findings in a clear and lucid manner. Important industry practices and trends were identified while summarizing these responses and the available documents, and they are highlighted in the next two chapters.

Chapter 3 Challenges in Developing and Implementing Quality Management Programs
3.1 Overview
Alternative project delivery methods such as design-build provide the opportunity to expand the contractor's role in construction quality management beyond conventional quality control activities to include several of the QA tasks traditionally performed by DOT personnel. In accordance with 23 CFR 637 and the FHWA's Techbrief HRT-12-039, a comprehensive construction quality assurance program should consist of quality control, acceptance, independent assurance, dispute resolution, personnel qualification, and laboratory accreditation/qualification. Use of an alternative delivery method does not diminish the need to perform any of these functions; however, the party performing them may differ from the DOT's standard practices. Possible options include performance by the DOT, an independent evaluator, the contractor (with DOT verification sampling and testing), or some combination thereof. With the introduction of alternative delivery in transportation projects, the methods and procedures relating to quality management have undergone several changes that introduce new challenges for state DOTs in the efficient and effective execution of quality management. In addition, state DOTs have limited resources for keeping up with the demands of large-scale DB projects. This chapter presents the challenges identified in developing and implementing quality management for alternative project delivery.

3.2 Reluctance to Shift the Responsibility of Quality Assurance to the Design-Build Team

With project costs and sizes growing rapidly over the last couple of decades, it has become increasingly difficult for DOTs to deliver all their projects in a design-bid-build environment. This has led to the advent of alternative project delivery systems like design-build, in which a large portion of the project responsibilities is shifted to the design-builder. However, the fact that many responsibilities are now shouldered by the design-builder creates a challenge for some state DOTs, which may argue that this change has cost them the day-to-day control over the project that they had in the design-bid-build environment. A mental shift is needed for some DOT professionals to become accustomed to the new dynamics of design-build project delivery. It is critical for state DOTs to understand that extensive involvement in day-to-day quality assurance activities increases the risk that the design liability is shifted back to the agency from the design-builder. This runs counter to a main feature of the design-build project delivery system, which demands that the role of the engineer of record remain with the design-build entity.

The idea of transferring some responsibilities traditionally held by the state DOTs to the design-builder can be a big challenge. State DOTs transfer responsibilities to the design-builder to different degrees. Some DOTs require that the design-build team follow the state's official quality assurance manual, while others require the design-builders to present their own QA manual as part of the selection process. A wide variation is also seen in the roles of quality acceptance. Traditionally, all acceptance has resided with the state DOT, but with the introduction of design-build and other alternative project delivery methods, the responsibilities for acceptance have also shifted. Understanding the shift in the roles and responsibilities of the quality management team can be a source of challenge for some DOTs that may be afraid of losing control over the day-to-day activities of the project.
3.3 Contractor's Reluctance to Accept the New Role of QA in the DB Environment

In the traditional design-bid-build environment, the state DOT performs the quality assurance role and conducts inspections and testing to accept the contractor's work. Contractors are familiar and comfortable with the conventional QA process, especially since they know that liability is transferred to the owner once the work is accepted. This is aligned with the fact that the designer of record works directly for the state DOT, so the contractor does not assume any design liability risk. Some contractors have difficulty changing their mindsets when they work in the design-build environment, and accepting new roles in the quality assurance program for a design-build project can be problematic for the design-build team. The major issue that makes some design-builders uncomfortable is that the design and construction liability does not end immediately upon completion of a work element. Since the designer of record works directly for the design-builder, the liability for design remains with the design-build team; therefore, the contractor needs to be more cautious than ever to deliver the total project with the anticipated level of performance as outlined in the design developed by the design-build team.

Lack of adequate resources or trained personnel, and difficulty in changing mindsets, are the main challenges that some contractors face with the new QA model. Oftentimes, the fear of something new can also be a deterrent to trying newer approaches. Some contractors may be reluctant to change their traditional roles and responsibilities because they firmly believe the adage, "Don't fix what's not broken."

3.4 Difficulty in Developing Appropriate Quality Management for the Alternative Delivery when Detailed Design and Actual Quantities Are Not Available

Design-build contracts are lump-sum contracts based on a partially completed design. Detailed design and actual quantities of major line items are not available at the time the design-builder comes on board (i.e., the design-builder develops the cost estimate based on estimated quantities of line items that will change throughout detailed design development). This lack of detailed information about design elements and the actual quantities of line items makes it challenging for state DOTs and design-build contractors to define a quality management program. Traditionally, the QA program is defined in terms of the types of tests and their respective frequencies and timing, which can be quantified exactly using detailed design information in a design-bid-build project. The quality management program used for unit-price design-bid-build contracting needs to be revisited to accommodate the nuances of lump-sum design-build contracting. This change introduces challenges for DOTs and contractors that may find it difficult to work under uncertain conditions with incomplete design information.
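To make the dependence on quantities concrete, consider how a DBB-style testing plan is sized: each line item's quantity is divided by a specified sampling frequency to fix the number of acceptance tests. The sketch below uses hypothetical frequencies and quantities (not GDOT specification values) to show how unknown DB quantities propagate directly into unknown test counts.

```python
import math

# Hypothetical sampling frequencies: one test per N units of the pay item.
TEST_FREQUENCY = {
    "asphalt (tons)": 1000,      # e.g., one density test per 1,000 tons
    "concrete (cu yd)": 500,
    "embankment (cu yd)": 2000,
}

def planned_tests(quantities):
    """Number of tests per item = ceil(quantity / sampling frequency)."""
    return {
        item: math.ceil(qty / TEST_FREQUENCY[item])
        for item, qty in quantities.items()
    }

# In DBB, the quantities are known from the completed design...
print(planned_tests({"asphalt (tons)": 48000, "concrete (cu yd)": 9200}))
# ...but in DB, only rough estimates exist at award, so the test counts
# (and hence staffing and laboratory workload) carry the same uncertainty.
```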
3.5 Difficulties in Developing an Adequate and Reliable Budget for Quality Management Tasks and Conducting Cost Control

In the traditional design-bid-build environment, state DOTs are in charge of allocating adequate resources and budgets for required QA tasks. State DOTs use the actual quantities of the work items and apply their historical rates to develop a good estimate of the QA budget. This approach is inherently limited in the design-build environment, where detailed design and actual quantities for major line items are not available. Therefore, developing a reliable budget for design-build projects is a challenging task, considering the several unknown and uncertain factors involved in the QA process. This issue can become more challenging given that the design-build team will be in charge of conducting most of the quality management tasks in the alternative delivery environment. Some design-builders may not be as familiar as state DOT quality management personnel with identifying the required resources and estimating an adequate budget for QA tasks in the alternative delivery environment. Further, some DOTs may find it difficult to lack control over the design-build team's expenditure of project funds on quality management tasks.

In lump-sum design-build projects, design-builders are not typically required to identify a separate line item for quality management tasks. Some state DOTs are concerned that the design-build team may not have enough expertise to develop a reliable budget for QA tasks or may not allocate a satisfactory budget for performing them. For instance, Missouri DOT reported that for design-build projects where contractors are responsible for QA and its budgeting, the DOT reviews the amounts identified in the work breakdown structure and ensures that the contractor has adequate resources budgeted for QA before approving the final schedule and work breakdown structure. Another point to note is that all the states interviewed reported that they do not require their design-builders to spend at least a fixed percentage ("x%") of the project costs on QA. Most of these state DOTs are not interested in the exact budget details as long as the design-build team adheres to all QA tasks promised in the quality management plan for the alternative delivery environment.
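The budgeting difficulty can be expressed numerically: under DBB, the QA budget is roughly the sum over line items of quantity times historical rate, while under DB, the quantities are only estimates, so the budget is better treated as a range. The sketch below is illustrative only; the rates and the 20% uncertainty band are assumptions, not figures reported by any DOT.

```python
# Hypothetical historical QA rates in dollars per unit of work.
QA_RATE = {"asphalt (tons)": 0.80, "concrete (cu yd)": 1.50}

def qa_budget(quantities, uncertainty=0.0):
    """Return (low, high) QA budget; uncertainty=0 gives a point estimate."""
    base = sum(qty * QA_RATE[item] for item, qty in quantities.items())
    return (base * (1 - uncertainty), base * (1 + uncertainty))

est = {"asphalt (tons)": 48000, "concrete (cu yd)": 9200}
print(qa_budget(est))                   # DBB: quantities known, no band
print(qa_budget(est, uncertainty=0.2))  # DB: +/-20% quantity uncertainty
```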

3.6 Differences in Terminology Used by State DOTs for Quality Management in the Design-Build Environment

The FHWA Techbrief recommends that each state outline a quality assurance management plan for its design-build projects. The plan should contain elaborated details on all six core areas of construction QA. However, the research team noticed that the terminology currently used throughout the country contains several inconsistencies. This inconsistency is a major challenge that the transportation design-build industry faces in dealing with quality management issues on alternative delivery projects. The terminology used in the quality management manuals of some state DOTs, such as Texas DOT, is highly consistent with the FHWA Techbrief recommendations; all six core elements of the QA plan are discussed thoroughly in those DOT guidelines. However, other state DOTs (e.g., Virginia DOT) prefer their own conventions when it comes to quality management terminology. Although the technical terms used in these state DOT manuals differ somewhat from those used in the Techbrief, the principles behind the state manuals are consistent with the essence of quality management recommended by the Techbrief. These states do not use "quality assurance" as an umbrella term; rather, they follow the historic QA/QC terminology, which traditionally associates QA with the agency's role and QC with the contractor's role. It should be noted that in design-build projects, where several quality management responsibilities lie with the design-builder, QA can be erroneously taken to mean "quality acceptance" as opposed to "quality assurance."
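One practical mitigation is a terminology crosswalk that normalizes each agency's usage to the FHWA umbrella vocabulary before manuals are compared. The sketch below is hypothetical; the entries illustrate the QA/QC and "quality acceptance" ambiguities discussed above rather than any specific DOT's glossary.

```python
# Hypothetical crosswalk from agency-specific usage to FHWA umbrella terms.
FHWA_CROSSWALK = {
    "QA/QC": "quality assurance (umbrella program)",
    "QC/QA": "quality assurance (umbrella program)",
    "quality acceptance": "acceptance",
    "owner verification": "verification sampling and testing",
}

def normalize(term: str) -> str:
    """Map an agency term to the FHWA term; pass through if already standard."""
    return FHWA_CROSSWALK.get(term, term)

print(normalize("QA/QC"))                  # -> quality assurance (umbrella program)
print(normalize("independent assurance"))  # already standard; unchanged
```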

3.7 Lack of a Unified and Consistent Guidebook for Quality Management in the State DOT

State DOTs take different approaches to guidebooks and quality management manuals for projects delivered under alternative delivery. Some state DOTs have developed quality management manuals that design-builders are required to follow; these serve as the minimum requirement for quality management purposes on all their design-build projects. Other states tend toward a different approach in which the state DOT requests that the design-build team propose a quality assurance plan for the design-build project, which the state DOT evaluates during the selection phase. The transportation design-build industry faces a challenge in handling different quality management expectations while working with different DOTs across the nation.

The researchers also noticed that QA practice sometimes varies among different projects within the same state. An example is found in the response the researchers received from the Colorado DOT (CDOT) to one of the follow-up email questions:

"Historically on design-build projects CDOT has given the responsibility of both Quality Control and Quality Assurance to the contracting team. CDOT then performs Assessments on design and construction, as well as Independent Assurance materials testing. On the I-25 Ilex project CDOT chose to have the contracting team perform Quality Control, while CDOT retained the Quality Assurance program including Independent Assurance. It was decided that by retaining the QA CDOT would have more control and oversight of the work performed during design and construction."

3.8 Differences in Organizational Structure for Quality Management

The organizational structure for quality management in design-build projects is significantly different from that in design-bid-build projects. In DBB projects, the contractor is responsible only for quality control, while the owner is responsible for quality assurance, acceptance, and independent assurance. In the design-build landscape, the lines of reporting have changed, and roles and responsibilities have shifted from one stakeholder to another. The design-builder takes on an increasing role in the acceptance process, and the owner relinquishes some of the responsibility, and implicitly some of the liability, to the design-build team. Because of the size of some design-build projects, third-party firms are brought on board to act independently of the design-builder as quality verification and independent assurance firms. More stakeholders are involved in the organizational structure, the contractor takes on responsibilities it did not have previously, and the involvement of third-party firms is a new piece of the puzzle.
A clear understanding of the new organizational structure for quality management programs is an important goal for state DOTs and the transportation design-build industry. Managing relationships among the several parties involved in the quality management program is a challenge for state DOTs on design-build projects, and maintaining healthy working relationships is a necessity for the smooth execution of quality management tasks throughout project development.
3.9 Understanding New Roles and Responsibilities in Design-Build Projects

In conjunction with the previous section, an important hurdle introduced in the organizational setup of quality management in design-build projects is the involvement of new entities in the process, e.g., independent quality firms (IQFs) and owner's verification firms (OVFs). The IQF is usually hired by the design-builder with the consent of the owner and acts independently as a second line of acceptance. The IQF is assigned to verify that the quality control measures taken by the design-builder meet requirements, that the requisite material testing results are within the recommended guidelines, and that the personnel working on the site are qualified and correctly certified. The OVF is hired by the owner to assist in the verification process; it can sometimes be seen as the last layer of quality acceptance and can also be tasked with the roles and responsibilities of independent assurance.
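The engagement relationships described above can be captured as a small tree. The sketch below encodes who engages whom in a generic DB quality organization, following this section's description; it is a schematic aid, not a model of any particular DOT's organizational chart.

```python
# Edges read "A engages B"; derived from the relationships described above.
QUALITY_ORG = {
    "owner (DOT)": ["design-builder", "owner's verification firm (OVF)"],
    "design-builder": ["engineer of record", "IQF (with owner consent)"],
    "IQF (with owner consent)": [],         # verifies design-builder QC
    "owner's verification firm (OVF)": [],  # assists owner verification / IA
    "engineer of record": [],
}

def print_org(node="owner (DOT)", depth=0):
    """Print the engagement hierarchy as an indented tree."""
    print("  " * depth + node)
    for child in QUALITY_ORG.get(node, []):
        print_org(child, depth + 1)

print_org()
```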
Introducing these new players, and the various ways in which different state DOTs define their roles and responsibilities, may be problematic for some state DOTs and members of the transportation designbuild industry. Clear understanding of the responsibilities and accountabilities of the new entities is critical to the success of quality management in the designbuild environment. The existence of several players in the designbuild project should not be treated as a source of confusion but as a core strength of the quality management organization with several layers of checks and balances.
There might be a perception of a shift in acceptance liabilities from the designbuilder to the IQF and of independent assurance responsibility from the owner to the OVF. It is crucial to address this issue from the start of the project to avoid finger-pointing if problems do arise.
3.10 Independence of Quality Management Firms from the DesignBuild Team
The independent quality firm is typically hired by the designbuild team with the consent of the owner. The IQF is part of the designbuild team and is paid directly by the designbuild contractor. However, it is critical that the IQF have the ultimate authority to act in the best interest of the project and protect the owner's performance expectations in the designbuild project. Still, some
IQFs may find it challenging to act as an impartial agent, as the designbuild team may not hire them for future projects if they are very strict in their roles. Some IQFs may prefer to be hired directly by the owner and report directly to the owner. Also, some owners may find it difficult to become accustomed to the new arrangement in which the IQF works directly under the designbuild contractor. Most DOTs overcome this challenge through rigorous examination of the proposed quality management plan of the designbuild team to ensure that the right and most highly qualified IQF is selected to work on the project and has adequate power and authority to handle the quality issues in the designbuild project. It is worth noting that, ultimately, all players in the designbuild team, including the IQF, should work to satisfy the required expectations of the owner for the designbuild project.
3.11 Need for Specialized Training: Requirements for the New Set of Skills and Qualifications for Working in the DB Environment
Designbuild projects have created a need for qualified and certified personnel working for all the stakeholders involved in the project. With the growing size and complexity of these projects, effective communication and documentation have become extremely relevant. The need for specialized skills and qualifications has never been greater. The transportation designbuild construction industry is moving toward increased use of technology, and the challenge remains to adequately train and certify all quality management professionals working on the increasingly complex designbuild project. States have ramped up their personnel qualifications programs and require highly skilled quality management professionals as key personnel in the designbuild team. The FHWA has set up stricter guidelines for auditing quality management programs. Designbuild teams across the country are recognizing the need for a qualified and well-trained workforce.
3.12 Need for an Appropriate Evaluation System to Evaluate the Qualifications of the DesignBuild Team and Its Approach Toward Quality Management in the Procurement Phase
Some state DOTs treat quality as a selection factor in shortlisting designbuild teams and evaluating their proposals. Quality-related factors are separately rated and weighted as part of the selection and evaluation process. Other state DOTs do not explicitly rate quality-related factors as part of evaluating designbuild teams and their proposals; however, this does not mean that quality factors are unimportant in the selection of shortlisted teams and evaluation of designbuild proposals. These state DOTs still review the qualifications of the quality management personnel and provide feedback on the proposed quality management plans. Nevertheless, the lack of a unified approach to addressing quality issues can be a source of challenge for the transportation designbuild industry.
3.13 Lack of Familiarity with How to Use the Contractor's Samples in the Acceptance Procedure
The role of state DOTs in the quality management process of alternative delivery projects is transforming to oversight and independent assurance. State DOTs can utilize the samples taken by the designbuilder for accepting the quality of the designbuild project. However, state DOTs need to become familiar with a reliable verification and validation approach that can be rigorously implemented to ensure the results of the tests provided by the IQF are ready to be used in the acceptance decision. Understanding the principles of t-tests and F-tests (the statistical approach toward quality acceptance) is required for implementing a quality assurance program for the alternative delivery environment. Familiarity with the statistical analysis approach toward quality assurance may be challenging for some professionals in state DOTs and the transportation designbuild industry.
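As a minimal illustration of this statistical approach (a sketch only; the density values and significance level below are hypothetical, and agency specifications define the actual comparison procedure), the following Python snippet checks whether a set of IQF results can be validated against the owner's verification samples:

    # Sketch of the F-test / t-test check used to validate contractor (IQF) results
    # against the owner's verification samples before they enter the acceptance
    # decision. All values are hypothetical.
    from scipy import stats

    iqf = [92.1, 93.4, 91.8, 92.9, 93.0, 92.5, 91.9, 92.7]  # IQF results (% density)
    ov = [92.8, 91.5, 93.1, 92.2]                           # owner verification results
    alpha = 0.05                                            # set by the agency's QA spec

    # F-test: are the two sample variances statistically similar?
    f_stat = stats.tvar(iqf) / stats.tvar(ov)
    p_f = 2 * min(stats.f.cdf(f_stat, len(iqf) - 1, len(ov) - 1),
                  stats.f.sf(f_stat, len(iqf) - 1, len(ov) - 1))

    # t-test: are the two means statistically similar? Welch's form is used
    # when the F-test rejects equal variances.
    t_stat, p_t = stats.ttest_ind(iqf, ov, equal_var=(p_f >= alpha))

    validated = (p_f >= alpha) and (p_t >= alpha)
    print(f"F-test p = {p_f:.3f}, t-test p = {p_t:.3f}; "
          f"contractor data {'validated' if validated else 'NOT validated'}")

If both tests fail to detect a statistically significant difference, the contractor's data can be combined with the owner's data in the acceptance decision; otherwise the discrepancy must be investigated.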
3.14 Establishing and Maintaining Good Relationships with the FHWA to Ensure that State DOTs and the FHWA are on the Same Page When It Comes to Evaluating the Project Quality
When federal dollars are involved in a road construction project, state DOTs face the additional requirement of complying with the FHWA's guidelines. This is a challenge for the state DOTs, as they must not only make sure the project is delivered as planned but also constantly ensure that no federal rules and regulations are violated. The FHWA has unique expectations for a rigorous quality management system and its implementation in the designbuild environment. However, the FHWA resource offices across the nation may sometimes have different interpretations of the requirements for a rigorous quality management plan for a designbuild project. These differences in opinion can be a source of confusion for some state DOTs, as well as for the transportation designbuild construction industry. State DOTs have overcome this challenge by engaging the FHWA's regional offices in reviewing their quality management plans and providing comments to enhance their practices and ensure that they comply with all the FHWA's expectations. Typically, state DOTs invite the FHWA to their project meetings and keep the FHWA informed of the progress and major decisions during design and construction. The FHWA then has an opportunity to provide feedback and recommend necessary changes to the quality management plan before the designbuild project starts.
3.15 Lack of Flexibility and Scalability of Existing Quality Management Software Programs Mainly Designed for the DBB Environment
In most designbuild projects, the designbuilder provides quality assurance. Designbuild projects are generally much larger in size and complexity and involve more stakeholders than designbidbuild projects. Different groups performing quality management use many databases and software systems that are typically not integrated. The result is that, by continuing to focus primarily on entering data and not necessarily on retrieving data to draw valuable conclusions, DOTs may become data rich and information poor. Also, inherent differences between state DOTs in quality management make it difficult to employ a nationwide software approach in a timely manner. Accordingly, each state must employ its own approaches to address alternative delivery quality management needs. Several challenges have been identified when it comes to the current state of software programs available for designbuild projects. During the email interview process and in discussions with subject-matter experts, issues such as scalability and flexibility came to the forefront. It was reported that a central system for all the needs of a modern project is lacking. Real-time collaboration features that allow multiple stakeholders to work on a component of the job, usually in the cloud, are another area for possible improvement. Compatibility with legacy software systems and historical databases is a barrier to moving to a new software platform for quality assurance. Substantial investment costs to acquire licenses and train DOT personnel are other reasons that delay the implementation of new software platforms for quality management in the alternative delivery environment.
Chapter 4 Strategies to Enhance Quality Assurance Programs
Conducting quality assurance programs for designbuild highway projects presents new challenges for state departments of transportation. Highway agencies have experienced a decline in experienced staff and are concerned about their loss of direct control over day-to-day quality activities. The main objective of this chapter is to examine the state of the practice in state DOTs that are actively using designbuild and identify best practices in handling the identified challenges for implementing a quality management plan for the alternative delivery environment. The results of the research team's email interview process and review of state DOTs' quality management manuals and designbuild RFQs and RFPs help identify several important areas that can be considered for enhancing the state of the practice for quality management in the alternative delivery environment, as summarized in this chapter.
4.1 Organizational Structure for Quality Management in the DesignBuild Environment
The roles and responsibilities for various elements of the QA program differ significantly across states in the designbuild environment. The main difference between traditional DBB projects and DB projects lies in who holds the responsibility (Gad et al. 2015; Gransberg et al. 2008a). In traditional DBB projects, the contractor is responsible for quality control, ensuring that it delivers a project that complies with the contract drawings and specifications. According to the FHWA, the responsibility for quality acceptance and oversight lies with the highway agency or its representative, and third-party firms do not typically become involved in quality management. In DB contracting, quality control still lies with the designbuilder, as in DBB projects across the DOTs; however, the researchers found that the acceptance practices of the DOTs vary from state to state based on the level of owner involvement in acceptance, the involvement of third-party firms, and the increasing responsibilities shifted to the designbuilder.

Because of these changes in roles and responsibilities, the organizational structure for quality management in DB projects has changed and varies from state to state. New roles and responsibilities have changed the organizational structure in the DOTs. The Northeast Transportation Training and Certification Program (NETTCP) (2014a) has recommended an organizational chart for a quality control system (see Figure 6). The overall quality control system of the designbuilder is divided into two groups: (a) a design quality control system, and (b) a construction quality control system. Frontline design and construction QC is the responsibility of the designbuild team's production staff. The design production team develops the design, and an independent team performs design QC. The construction production team performs construction tasks, and a separate, formal construction QC team, independent from the production staff required to perform "frontline QC" activities, performs construction QC. The following subsections describe the organizational charts of quality management used by different DOTs.
Figure 6. An example of the recommended organizational chart of a quality control system (recreated from NETTCP 2014b, Chapter 4, Quality Assurance for Design-Build Projects)

4.1.1 Georgia DOT (GDOT)
Figure 7 shows the main entities involved in the construction quality assurance program (CQAP) for GDOT's designbuild projects. The construction quality assurance manager (CQAM) shall be a Georgia-licensed professional engineer and an employee of the contractor's quality assurance firm (CQAF). The CQAF staff at all levels must be adequately qualified and certified to perform their respective duties. The CQAP emphasizes the qualifications and necessary certifications for personnel at all levels on the project.
Figure 7. Components and relationship in the CQAP (Georgia Department of Transportation 2014)
4.1.2 Virginia DOT (VDOT)
Figure 8 presents the organizational chart for quality assurance management for VDOT. The QA/QC plan should detail the QA/QC organization's function, including the expected minimum number of full-time-equivalent employees with specific QA or QC responsibilities. The construction QA/QC plan should list, by discipline, the name, qualifications, duties, responsibilities, and authorities of all persons.
Figure 8. Virginia DOT's Quality Assurance Management organizational chart (Virginia Department of Transportation 2012)
4.1.3 Texas DOT (TxDOT)
Figure 9 shows the organizational chart from TxDOT. The designbuilder can hire a third party for quality control and shall hire the IQF. TxDOT can hire a third party for OV and IA, as well. Texas Department of Transportation (2011) states the following acceptance responsibilities:
"TxDOT or its designated agent will develop a comprehensive Owner Verification Testing and Inspection Plan (OVTIP)."
"The basis for the acceptance decision: Both the IQF's (performed by the DB Contractor) and OVF's (performed by TxDOT) testing and inspection results."
"DB Contractor's IQF shall be separate from the DB Contractor's QC program." "F-tests and t-tests will be used to analyze OV and IQF data." "IQF's testing and inspection results is a formal QC based on FHWA definition."
Figure 9. Texas DOT's Quality Assurance organizational chart (Recreated from Texas Department of Transportation 2016)
4.1.4 Minnesota DOT (MnDOT)
Figure 10 presents MnDOT's quality organization chart. The design quality manager is responsible for the full plan-check process. The construction quality manager is responsible for checking all preconditions to activities, not requesting MnDOT critical activity point (CAP) approval until ready, material oversight, spec book conformance, etc.

Figure 10. Minnesota DOT's quality organization chart (recreated from Minnesota Department of Transportation 2014). In the chart, the DB's quality manager reports to the executive committee members (ECM), not the project manager, and oversees the DB's design quality manager and construction quality manager.
4.2 Acceptance Approaches
Two responses from the first interview motivated the research team to investigate the different approaches toward the acceptance decision. For example, historically, Colorado DOT's (CDOT's) approach toward acceptance on DB projects is that the designbuilder is responsible for both QC and QA, and CDOT is responsible for assessment of design, construction, and IA. However, on the I-25 Ilex project, responsibility for QA shifted to the DOT. The designbuilder was only responsible for QC, similar to DBB projects, and CDOT performed QA and IA. The decision for the change in practice was made based on past experience with DB projects; CDOT wanted to have more control and oversight of the work performed, to handle non-compliant work in-house. The other example is Minnesota DOT's (MnDOT's) current acceptance practice. Ten years ago, MnDOT experimented with passing more responsibilities to the designbuilder. However, currently, MnDOT still performs most of the material testing on its DB projects as in its DBB approach. The designbuilder is only responsible for QC, and MnDOT is responsible for QA (not as comprehensively as in DBB, but it has `Critical Activity Point' checks). On the Americans with Disabilities Act (ADA) trial project, MnDOT had a high-quality DB team on board, so responsibilities for QC and QA lay with the designbuilder, and MnDOT performed a minimal oversight role. MnDOT mentioned that it is open to different models, as each may be appropriate.
The roles and responsibilities for various elements of the construction QA program significantly differ across states. The FHWA Techbrief states, "All acceptance activities must be carried out by the agency or their designated agent (i.e., consultant under direct contract with the agency) independent of the contractor. This does not preclude the inclusion of design-builder QC data, provided that the QC data are validated by the agency's independently obtained verification data" (p. 4). If the state DOT establishes a dispute resolution system, 23 CFR 637 also allows quality control sampling and testing to be used in the acceptance program. In traditional DBB contracting, the responsibilities of quality control lie with the contractor, who has the duty to ensure that the project delivered complies with drawings and specifications. The responsibility of quality acceptance and oversight lies fully with the owner or the owner's representative. In DB contracting, like DBB contracting, the responsibility of quality control lies with the designbuilder (the contractor).
4.2.1 Approaches for Quality Acceptance The following subsections define three approaches for quality acceptance in a DB project based on levels of owner involvement in acceptance (see Figure 11).
Figure 11. Three acceptance approaches
4.2.1.1 Traditional Approach (similar to the DBB approach)
The traditional approach is similar to the acceptance approach in DBB projects. The designbuilder is responsible for quality control, and the state DOT is responsible for all acceptance tasks. The state DOT or its consultant conducts all the verification sampling and testing. This approach is typically used by the Maryland, South Carolina, Montana, and Minnesota DOTs. For instance, the Colorado DOT (CDOT) traditionally allows designbuilders to take the lead responsibility for quality assurance on its DB projects, but on the $90M I-25 Ilex DB project, CDOT decided to retain the acceptance process completely in-house, similar to a previous case in which CDOT used the DBB approach on a problematic DB project. The regional transportation director for CDOT felt that the DOT would retain more control and, in the end, have a well-organized project.
4.2.1.2 Mixed Approach (typical DB)
In a DB project, the designbuilder is under contract to deliver a project complying with a quality standard specified in that contract. In this approach, the owner assumes an active role in carrying out acceptance duties on a daily basis. The owner may hire an owner's verification firm to act on its behalf. Usually, the QC data of the designbuilder and the testing data of the independent quality firm or contractor's quality assurance firm are taken into consideration in the acceptance
process. The goal of the department is to ensure that the project is being constructed in accordance with contract requirements. This approach can be broken down into two levels of acceptance, as shown in Figure 12. The first line of acceptance is usually performed by the designbuilder or a firm hired by the designbuilder with the consent of the owner. This firm is referred to as an IQF or a CQAF. State DOTs usually have veto power when it comes to the hiring of these firms by the designbuilder. The task of the IQF/CQAF is to perform inspection and oversight of on-site construction activities. To ensure that all work complies with the contract requirements, the IQF/CQAF performs regular sampling and testing.
Figure 12. Typical organizational chart for a DB project involving a mixed approach (Recreated from the GDOT Quality Assurance Plan)
After the first line of acceptance is performed by the IQF/CQAF, owner-performed acceptance is implemented by the state DOT itself or a representative directly hired by the state DOT as the second line of acceptance. In large designbuild projects, it is common for state DOTs to hire an OVF, which provides owner oversight, inspection, and testing. Typically, the OVF performs random sampling and testing at about 10% of the frequency of the testing done by the IQF/CQAF (see the sketch following this paragraph). The state DOT may choose to include the QC data of the designbuilder and/or the test data of the IQF/CQAF in its acceptance decision. The mixed approach to acceptance is typical for several state DOTs: Colorado, Missouri, Texas, Georgia, Washington, Maine, Michigan, and Oregon.
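A hypothetical arithmetic sketch of that verification rate (the planned test count and the lot selection below are invented for illustration):

    # Hypothetical illustration of the owner-verification rate: if the IQF/CQAF
    # sampling plan calls for one test per lot, the OVF retests roughly one lot
    # in ten, chosen at random.
    import math, random

    iqf_tests_planned = 240                      # e.g., planned IQF density tests
    ov_rate = 0.10                               # OVF frequency relative to IQF testing
    ov_tests = math.ceil(iqf_tests_planned * ov_rate)
    ov_lots = sorted(random.sample(range(1, iqf_tests_planned + 1), ov_tests))
    print(f"OVF retests {ov_tests} of {iqf_tests_planned} lots, e.g., lots {ov_lots[:5]}")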
The researchers found that the Massachusetts and Utah DOTs are open to both traditional and mixed approaches to acceptance.

4.2.1.3 Supervisory Approach
The owner is not actively involved in day-to-day acceptance activities in the supervisory approach. The designbuilder is primarily responsible for day-to-day acceptance activities, and the owner retains only minimal involvement in acceptance during a project. The liability may shift from the owner to the designbuilder during the project; however, the owner cannot assign acceptance decisions to the designbuilder. The typical acceptance approach of the Virginia DOT is probably most closely associated with the supervisory approach.

4.2.2 Decision Factors
The researchers found that DOTs differ when it comes to selecting the method of acceptance for their projects (see Figure 13). Some states prefer using a single approach, while other states decide their acceptance approach on a project-by-project basis. The results, shown in Figure 14, indicate that 10 of the 17 DOTs that responded to the email survey use a consistent acceptance decision approach, and the other seven DOTs (Colorado, Missouri, Minnesota, Utah, Ohio, Massachusetts, and Maryland) tend to change their acceptance decision approaches project by project.
Figure 13. Distribution of state DOTs based on their acceptance approaches
Figure 14. Consistency of acceptance approaches

Figure 15 shows the main factors the researchers identified as playing a role in this decision:

- Past experience with similar projects
- Capability and experience of the IQF, and the extent to which a DOT can rely on and trust the IQF
- Size of the project at hand
- Motivation to save on project cost and schedule
- Unique requirements, such as the constructability and complexity of the project
- Current availability and expertise of in-house QA personnel
- Involvement in the operations and maintenance (O&M) component of the project

Figure 15. Decision factors of acceptance approaches

4.2.3 Examples Describing the Choice of Acceptance Approaches by Different DOTs
Maryland DOT (MDOT): The typical approach for acceptance on DB projects in Maryland is the traditional approach. However, on the $2.4 billion Intercounty Connector project, the department did not have enough resources to provide quality management through its typical approach. Thus, it decided to include the designbuilder in the acceptance process.
Missouri DOT (MoDOT): For most projects, MoDOT follows the mixed approach for acceptance. On smaller projects, MoDOT has experimented with conducting acceptance in-house. For example, in the Safe & Sound project, the department decided to use in-house inspection personnel because of the availability of qualified inspectors throughout different parts of the state, which led to cost savings. On larger projects, while the designbuilder performs both quality control and acceptance, MoDOT retains only the responsibility of quality verification.
Ohio DOT (ODOT): The Ohio DOT implements its low-bid DB projects with in-house acceptance and verification. However, on its first two major DB projects, each with project costs of more than $200 million, ODOT adopted a mixed approach for acceptance, in which an IQF conducted both design and construction quality management tasks. Based on ODOT's experience, the IQF worked more effectively in the design role than in the construction role, but ODOT had to hire the IQF for construction due to limited staffing resources. On a designbuildfinanceoperatemaintain (DBFOM) project that ODOT was working on at that time, which required the developer to maintain the project for 35 years, the department adopted the supervisory approach. Since ODOT's risk on this type of project was lower than on typical DB projects because of the long-term contract, the department felt that spending heavily on quality assurance was not efficient.
Massachusetts DOT (MassDOT): The Massachusetts DOT determines acceptance approaches based on specific risk areas of a project, the history of performance, and the credibility of the designbuilder on board. If the QC activities of the designbuilder are lacking, MassDOT increases its level of review and inspection.
Utah DOT (UDOT): UDOT also decides whether to use an IQF or perform QA in-house based on factors such as project size and in-house resource availability.
Texas DOT (TxDOT): TxDOT uses the mixed approach for the acceptance program that includes tests by the DB firm with validation from the department. The main reason for TxDOT's decision is project schedule. TxDOT does not want to slow down the contractor or affect the contractor's schedule. Considering its limited resources, TxDOT believes it cannot keep up with the pace of construction if it decides to perform all the acceptance tests.
South Carolina DOT (SCDOT): SCDOT currently performs the majority of QA testing on DB projects using the same testing requirements and frequencies as its designbidbuild projects. In the past, SCDOT did have the contractor provide QA, with reduced frequencies for SCDOT inspectors; however, this was changed years ago in coordination with the FHWA.
Washington State DOT (WSDOT): WSDOT uses the mixed approach for its acceptance program, in which designbuilders perform QC and QA on DB projects. WSDOT has not varied from this model. The department performs quality verification (QV), audits the designbuilder's quality assurance program and IA, verifies that QA and QV testers are following testing procedures correctly, and verifies testing equipment.
Maine DOT: Maine DOT uses the mixed approach. It typically uses the same quality assurance specification on all designbuild projects and modifies its approach for unique project features.
Montana DOT (MDT): The DB team is responsible for QC during the construction phase, and the Montana DOT is responsible for QA and IA. This approach does not change the roles and responsibilities. The quality management plan (QMP) required on a DB project provides a more formal QC plan, including the DB team's roles and responsibilities, compared to a DBB project.
Oregon DOT: On the small number of DB projects that the Oregon DOT has completed, it has used the same DB quality management approach, which includes oversight and surveillance/auditing of the contractor's independent quality management of the project. The Oregon DOT does not change its quality management approach for individual DB projects.
4.3 Selection Criteria
This section explores the importance placed on quality assurance management in the procurement/selection phase for the designbuild team. Typically, the selection of a DB team comprises a two-step process: a qualifications phase followed by a proposal evaluation phase. Through the structured interview process, the researchers received responses from 12 state DOTs regarding the emphasis they put on quality assurance during the RFQ and RFP phases.

4.3.1 RFQ Phase
Figure 16 shows the consideration of quality assurance in the RFQ phase among state DOTs. Nine out of 12 states consider quality as a selection parameter. The figure illustrates the position of different states on various other factors. Although only two DOTs consider past experience with the quality management plan, 9 of 12 DOTs deem the quality manager to be key personnel. The overall quality management approach of the designbuilder is also part of the evaluation criteria, and half of the respondents evaluate the designbuilder's approach toward quality management. Complying with the six core elements of QA, some DOTs also consider inspector, technician, and testing lab accreditation as selection criteria.
Figure 16. Shortlisting criteria in RFQ phase

4.3.2 RFP Phase
Figure 17 shows the consideration of quality assurance in the RFP phase among state DOTs. Ten out of 12 states consider quality as a selection criterion in the proposal phase. The Colorado and Minnesota DOTs do not use quality as a selection parameter during either the RFQ or RFP phase, while the other state DOTs place a strong emphasis on quality in their designbuilder selection process. Four DOTs still require the designbuilder to submit the statement of qualifications (SOQ) in the RFP, even though it has already been evaluated in the RFQ phase. The figure also illustrates other quality-related issues, such as whether the QMP is considered part of the technical proposal. Five out of 12 DOTs responded that they evaluate the designbuilder's QMP. Half of the respondents indicated that the detailed QMP is required to be submitted after award. Although the QMP is not part of their evaluation in the RFP phase, some DOTs require the designbuilders to submit the detailed QMP if the project is awarded.
Figure 17. Proposal evaluation in RFP phase

4.3.3 Quality Management Plan
According to the responses to the initial interview the researchers conducted, one of the significant differences between DBB and DB projects is that some DOTs require the designbuilders to submit their own QMPs. The researchers followed up with emails to investigate the development of the QMP. The QMP is not evaluated during the RFP process by MnDOT, SCDOT, or WSDOT; however, those DOTs require submission of the detailed QMP after award. In its evaluation, the Ohio DOT scores and rates the identified strengths and weaknesses of the draft quality management plan and overall quality approach as compared to the requirements of the scope of work. Based on the follow-up email interviews and reviews of the RFPs, in general, designbuilders need to submit the final (or first draft) QMP after award, usually within 15 to 30 calendar days of a notice to proceed (NTP). No construction work may be started without the approval of the department. The following detailed responses were reported by state DOTs regarding the timing of submitting the QMP for review and approval:
Ohio DOT mentioned that the selected firm will submit the initial QMP within 15 business days for the department to approve or reject. For subsequent revisions to the QMP, the department requires 10 business days to approve or reject the submission.
Missouri DOT also allows 15 business days after the NTP, but the department requires the final written design and construction QMP.
MnDOT requires the designbuilder to submit the quality manual for MnDOT's approval as a condition of its Notice to Proceed 2 (NTP2) and will respond within 20 working days of receipt of the quality manual.
WSDOT requires the designbuilder to submit a draft QMP within 30 calendar days of NTP.
Some state DOTs provide a QMP that can be used as a template to assist the designbuild team to develop its own QMP for the project. However, some DOTs, such as Michigan DOT, do not have their own quality manual to provide to the designbuilder. The QMP submitted by the designbuilder is expected to meet the minimum requirements as outlined in the state DOT's QMP template or contract documents. For instance, MnDOT has developed a Quality Manual (QM) template to aid the designbuilder with development of the QM for the project. These documents may not include all processes and procedures required for the project. The department allows modification and enhancement of these documents as necessary to provide an overall comprehensive quality manual for the project. The contractor may provide a QM developed independently, but it must cover all the topics contained in MnDOT's Quality Manual Template and meet all requirements of the contract. WSDOT also provides a "Quality Management Plan Outline." The designbuilder may either use all or part of the QMP Outline or make changes to meet its own quality approach. On the other hand, Ohio DOT does not provide a quality
management manual, so the designbuilder needs to develop, implement, and maintain a quality management program covering all elements of the project, including management, administration, design, geotechnical investigations, construction, testing, environmental monitoring, and compliance.
If the QMP is not in compliance with DOT requirements, the QMP can be modified and enhanced by the state DOT or the designbuilder as needed throughout the project. If a systematic problem is found regarding compliance with the Department's specifications and materials manual, state DOTs may participate in the development and modification of the QMP. Missouri DOT mentioned that the submitted QMP (the sampling, testing, and reporting of all materials) may be modified when it is not in compliance with the Missouri DOT Specifications and Materials Manual. MnDOT agrees that the quality manual and its implementation can be revised when either the contractor or MnDOT identifies a systematic problem. Ohio DOT requires the designbuilder to engage the department in the QMP development to facilitate approval and ensure understanding of requirements. In addition, Ohio DOT indicated that participating or providing inputs does not waive the responsibility of the designbuild team for meeting the expected quality of the work, nor does it ascribe any responsibility to the department for the work. Further, this involvement does not preclude subsequent rejection of the QMP by the department.
4.4 Establishing and Maintaining Exemplary Working Relationships and Collaborations with the FHWA
When federal dollars are involved in a road construction project, the state DOTs are faced with an additional challenge of complying with the FHWA's guidelines. In the structured interviews, the researchers asked the state DOT personnel to share the strategies their agency uses to establish a good working relationship with the regional FHWA office regarding quality management. The
responses to this question unanimously suggested that maintaining a regular channel of communication was the key to ensuring a good working relationship. State DOTs, such as CDOT, Ohio DOT, TxDOT, MDT, MoDOT, and Oregon DOT, invite the FHWA to their project meetings and keep it informed of the progress and major decisions during design and construction. SCDOT allows the FHWA to propose changes and corrections to the contractor's QC plan and considers resolving any concerns that arise. The following examples of state DOT responses provide further elaboration on how DOTs establish and maintain exemplary working relationships and collaborations with the FHWA:
Response from Colorado DOT Innovative Contracting Program Associate "The I-25 Ilex project is a Project of Division Interest (PoDI) for the FHWA. FHWA was involved during the procurement of the project and remains active during design and construction. The key to keeping a good working relationship is keeping FHWA informed of the progress of the project and major decisions made during design and construction. The project's quality management plan was implemented smoothly by everyone working as one team, not a contracting team and an owner's team."
Response from SCDOT Assistant Construction Engineer "FHWA partners go out of their way to ensure positive working relationships, so this is a shared goal. FHWA is intimately involved with their PoDIs and most designbuilds are on the SCDOT PoDI list. FHWA has the opportunity to comment on the Contractor's QC Plan before SCDOT issues our [its] approval and SCDOT does our [its] best to resolve any of their [FHWA's] concerns."
Response from Missouri DOT DB Program Associate "Missouri DOT has an outstanding working relationship with our [its] Federal Partners. The Missouri FHWA regional office has been a co-champion of our DesignBuild process and has ultimately become part of our Team. Missouri DOT allows FHWA to be present during much of the procurement and delivery of the project. Respectful communication and understanding of what everyone feels is important [and] has been a real key to our success in DB."
Response from MnDOT DB Program Associate "This has not been an issue for MnDOT for a while. In the early days, the FHWA was very interested, but it has not been much of an issue over the past 10 years (particularly since we took material testing back)."
Response from Texas DOT Comprehensive Development Agreement (CDA) Program Associate "TxDOT has had a strong relationship with the FHWA that works very well. TxDOT developed the current quality assurance program (QAP) using a workshop of contractors, laboratory consultants, district engineers and the FHWA's person. Therefore, the FHWA representative was there from the beginning."
Response from Maine DOT Materials Testing and Exploration Associate "Maine DOT has always had a good working relationship with Maine Division FHWA partners. The FHWA's Engineering Team Leader provided feedback during the development of Maine DOT's QA specification for DB projects."
4.5 Independent Assurance
The independent assurance (IA) program provides an independent verification of the reliability of all data used by the DOT in the acceptance determination. IA ensures that the sampling and testing activities performed by the DOT and the designbuilder are conducted by qualified personnel using proper procedures and properly calibrated and functioning equipment (FHWA 2012). 23 CFR 637 states that "[e]ach IA program shall include a schedule of frequency for IA evaluation. The schedule may be established based on either a project basis or a system basis. The frequency can be based on either a unit of production or on a unit of time." The DOT is responsible for IA, which is usually conducted by the DOT itself or a designated agent directly contracted by the department. The FHWA Techbrief (2012) also suggests: "For agencies that do not routinely include QC test results in the acceptance determination, using this approach on DB projects may create new challenges for the IA system. The designbuilder may not be familiar with IA requirements. The need for the designbuilder QC staff to cooperate with IA personnel should be clearly stated in the DB contract." Per 23 CFR 637.209, "all personnel performing sampling and testing for QC used in the acceptance decision, verification, or IA are required to be qualified. And laboratories operated by a designated agent of the agency that are used for IA or dispute resolution must be accredited by AASHTO, through a comparable program approved by the FHWA, or by an accreditation body approved by the National Cooperation for Laboratory Accreditation." The FHWA (2011a) defines three IA procedures, as discussed in the subsections below.
4.5.1 Project Approach
This IA procedure is the traditional approach, in which the frequency of IA testing and sampling is set on a project basis. In general, DOTs use a frequency of 10 percent of the verification and acceptance testing.
4.5.2 System Approach
An alternative approach is to decide the IA testing frequency on a time basis for all testers and equipment. The general idea is to proctor all the testers and equipment over the period of a year. This approach can help ensure that most testers are reviewed and the same testers are not reviewed continually.
4.5.3 Mixed Approach
It is permissible to separate the verification of equipment and personnel. The underlying consideration of this approach is that some tasks are better suited to the project approach and should be reviewed at a certain fixed frequency rate, no matter the tester or the equipment. Other tasks are more dependent on equipment quality and personnel capability; these tasks should be reviewed using a system approach. Together, this is called the mixed approach. One method to check equipment is to require a calibration and inspection frequency. Personnel can be checked by sending out proficiency samples. Some test procedures and/or some testers are covered by a project approach, while the remaining procedures are covered by a system approach. A comparison sketch follows.
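The sketch below contrasts the two frequency bases in simplified form; the quantities, tester and equipment identifiers, and the 10 percent rate applied are illustrative assumptions, not any DOT's program:

    # Project approach: IA frequency tied to production on a single project.
    acceptance_tests = 500                 # hypothetical verification/acceptance tests
    print("Project approach:", round(acceptance_tests * 0.10), "IA observations")

    # System approach: IA frequency tied to time; every tester and piece of
    # equipment is covered at least once over the evaluation period (a year here).
    testers_and_equipment = ["tester-101", "tester-102", "tester-103",
                             "gauge-A", "gauge-B"]
    schedule = {who: f"month {m}" for m, who in enumerate(testers_and_equipment, start=1)}
    print("System approach:", schedule)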
4.5.4 Summary of the Interview Findings about the State of the Practice in IA Approaches
Question 1: In your opinion, which independent assurance approach is adopted by your DOT in designbuild projects?
Ten state DOTs responded to the survey question about the IA approach. Half of the respondents use the project approach (see Figure 18). Three of the 10 answered that they use the system approach. One DOT mentioned that its approach is similar to the mixed approach. MnDOT does not implement a separate IA function, since it performs QA services in-house and finds this approach more cost and time efficient.
Figure 18. Distribution of state DOTs based on their IA approaches

Question 2: Is this something that changes project by project?
Most DOTs use a consistent IA approach throughout their designbuild programs (see Figure 19). Eight of the 10 DOTs answered that their approach does not change based on project type. WSDOT only changes the IA approach from project to project based on the number of nonconforming issues (NCIs) reported. SCDOT mentioned that if there are issues on a particular project, the Office of Materials and Research (Central Lab) will visit the project and study the issues in depth with on-site staff. Maryland DOT generally follows the same approach on all projects; however, it could change for specific project needs.
Figure 19. Consistency of IA approach

Question 3: What are the decision-making factors that go into making this determination?
Several reasons were identified by state DOTs in deciding the most appropriate IA approach for the designbuild environment, as summarized below. For instance, WSDOT uses the project approach because it allows the department to stay at the QV (quality verification) level on all projects while utilizing available resources effectively. CDOT selected this method for the ease of coordination with its regional lab (essentially the same as on a designbidbuild project). On the other hand, Missouri DOT switched to the system approach several years ago: the former project-based approach simply conducted the test for every specified volume of material on a project, and it was found that Missouri DOT would perform duplicate tests of the materials. Under the current system-wide approach, Missouri DOT proctors the testers to ensure they are using proper procedures in sampling and testing, and the same testers are not repeatedly proctored. The IA program of TxDOT is established using the system approach, based on the evaluation of the qualified sampling and testing personnel and testing equipment. The merit of the system approach is efficiency balanced against the quality of personnel and equipment, because this approach bases the frequency of IA activities on time, regardless of the number of tests, quantities of materials, or number of projects tested by the individual being evaluated.
Only one DOT mentioned that its approach to IA is similar to the mixed approach. Oregon DOT feels the traditional per-project basis provides more efficient oversight and control of project construction. Not all technicians in the certification program are evaluated, but only those individuals directly involved with the project. Oregon DOT's IA program is mostly based on a quantity-per-project frequency that is similar to a project approach. Test results are then analyzed according to QA program language and project criteria. At the same time, Oregon DOT uses a system approach for the technician certification program, though based on a 5-year period versus annual evaluations. Also, the lab certification process uses a system approach based on an annual evaluation. Therefore, both systems are utilized but with a different application.
MnDOT commented that it performs QA in-house and does not have a separate IA function. MnDOT no longer hands over the QA material testing on most of its DB projects. "MnDOT does not have large and national DB firms bidding on most of our [its] projects (to lead the way into this) and the local industry has had troubles converting." Therefore, "for materials specifically, MnDOT is doing QA and not IA."
4.6 Cost Mechanisms
Budgeting and funding QA in a DB project is a key factor to ensure that sufficient resources are available to conduct all the QA tasks as required by the project. To get an insight into the key factors that influenced the decision-making for the choice of a funding mechanism for QA, the researchers sent out the following three-part question in a follow-up survey related to cost mechanism:
- What is your mechanism for budgeting QA services in designbuild projects?
- Is the QA budget based on a set percentage (like 3%-4%) of total project cost?
- Or is it broken down based on different types of tasks that need to be performed as part of QA services?

Seven out of 11 DOTs answered that their QA budget is based on a percentage (approximately 3%-4%). The Missouri and South Carolina DOTs develop the cost estimate based on the tasks required. The Oregon and Colorado DOTs do not budget quality-related costs separately; instead, they include them in the overall project budget, as shown in Figure 20.
The second question the survey asked the state DOTs regarding the budgeting issue for quality tasks was the following:
Is the designbuilder required to spend at least an 'X' amount or an 'X' percentage toward quality management tasks?
Of the 7 DOTs that follow a percentage-based method, the overall responses indicated that they are not very concerned with making sure that a certain 'X' amount has been spent on quality management. The general trend observed was that the DOTs are more concerned that the designbuilder allocates sufficient resources to ensure that the required tasks are conducted properly. Missouri DOT reported that it may be desirable to make sure that the designbuilder spends a certain 'X' amount or percentage in future projects.
The final two-part question related to budgeting and cost management for quality tasks was the following:
- How does the DOT ensure that the allocated budget is spent correctly?
- Do DOTs check the number of work-hours, hourly rates, invoices, etc., that a contractor spends on different tasks under the quality management services?

The research team found that most DOTs are concerned with performance; as long as performance is ensured, they are not too particular about how it is achieved. However, one common trend among DOTs was not observed here, as the DOTs provided varied answers to this question.

4.6.1 Summary of the Interview Findings about the State of the Practice in Budgeting and Cost Control for Quality Management Tasks in the DesignBuild Environment
Question 1: Typically, what is your mechanism for budgeting QA services in designbuild projects? Is the QA budget based on a set percentage (like 3%-4%) of total project cost? Or is it broken down based on different types of tasks that need to be performed as part of QA services?
The researchers found three approaches to the QA funding mechanism based on the received responses. Figure 20 shows that 7 out of 11 DOTs answered that the QA budget is based on a percentage of the designbuild contract, approximately 3% to 4%. Table 1 elaborates the detailed cost mechanisms applied by these DOTs. The Missouri and South Carolina DOTs mentioned that they develop the QA cost estimate based on the tasks required for project quality. The Oregon and Colorado DOTs do not budget quality-related costs as a separate line item; instead, they include them in the overall project budget.
Figure 20. Mechanism for budgeting QA services in designbuild projects

Table 1. States with Percentage-based Mechanism for Budgeting QA Services in DesignBuild Projects

State DOT      Approximate Percentage of Project Cost Spent on Quality Assurance
Maine DOT      3% of the project cost
MnDOT          Usually 3-4% of the project cost; historically about 3.3%, recently 3.8%
WSDOT          6% +/- for the designbuilder to provide QA/QC; 1-4% for WSDOT to perform quality verification and independent verification
Montana DOT    QA services are budgeted within CEI* costs, which are set at 10%
TxDOT          3% for quality assurance and 0.75% for IA
Ohio DOT       4% of total cost
Maryland DOT   Based on historical percentages across capital programs

*CEI = Construction Engineering and Inspection
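As a worked example of the percentage-based mechanism in Table 1 (the contract value below is hypothetical; the percentages are taken from the TxDOT row):

    contract_value = 150_000_000      # hypothetical DB contract value, $
    qa_pct, ia_pct = 0.03, 0.0075     # 3% QA and 0.75% IA, per the TxDOT row above
    print(f"QA budget: ${contract_value * qa_pct:,.0f}")    # $4,500,000
    print(f"IA budget: ${contract_value * ia_pct:,.0f}")    # $1,125,000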

Question 2: Do you require the designbuilder to spend at least an 'X' amount or an 'X' percentage toward quality management tasks?
The 7 out of 11 DOTs that follow a percentage-based method responded that they are not very concerned with making sure that a certain 'X' amount has been spent on quality management. The most important factor is to make sure that the designbuilder achieves the quality requirements in accordance with the QMP developed for the project. The general trend observed was that the DOTs are more concerned about the tasks being done correctly. Missouri DOT reported that, in the future, it may want to make sure that the designbuilder spends a certain 'X' amount or percentage on quality management in DB projects.
Question 3: How does your DOT ensure that the allocated budget is spent correctly? Do you check the number of work-hours, hourly rates, invoices, etc., that a contractor spends on different tasks under the quality management services?
The researchers found that most DOTs are concerned with performance; as long as performance is ensured, they are not too concerned about how quality is achieved. One common trend was not observed here, as the DOTs provided varied answers to this question. TxDOT and Maine DOT make sure that the correct number of tests is performed and that the inspection level meets the specifications of the contract. Ohio DOT and SCDOT verify this through performance.
4.7 Quality Assurance Software
4.7.1 Differences between DB and DBB Projects in Terms of QA Software Requirements
This section investigates the unique requirements of quality management software systems for DB projects compared to those for DBB projects. In most DB projects, the designbuilder provides quality assurance. DB projects are generally much larger in size and complexity and involve more stakeholders than DBB projects. Different groups performing quality management use several databases and software systems that are typically not integrated. The result is that, by continuing to focus primarily on entering data and not necessarily on retrieving data to draw valuable conclusions, DOTs may become data rich and information poor (California Department of Transportation 2015). Considering these key differences, unique features of QA software packages should be taken into account for successful implementation of quality management tasks in DB projects.
4.7.2 Desired Functionalities for QA Software Programs
The NCHRP report, "Guidelines for Optimizing the Risk and Cost of Materials QA Programs" (Scott and Molenaar 2017), found that several agencies are moving toward more custom-built systems or modified off-the-shelf tools. The custom-built systems must have the flexibility needed to ensure compatibility with the variety of materials management systems in use. To gain insight into the key functionality requirements for QA software programs, the researchers reached out to state DOTs in a follow-up interview related to quality management software programs. A summary of the desired functionality features is as follows:

- Centralized system for all needs, with ease of use
- Ability to be used in a collaborative environment
- Compatibility with legacy software (especially software still in use for DBB projects and most familiar to other DOT offices)
- Appropriate levels of access for all stakeholders

The FHWA (2012) identified high-level functionality requirements for an effective QA/QC software program. The following attributes are determined to be ideal features of an effective QA database (a minimal sketch follows the list):

- Access by all departments and personnel
- Different levels of access security
- Ability to make user group assignments
- Audit and tracking information to trace users
- Offline use or wireless device access
- Ability to generate system outputs and ad hoc reports
- Common referencing system and interface with other systems
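A minimal sketch of how several of these attributes (role-based access security, user tracing through an audit trail, and a common project reference) might map onto a single test record; the schema, role names, and values are hypothetical, not any DOT's or vendor's design:

    from dataclasses import dataclass, field
    from datetime import datetime

    ROLES = {"viewer": 1, "technician": 2, "qa_manager": 3, "admin": 4}

    @dataclass
    class TestRecord:
        project_id: str                 # common referencing system across systems
        material: str
        result: float
        audit_log: list = field(default_factory=list)  # tracking info to trace users

        def update(self, user: str, role: str, value: float) -> None:
            # different levels of access security: only technician and above may edit
            if ROLES.get(role, 0) < ROLES["technician"]:
                raise PermissionError(f"{user} ({role}) may not edit test results")
            self.audit_log.append((datetime.now().isoformat(), user, self.result, value))
            self.result = value

    rec = TestRecord("PROJ-001", "HMA density", 92.4)
    rec.update("jdoe", "technician", 92.6)
    print(rec.result, rec.audit_log)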
Detailed responses from several state DOTs regarding the desired features for an effective QA software program provide a better understanding of the most important requirements from the perspective of state DOTs for the DB program, as provided below.
Required Features for the QA Software Program: South Carolina DOT's Perspective (response from SCDOT Assistant Construction Engineer):
Ease of making and tracking comments: This is used to make and track comments on the contractors' plans and shop drawings. This feature replaces the need to make comments in spreadsheets and meet very frequently to make sure the comments are understood by all.
Simple and clear communication: It is desirable to see the design review comments right below marked-up PDF plan sheets.
Saving time: Design review meetings are now held less frequently and often via conference call instead of in person, which saves significant time for everyone.
Required Features for the QA Software Program: Washington State DOT's Perspective (response from WSDOT DB Program Associate): Washington State DOT is flexible in the selection of the software system for the designbuild project as long as all requirements are met for assessing the quality of the project. Procurement documents indicate that the designbuilder must provide the quality and construction management/document control system and train WSDOT staff on how to use it.
Required Features for the QA Software Program: MnDOT's Perspective (response from MnDOT DB Program Associate): MnDOT is relatively low-tech in regard to QA. However, two issues are noted in the responses to the interview question:
"MnDOT recently modified our [its] RFP template to have a clearer list of all of the project deliverables that contractors and overseers can both use as a checklist."
"MnDOT also recently hired another person in my unit whose job--in part--will be to make certain that QA is running smoothly."
Required Features for the QA Software Program: Caltrans' Perspective (California Department of Transportation 2015): The main challenge that Caltrans faces is that the department has a number of database and software systems that serve specific quality assurance functions for different groups; however, these individual databases are not integrated. The result is that, by continuing to focus primarily on entering data and not necessarily on retrieving data to draw valuable conclusions, Caltrans may become data rich and information poor. A more efficient, comprehensive, user-friendly database system is needed that can link the existing individual databases.
The following recommendations are made as desired features of QA software programs to mitigate the "Data Rich Information Poor" problem:
- Full capability in a single system electronically
- Ease of use:
  o All data need to be searchable, lockable, and easily printed or shared
  o The software should be able to generate reports of any kind
  o A portal for diaries with the ability to attach pictures and videos of observations
- Collaboration capability for document reviews
- Status updates and email notification for submittal reviews, sample results, etc.
- Encapsulation of the process for design reviews, shop drawing reviews, inspection, QC, QA, etc.
- Compatibility with legacy software:
  o Ability to replicate the project file structure of the DOT
  o "e-construction" to be used for documentation and auditing purposes
- Trend analysis within a DB or DBB project, as well as between DB and DBB projects
- Scalability for project size and complexity

Figure 21 shows Caltrans' proposed software architecture for an efficient and effective QA software program.
DATA TRANSLATION
MODULE
Figure 21. Caltrans Construction Quality Assurance database architecture (adopted from California Department of Transportation 2015 and FHWA 2006) The proposed system in Figure 21 consists of four modules as per the FHWA Techbrief, "Guidelines for Establishing and Maintaining Construction Quality Databases," Publication No. FHWA-HRT-07-020: 1. "Database input module: This module provides an interface to record all information relevant to each construction project from project initiation to the final acceptance of the construction.
2. Database management module: The QA management module uses input data to analyze the quality of construction in the project.
3. Data translation module: This module provides the tools to translate data for compatible communication with other systems within a standard format, such as XML. This module also acts as a communication channel with other systems to provide the desired QA testing data to other databases.
4. Database server module: The database server module forms the core of the architecture and stores system data. All other modules connect with the server through the Internet."
4.8 Pay Factors
4.8.1 Overview
The FHWA recommends implementing pay factor adjustments for DB projects. In principle, the adjustment should not differ between lump-sum DB projects and unit-price DBB projects. Since a DB project can be thought of as a compressed DBB project, the lump-sum items still need to be sampled and tested according to the material requirements of 23 CFR 637 on all federally funded FHWA projects. The DOT's proposed pay factor strategy must be approved by its local FHWA Division Office.
4.8.2 Summary of the Interview Findings about the State of the Practice in Pay Factor Adjustment for Quality in the Design-Build Environment
Question 1: Do you implement quality pay factors in design-build projects?
Figure 22 shows that 8 out of 11 DOTs implement pay factors for design-build projects. State DOTs track the schedule of values and installed quantities, submitted by the design-builder, to calculate the adjustment amount. There are two approaches to determining the unit price of the line items subject to pay factor adjustment. In the first, the design-builder is required to provide unit costs for the hot mix asphalt (HMA) items as part of the establishment of its work breakdown structure. In the second, DOTs insert special provisions identifying the unit price used to calculate incentives/penalties. Several state DOTs apply pay factor adjustments using a change order process.
Figure 22. Implementation of pay factors
Question 2: On what line items do you consider pay factor adjustment?
In traditional design-bid-build projects, several state DOTs adjust payment for line items based on the levels of quality criteria, e.g., asphalt pavement and concrete structure line items. Most DOTs still implement pay factors on DB projects for Portland cement concrete pavement (PCCP), HMA, or both. Table 2 lists the line items subject to pay factors at the eight DOTs that use them. Two of those eight DOTs applied pay factors for PCCP only, and another two applied pay factors for HMA only. The remaining four DOTs implemented pay factors for both PCCP and HMA.
Table 2. Line Items for Pay Factors

State DOT       Portland Cement Concrete Pavement   Hot Mix Asphalt
Arizona DOT     Yes                                 No*
Maryland DOT    No*                                 Yes
MnDOT           Yes                                 No*
Ohio DOT        No*                                 Yes
SCDOT           Yes                                 Yes
CDOT            Yes                                 Yes
CTDOT           Yes                                 Yes
UDOT            Yes                                 Yes

No*: Email answers and/or RFPs do not indicate pay factors.

Question 3: What materials tests do you use for pay factor adjustment?
Most DOTs emphasize quality achievement for PCCP and HMA, and pay factors are applied based on the results of material testing. Table 3 lists the types of material testing to which pay factors are applied, as reported by respondents. Three of the six DOTs that implement pay factors for PCCP consider concrete thickness a critical factor. Concrete strength, concrete smoothness, and concrete air void content are also material tests to which pay factors are applied. For HMA, asphalt binder and pavement density are weighted to adjust payment based on the levels of quality achievement by four of six DOTs. The Maryland and Utah DOTs provided detailed pay factor items, which also consider pavement marking paint.

Table 3. Material Tests for Pay Factors
Line items and types of material testing reported by the Arizona, Maryland, Minnesota, Ohio, South Carolina, Colorado, Connecticut, and Utah DOTs:
Portland Cement Concrete Pavement (PCCP): concrete strength, concrete thickness, concrete smoothness, concrete air void content
Hot Mix Asphalt (HMA): aggregate base, asphalt binder, asphalt mixture, pavement density, pavement surface profile adjustment
Other: pavement marking paint
The following examples of responses provide further elaboration on how pay factor adjustment is handled by different state DOTs.

Response from Maryland DOT P3 Project Associate "The design-builder is required to provide unit costs for the HMA items as part of the establishment of their [its] work breakdown structure. The quantities are tracked and the adjustments would be made just like a design-bid-build project."
Response from Colorado DOT Innovative Contracting Program Associate "CDOT keeps track of those adjustments that are non-conforming to the contract requirements and that are not approved by the contractor's EOR [Engineer-of-Record]. At the end of the project the best way to retain those reductions can be addressed in a change order."
Response from Connecticut DOT Office of Research and Materials Associate "For the limited number of design-build projects that have been administered, these adjustments are provided in the specifications as well. These adjustments are processed as change orders."
Response from Ohio DOT Alternative Project Delivery Associate "Schedule of values and quantities placed must be submitted. Prices are then adjusted by change order. The same is true except [that] for price-based adjustments Ohio DOT uses state average unit prices for similar materials for projects awarded in the same month."
Response from South Carolina DOT Assistant Construction Engineer "SCDOT maintains a database of all bids received. SCDOT uses average low bids for the past year or two for each item to estimate the unit prices. We admit the unit prices are not perfect because they do not have to be; we are just reducing the size of the risk for the contractor to normalize bids. We do not update the prices very often and I do not think we have had to adjust for a specific project before."
Response from WSDOT DB Program Associate "The unit prices are based on an analysis of actual bid item prices received statewide by WSDOT for these items during the previous two-year period. The unit prices are fixed and do not vary by project or region but they are reviewed and updated every two years to coincide with the latest version of the Standard Specifications."
4.9 Non-Conformance Reports (NCRs)
4.9.1 Overview
Regarding the question, "Does FHWA recommend a process for non-conforming materials and workmanship, or are the DOTs free to decide their own process?" the FHWA answered that the ultimate resolution to the NCR should be documented and that the owner (or agency) should retain oversight/approval authority over that resolution. An NCR contains details of the work that is non-conforming. Elements of an NCR include the observed reason for the non-conformance and the detailed remedial actions proposed to achieve conformance with the contract requirements. A typical NCR process is as follows: the non-conforming product is reviewed in accordance with documented procedures, and one of the following decisions is made about the NCR element: (a) rework to meet the specified requirements, (b) rework in accordance with a department-approved rework procedure, (c) regrade for an alternative application, or (d) reject or scrap.
Figure 23 shows the flowchart of an example NCR process currently practiced by the Arizona DOT (ADOT). When the results of the OV test do not statistically validate the test results of quality acceptance, ADOT and the IQF jointly investigate the non-conformance. In addition to the need to validate the non-conformance, the material in question needs to be assessed to determine whether it can be left in place or has to be removed, reworked, or repaired. If the material in question is to remain,
this material needs to be evaluated using the process provided in the QMP. Engineering judgment can be used to determine whether the material will perform its intended purpose (Arizona Department of Transportation 2016).
Figure 23. Non-validation flowchart (Arizona Department of Transportation 2016)
Figure 24 describes an example of how the Colorado DOT handles non-conforming work. CDOT follows an eight-step process to assess and deal with any possible non-conformance that arises. Steps one through four require the 'technical assessor' to locate the assessment database and go through the task checklists to determine whether the work conforms to the contractual requirements. In the event that a non-conformance is found, CDOT has a three-level process for dealing with it. The first level of non-conformance is the NC-3, which states that CDOT will notify the contractor that work has been found to be non-conforming and that the contractor is
expected to promptly communicate with the DOT on fixing the issue. The second level of non-conformance is the NC-2, which occurs if the degree of non-conformance is more severe than an NC-3 but still not critical. The work is allowed to keep moving, but special attention is paid to the non-conforming element; if it is not fixed in a timely fashion, it can be elevated to NC-1. The last level of non-conformance is the NC-1, which is highly critical: the work is stopped and an immediate response is expected from the contractor. Safety issues may be involved, and it is usually of a serious nature. Once the non-conformance is addressed, CDOT's technical assessor reevaluates the assessment report and ensures that all the contractual obligations have been met. Once this is achieved, the assessment report is ready to be closed out.
Figure 24. CDOT assessment report workflow (Colorado Department of Transportation 2017)

Figure 25 shows that 6 out of 11 DOTs implement NCR procedures. An NCR is used only if there is an IQF working for the contractor and the contractor's tests are used for acceptance and as the basis of payment. DOTs expect the quality assurance manager (QAM), who is an employee of the IQF and handles non-conformance, to issue most NCRs. The QAM is responsible for obtaining resolution of NCRs. If the IQF fails to issue an NCR, the state DOT can still do so. DOTs have reported that contractors will go to great lengths to avoid receiving an NCR because they do not want NCRs on their records, which may be used for future consideration in shortlisting and proposal evaluation. It is important for state DOTs to educate design-builders about the NCR process and assure them that it is aimed at streamlining the resolution of quality issues on the project.
Figure 25. Implementation of NCRs
4.9.2 Summary of the Interview Findings about the State of the Practice in the NCR Process
The following examples of responses to the interview questions provide further elaboration on how NCRs are handled by different state DOTs.
Question 1: Is NCR part of your DOT's quality management plan for design-build projects?
Response from Maine DOT Materials Testing and Exploration Associate "Maine DOT does not require a formal NCR, but any failing tests or deficient work must be documented and brought to the attention of the Department. Also, the Department will notify the DB quality manager of any deficient items discovered through DOT testing/inspection."
Response from MnDOT DB Program Associate "MnDOT plans to use NCRs only for non-conformances that are not repaired and MnDOT will use Corrective and Preventative Action Plans (CPAPs) to document necessary process improvements related to repaired/temporary non-conformances."
Response from Arizona DOT Assistant Construction Engineer Arizona DOT does not use NCRs on DB projects, only on P3 projects. A quote from Arizona DOT's P3 agreement states: "When OV test results do not statistically validate the Quality Acceptance test results, ADOT and IQF jointly investigate the source of non-validation. In addition to the need to investigate the non-validation, the material in question must be immediately evaluated to determine if it can be left in place or has to be removed, reworked, or repaired."
Question 2: Who issues NCR? Can you describe the roles and responsibilities in the NCR process?
Response from Colorado DOT Innovative Contracting Program Associate "The Owner can and the contractor's Independent Quality Control Firm [can issue NCR]. It is important to keep track of those NC's or NCR's, that way [the DOT is well informed] when the
contractor is asking for payment on that [an] item that is not in conformance or in the process of being put in conformance or accepted by their [DOT's] EOR prior to payment on that item."
Response from WSDOT DB Program Associate "Non-conformance reports (NCRs) are written and logged by the design-builder. Non-conforming Issues (NCIs) are written by the owner's representatives from their verification observations during the course of [the] design-build contract. NCIs are logged by the design-builder just as NCRs and both require the same process for closure or resolution."
Response from Utah DOT Preconstruction Engineer "Quality Assurance Manager, who is an employee of IQF, handles Non-conformance."
Response from Missouri DOT DB Program Associate "Missouri DOT utilizes a couple of different approaches. NCRs are issued by the Contractor's Quality Staff. Missouri DOT can also issue 'Non-Conformances' through a variety of different platforms. The Quality Manager will generally issue NCRs, but anyone from the Quality [management team] are able to issue them. The Quality Manager will be responsible for obtaining resolution to NCRs. Missouri DOT will review the NCRs for conformance to the Contract."
Response from Texas DOT Comprehensive Development Agreement Program Associate "The IQF hired by the DB firm. If they don't [it doesn't] then TxDOT does."
Response from MnDOT DB Program Associate "MnDOT expects that the Contractor's Quality Manager (QM) will issue most NCRs unless the issue comes up through unusual channels (and MnDOT would have to). In
practice, MnDOT usually writes them unless MnDOT can guilt the QM into doing it. In the future, MnDOT hopes to see QMs writing more CPAPs as MnDOT get[s] further from the 'scary' NCR term."
Response from Arizona DOT Assistant Construction Engineer "IQF is responsible for documenting any non-conformance work. It is the Developer's job to provide a solution for the non-conformance. IQF verifies and accepts the job when that solution is implemented."
Response from Maine DOT Materials Testing and Exploration Associate "Maine DOT does not have [a] formal process, but NCRs would be managed by [the] DB Construction Quality Manager."
Question 3: How is the NCR reviewed, handled, and resolved? If you have any materials that describe the NCR process in your DOT, would you please share them with us to enhance our understanding?
Response from Maine DOT Materials Testing and Exploration Associate "Non-conforming product shall be reviewed in accordance with documented procedures, and if required: A. Reworked to meet the specified requirements; B. Reworked in accordance with a Department-approved rework procedure; C. Regarded for alternative applications; or D. Rejected or scrapped. Repaired and/or reworked product shall be re-inspected in accordance with the CQMP [Construction Quality Management Plan] and/or other documented procedures."
Response from Utah DOT Preconstruction Engineer "The QMP will identify the process for responding to all Non-Conformances. The nonconformance remediation process will include a report which clearly describes the element of Work that is non-conforming, the reason for the non-conformance, and details [of] the remedial actions proposed to achieve conformance to the Contract requirements. The proposed remediation shall be approved by the Department prior to the Work. The remedial actions employed will undergo the same level of inspection and testing as required for the original Work."
Response from Missouri DOT DB Program Associate "Missouri DOT prefers the Contractor to be responsible for Quality. Missouri DOT will do audits on the Quality staff to ensure they are checking the right items and issuing NCRs if necessary. Missouri DOT will also inspect the work to ensure it's all right. Safe and Sound (S&S) Project is relatively similar, with either party able to issue an NCR, and both parties concurring in resolution. S&S differed somewhat in the logistical challenge--and Missouri DOT had their [its] inspectors performing measurement/testing functions on each site, working within the Contractor's quality program."
Response from Texas DOT Comprehensive Development Agreement Program Associate "NCRs are recorded and NCR logs are reviewed at weekly quality meeting until resolved by removing or repair. The Engineer of Record recommends or approves repair. TxDOT will also apply noncompliance points for recurring or unresolved NCRs."
Response from Arizona DOT Assistant Construction Engineer "IQF verifies and accepts the job when that solution is implemented."
4.10 Responsible Charge
The 2011 FHWA memorandum on 'Responsible Charge,' dated August 4, 2011 (FHWA 2011b), noted that the STA (state transportation agency) must provide a full-time employee to be in "responsible charge" of the project. This requirement applies even when consultants are providing construction engineering services to the owner. The following duties are assigned to the responsible charge (FHWA 2011b):
Be accountable for the project
Be heavily involved in administration of the federal-aid project
Be familiar with day-to-day operations, including safety
Be involved in the decision-making process
Make regular visits to the project
Review project finances
Direct project staff, both in-house and consultants, to perform their duties
The memo did not preclude sharing of these duties and functions among a number of public agency employees. The memo listed the roles and responsibilities of the person appointed as the responsible charge on a DOT project. Some of these roles and responsibilities include "administering project activity, construction quality and project scope, maintaining familiarity with day-to-day project operations including safety, participation in decision making process and regular visits and reviews of the project such that it is commensurate with the magnitude of the project."
Staffing shortages and falling levels of employment in state DOTs can often lead to the question of whether a single state DOT employee can act as responsible charge on more than one project at
a time. In 2011, the FHWA released a memorandum discussing this issue. The memo clearly stated that "The regulations also do not preclude one employee from being responsible charge of several projects and directing project managers assigned to specific projects." The FHWA regulations require the person acting as responsible charge to be a "full-time employed state engineer." Each state makes its own determination of the qualifications of the state engineer who is in responsible charge of the project(s). For example, Caltrans requires a P.E. license for the person who will be in responsible charge of the project.
4.11 Risk-Based Approach
A risk-based approach toward QA is an emerging practice for design-build projects, especially design-build projects with an extended warranty period and design-build-finance-operate-maintain projects. A complete construction project consists of thousands of tasks, not all of which are of the same level of importance and criticality. Some current quality management practices can be disproportionate to the level of assurance actually required for the product or task. For example, the concrete used for a curb or a sidewalk on a service road does not warrant the same level of material testing as the concrete used to construct an interstate highway.
With constant pressure to deliver more with the same budget and a declining workforce, state agencies are increasingly looking to adopt techniques that ensure quality with limited resources. The objective of an effective risk-based approach toward quality assurance is to optimize the resources and personnel assigned to a particular product/task based on its criticality to the overall project.
One of the recent examples of implementing a risk-based approach to quality management is the I-4 Ultimate project in Florida under FDOT. The I-4 Ultimate project is a P3 project involving a $3.8B concession agreement with a term of 40 years, which consists of a $2.323B contract for
design and construction (Construction Period). The construction oversight services (COS) consultant for FDOT has developed a Risk Based Audit Plan (RBAP) to identify specific risks in the project and rate them based on certain criteria. Risk is quantified using the following four parameters: probability of occurrence (P), consequence of occurrence (C), detectability or discovery of occurrence (D), and history of performance (H).
Risk score = P × C × D × H
The values of these parameters for different work elements were jointly developed by FDOT and the COS consultant during workshops. Probability of occurrence (P) deals with the chance of a certain work element not meeting the contractual quality parameters. Consequence of occurrence (C) deals with the severity of the consequence of a particular work element failing. Detectability or discovery of occurrence (D) relates to how easy or hard it is to identify that the particular work element is not performing up to the quality specifications. History of performance (H) is based on the historic frequency of a particular task/work element failing and the consequences attached to it.
The concessionaire was not involved in this process. Based on the RBAP, the work elements are classified into four risk quartiles: 'Very High', 'High', 'Low', and 'Very Low'. As the terms suggest, work elements that have a very high risk associated with them are classified as 'Very High', and so on. FDOT expects 50% of monthly audits in the 'Very High' quartile, 30% in the 'High' quartile, 10% in the 'Low' quartile, and the remaining 10% in the 'Very Low' quartile. Using the four quartiles as weights, random work elements are selected for quality audit and inspection. Close observation of Figure 26 reveals that different work elements are divided into the quartiles based on their risk propensity. As an example, the work element labeled 'Install and Test Drilled Shaft' is assigned
a 'Very High' risk quartile, and the work element 'Construct Curb and Gutter STA 2520-3030 Lt. Keller Road (Ph. 1-3)' is assigned a 'Very Low' risk quartile.
Figure 26. Audit profile separated into risk quartiles (FDOT project)
Another example of risk-based quality assurance and inspection is observed in the practices of the Texas Department of Transportation (TxDOT 2016). In its quality assurance program manual for design-build projects with three optional 5-year-term maintenance agreements, three levels of inspection and testing are described, as shown in Figure 27. Level 1 is the highest level of testing and oversight; it involves F- and t-tests and split-sample evaluations, with the owner's verification firm performing tests at a frequency of 10% of the independent quality firm's. Level 2 involves independent verification and testing at a lower rate than Level 1. Finally, Level 3 recommends observation verification and review of the IQF's testing and operations.
Figure 27. TxDOT's owner verification levels for material testing validation
Different work elements are assigned a risk level by TxDOT. As an example, tests for 'Plasticity Index' for both embankments and retaining walls merited a Level 1 classification, while tests for 'Moisture/Density' were assigned a Level 3 classification. It was also observed that, as the length of the O&M component increased, more and more work elements shifted from Level 1 toward Level 3. This can be explained by the fact that, since the concessionaire is responsible for operations and maintenance after the completion of the project, the DOT can safely cut down on the level of QA inspection and testing, as the cost of deficient quality will be borne by the concessionaire.
4.12 Independent Engineer (IE)
The researchers came across a few examples from Texas P3 projects and Maryland's MTA Purple Line P3 project where a neutral IE firm (paid 50%/50% by the DOT and the contractor) is involved in the QA program. In Texas P3 projects, TxDOT has QA performed by the contractor, oversight performed by the owner (who validates the contractor's test results), and an IE firm. The IE represents both sides and resolves conflict. Use of an IE is not a common practice across state DOTs, and the research team has not found any design-build highway project in
the U.S. that has used an IE. The examples are from P3 transportation projects. The main responsibilities of this IE firm are to:
"Audit the contractor's Quality program as well as the Owner's Quality Program Report independently and impartially on a range of technical and commercial matters to
both the developer and the state DOT in order to reduce the need for Dispute Resolution. Act as an impartial point of reference during dispute resolution. Decide if noncompliance points should be assessed" The key difference between an IE firm and an owner's verification firm is that the IE firm is hired and paid by both parties. The following example provides further details about the use of IE in transportation projects.
IE in Maryland Purple Line Project
The quoted text below is from the P3 agreement for the Maryland Transit Administration (MTA) Purple Line Project. The Independent Engineer solicitation was conducted by the concessionaire, and the firm is paid 50%/50% by the concessionaire and the owner (Maryland Department of Transportation 2016).
"The Independent Engineer will be appointed jointly by the parties and will act independently and not as agent of either party.
Owner's Project Management Consultant and the Lender's engineering consultant are each deemed to have an organizational conflict of interest and therefore are not eligible to respond to the solicitation.
Concessionaire shall be responsible for all costs of conducting the Independent Engineer solicitation but has no obligation to reimburse Owner for Owner's costs relating to the solicitation.
Amounts payable to the Independent Engineer under the terms of its agreement shall be paid by Concessionaire subject to the right to receive reimbursement for 50% of such costs from Owner. Such reimbursement will not be subject to the D&C [Design and Construction] Payment Cap."
Chapter 5 Conclusions
Based on the findings of the interviews and the content analysis of the documents provided by the state DOTs, this study concludes that implementation of an efficient and effective quality management plan typically presents a set of new challenges for state DOTs in the alternative delivery environment:
Reluctance of DOTs to shift the responsibility of quality assurance to the design-build team
Reluctance of contractors to accept the new role of QA in the DB environment
Difficulty in developing an appropriate quality management program for the alternative delivery when detailed design and actual quantities are not available
Difficulty in developing an adequate and reliable budget for quality management tasks and conducting cost control
Differences in terminology used by state DOTs for quality management in the design-build environment
Lack of a unified and consistent guidebook for quality management in the state DOT
Differences in organizational structure for quality management
Understanding new roles and responsibilities in design-build projects
Independence of quality management firms from the design-build team
Need for specialized training: requirements for the new set of skills and qualifications for working in the DB environment
Need for an appropriate evaluation system to evaluate the qualifications of the design-build team and its approach toward quality management in the procurement phase
Lack of familiarity with how to use the contractor's samples in the acceptance procedure
Establishing and maintaining good relationships with the FHWA to ensure that state DOTs and the FHWA are on the same page when it comes to evaluating project quality
Lack of flexibility and scalability of existing quality management software programs that were mainly designed for the DBB environment
In light of these new challenges, there are several important areas in the state of the practice of quality management in the alternative delivery environment that state DOTs across the country can consider enhancing:
Organizational structure for quality management in the design-build environment
Acceptance approaches and decision factors to choose the most appropriate acceptance approach for the design-build project
Selection criteria and quality management plan
Establishing and maintaining exemplary working relationships and collaborations with the FHWA
Independent assurance methods (i.e., project approach, system approach, and mixed approach)
Budgeting and cost control for quality management tasks in the design-build environment
Quality assurance software programs
Pay factor adjustment for quality in the design-build environment
Non-conformance reports (NCRs)
Responsible charge
Risk-based approach
Independent engineer
In summary, several new areas have arisen as a result of the growing number of projects delivered by design-build and other alternative delivery systems. The specific area of quality management is extremely important to construction projects in general, and its evolution and state-of-the-art trends must be explored and considered for highway construction projects to be successful as the practice progresses. Future research is needed to develop a best-practices guide for conducting quality assurance in the public-private partnership environment with the design-build-finance-operate-maintain project delivery system. Also, detailed statistical analysis should be conducted to develop a customized risk-based approach for quality management within each state DOT. The customized risk-based approach would take into account the unique requirements of the state DOT and reflect the actual quality test results of its historical projects.
References
Arizona Department of Transportation. (2016). Public-private partnership (P3) design-build-maintain agreement. Arizona Department of Transportation.
Atkinson, R. (1999). "Project management: cost, time and quality, two best guesses and a phenomenon, its time to accept other success criteria." International Journal of Project Management, 17(6), 337–342.
California Department of Transportation. (2015). "Construction quality assurance program manual."
Chan, A. P. C., Scott, D., and Chan, A. P. L. (2004). "Factors affecting the success of a construction project." Journal of Construction Engineering and Management, 130(1), 153–155.
Chan, A. P. C., Scott, D., and Lam, E. W. M. (2002). "Framework of success criteria for design/build projects." Journal of Management in Engineering, 18(3), 120–128.
Colorado Department of Transportation. (2017). Owner verification testing and assessing plan.
Federal Highway Administration (FHWA). (2004). Quality assurance procedures for construction (23 CFR Part 637), 235–236.
Federal Highway Administration (FHWA). (2006). TechBrief: Guidelines for establishing and maintaining construction quality databases.
Federal Highway Administration (FHWA). (2008). Transportation construction quality assurance reference manual.
Federal Highway Administration (FHWA). (2011a). TechBrief: Independent assurance programs.
Federal Highway Administration (FHWA). (2011b). Memorandum: Responsible charge (dated Aug. 4).
Federal Highway Administration (FHWA). (2012). TechBrief: Construction quality assurance for design-build highway projects. McLean, VA.
Gad, G. M., Adamtey, S. A., and Gransberg, D. D. (2015). "Trends in quality management approaches to design-build transportation projects." Transportation Research Record: Journal of the Transportation Research Board, 87–92.
Georgia Department of Transportation. (2014). GDOT design-build-finance construction quality assurance program (Northwest Corridor project).
Gransberg, D. D., Datin, J., and Molenaar, K. R. (2008a). NCHRP Synthesis 376: Quality assurance in design-build projects. Transportation Research Board, Washington, DC.
Gransberg, D. D., and Molenaar, K. (2004). "Analysis of owner's design and construction quality management approaches in design/build projects." Journal of Management in Engineering, 20(October), 162–169.
Gransberg, D., Datin, J., and Molenaar, K. (2008b). Quality assurance in design-build projects. NCHRP.
Gransberg, D., and Molenaar, K. (2008). "Does design-build project delivery affect the future of the public engineer?" Transportation Research Record: Journal of the Transportation Research Board, 2081, 3–8.
Harman, L., and Sillars, D. N. (2013). "Case studies in innovative quality assurance methods for alternative delivery projects." Journal of the Transportation Research Board, 97214(503).
Hughes, C. S. (2005). NCHRP Synthesis 346: State construction quality assurance programs. Transportation Research Board, Washington, DC.
Kraft, E., and Molenaar, K. (2013). "Project quality assurance organization selection for highway design and construction projects." Transportation Research Record: Journal of the Transportation Research Board, (2347), 29–37.
Kraft, E., and Molenaar, K. (2014). "Fundamental project quality assurance organizations in highway design and construction." Journal of Management in Engineering, 30(4), 1–9.
Kraft, E., and Molenaar, K. R. (2015). "Quality assurance organization selection factors for highway design and construction projects." Journal of Management in Engineering, 31(5).
Loulakis, M., Tran, D., Njord, J., Parkinson, R., and Henk, G. (2015). Review of WSDOT's implementation of design-build project delivery. State of Washington Joint Transportation Committee.
Maryland Department of Transportation. (2016). "Public-private partnership agreement, Purple Line project."
Minnesota Department of Transportation. (2014). Construction quality plan template.
Munns, A. K., and Bjeirmi, B. F. (1996). "The role of project management in achieving project success." International Journal of Project Management, 14(2), 81–87.
NorthEast Transportation Training and Certification Program (NETTCP). (2014a). Transportation construction quality assurance reference manual.
NorthEast Transportation Training and Certification Program (NETTCP). (2014b). "Quality assurance for design-build projects - Chapter 4." 3(December), 49–65.
Scott III, S., and Molenaar, K. (2017). NCHRP Synthesis 838: Guidelines for optimizing the risk and cost of materials QA programs. Washington, DC.
Songer, A. D., Molenaar, K. R., and Robinson, G. D. (2015). "Selection factors and success criteria for design-build in the U.S. and U.K." 1–11.
Texas Department of Transportation. (2011). TxDOT design-build quality assurance program implementation guide.
Texas Department of Transportation. (2016). Quality assurance program for CDA/design-build projects with a capital maintenance agreement with three optional 5-year terms.
Virginia Department of Transportation. (2012). "VDOT minimum requirements for quality assurance and quality control on design-build and public-private transportation act projects." (January).