
GDOT RESEARCH PROJECT NO. 15-06
FINAL REPORT
STRATEGIES FOR COMMUNICATING QUALITY EXPECTATIONS FOR ENVIRONMENTAL SERVICE CONTRACTS

OFFICE OF RESEARCH
15 KENNEDY DRIVE
FOREST PARK, GA 30297-2534

GDOT Research Project No. 15-06
Final Report
STRATEGIES FOR COMMUNICATING QUALITY EXPECTATIONS FOR ENVIRONMENTAL SERVICE CONTRACTS
By

Gordon Kingsley, Ph.D.
Dan Matisoff, Ph.D.
Juan Rogers, Ph.D.
Baabak Ashuri, Ph.D., DBIA, CCP, DRMP
Yehyun An, Ph.D.
Evan Mistur
Isabel Ruthotto
Darian Agnew, MSW
Georgia Institute of Technology
Contract with
Georgia Department of Transportation
In cooperation with
U.S. Department of Transportation Federal Highway Administration
March 2017
The contents of this report reflect the views of the authors who are responsible for the facts and the accuracy of the data presented herein. The contents do not necessarily reflect the official views or policies of the Georgia Department of Transportation or the Federal Highway Administration. This report does not constitute a standard, specification, or regulation.

1. Report No.: FHWA-GA-17-1506

2. Government Accession No.:

3. Recipient's Catalog No.:

4. Title and Subtitle:
Strategies for Communicating Quality Expectations for Environmental Service Contracts

5. Report Date: March 2017
6. Performing Organization Code:

7. Author(s):
Gordon Kingsley, Dan Matisoff, Juan Rogers, Baabak Ashuri, Yehyun An, Evan Mistur, Isabel Ruthotto, Darian Agnew

8. Performing Organ. Report No.:

9. Performing Organization Name and Address:
Office of Sponsored Programs
505 Tenth Street, NW, Atlanta, Georgia 30332-0420

10. Work Unit No.:
11. Contract or Grant No.: 0013612

12. Sponsoring Agency Name and Address:
Georgia Department of Transportation
Office of Research
15 Kennedy Drive
Forest Park, GA 30297-2534

13. Type of Report and Period Covered:
Final; September 2015-March 2017

14. Sponsoring Agency Code:

15. Supplementary Notes:

16. Abstract:

This study explores the communication of quality expectations between the Office of Environmental Services (OES) and the environmental consulting firms contributing to the Plan Development Process of the Georgia Department of Transportation (GDOT). The overall goal of this research is to improve efficiency in the review process for reports submitted by environmental consultants in terms of the duration of the review period, the number of errors identified by OES in the review process, and the number of revisions requested of consultants. Focus groups were conducted with representatives of the consulting community to review existing practices and alternative strategies for improving communications. The development of the focus groups was supported by case studies of project-level performance and reviews of consultant performance over a 5-year window of design projects.

Across the focus groups, there was strong consensus regarding the challenges associated with producing high-quality environmental reports as well as alternative strategies that could improve performance. The following approaches were recommended by the consulting community as the most promising for yielding improvements in document quality:

- Update and standardize the manuals and templates maintained by OES for environmental reports, with procedures to address frequent updates from federal and state regulatory sources.
- Fast-track the development of a standardized performance assessment form for consultant performance that is shared with the executive leadership of consulting firms as well as GDOT's procurement office.
- Consider procedures for normalizing the document review process among reviewers on the OES staff; consultants report high degrees of variability in the issues identified by document reviewers.
- Increase communication and coordination of quality expectations between OES document reviewers and design project managers.

17. Key Words: Consultant Communication, Environmental Contract, Environmental Summary, Design Process

18. Distribution Statement:

19. Security Classification (of this report):
Unclassified

20. Security Classification (of this page):
Unclassified

21. Number of Pages:
188 pages

22. Price:

TABLE OF CONTENTS
1.1. Research Background
1.2. Research Objectives
1.3. Research Literature
1.3.1. Research on Environmental Procedures and Design Delay
1.3.2. Research on Communications and Project Management
1.3.3. Research on Relational Contracting
1.4. Key Research Design Tasks and the Structure of this Report
1.4.1. Task 1: Performance Levels of OES Consultants on GDOT Design Projects
1.4.2. Task 2: Review of Existing Communication Practices
1.4.3. Task 3: Identification of Strategies for Communications
2.1. Introduction
2.2. Methods
2.2.1. Data Collection
2.2.2. Variables
2.2.3. Data Analysis
2.3. Results
2.3.1. Analysis of T-Pro and P-6 Data and Regulatory Information
2.3.2. Analysis of P-6 and SharePoint Data
2.4. Integration with Case Studies and Focus Groups
3.1. Introduction
3.2. Methods
3.2.1. Case Selection
3.2.2. Data Collection
3.2.3. Data Analysis
3.3. Results
3.3.1. Communication Content and Stability
3.3.2. Signal Form, Stability, and Receptivity
3.3.3. Performance
3.4. Input for Focus Groups
4.1. Introduction
4.2. Methods
4.3. Results
4.3.1. Current Communications and Performance
4.3.2. Challenges in OES Projects
4.3.3. Areas for Improvement
5.1. Summary and Conclusion
5.2. Recommendations
A.1. Performance Data Issues
A.2. Environmental Activity Overview
A.3. Ecology Document Review
B.1. High-quality Document Case 1
B.2. High-quality Document Case 2
B.3. High-quality Document Case 3
B.4. Low-quality Document Case 1
B.5. Low-quality Document Case 2
B.6. Low-quality Document Case 3
C.1. Consultant Interview Protocol
C.2. OES Staff Interview Protocol

LIST OF TABLES

Table 2-1 Duration of Environmental Summary and Project Design
Table 2-2 OLS Regression Model Summary
Table 2-3 OLS Regression Results
Table 2-4 Ecology Document Review Durations
Table 2-5 Document Deficiency Types and Task Duration
Table 2-6 Document Returns and Task Duration
Table 3-1 Case Overview
Table 3-2 Overview of GDOT Communication Content and Instruments
Table 3-3 Signal Forms
Table 3-4 Questionnaire Results, Perception and Usage of Signal Forms
Table 4-1 Key Concepts and Themes in the Focus Group Protocol
Table 4-2 Critical Conditions to Timely Submission of High Quality Documents
Table 4-3 Challenges in OES Projects
Table 4-4 Effective Approaches to Timely Submission of High Quality Documents
Table 4-5 Areas of Improvement in OES Projects
Table A-1 Environmental Activity Duration
Table A-2 Environmental Activity Duration of the Document Types
Table A-3 Task Outsourcing Based on Environmental Document Type
Table A-4 Outsourcing and Durations of Environmental Summary and Project Design
Table A-5 Regulatory Interventions and Durations of Environmental Summary and Project Design
Table A-6 Returns of Ecology Documents
Table A-7 Ecology Document Review Round Durations
Table A-8 Document Review Duration of Ecology Consulting Firms
Table A-9 Returns of Ecology Documents of Consulting Firms
Table A-10 Document Review Durations of Project Improvement Types
Table A-11 Returns of Ecology Documents of Improvement Types
Table E-1 Focus Group Participating Firms

LIST OF FIGURES

Figure 1 Findings and Recommendations
Figure 2-1 Performance Data Source
Figure 2-2 Independent Variables in OLS Regression Analysis

Executive Summary
Effective environmental compliance has been identified as a significant risk factor for the on-time delivery of transportation infrastructure projects by state departments of transportation. In this study, we examine one piece of the puzzle: the communication of quality expectations to environmental consulting firms providing technical studies and environmental summary documentation for pre-construction engineering design projects. The focus of this study is on existing communications between the Office of Environmental Services (OES) in the Georgia Department of Transportation (GDOT), which currently reports high error rates and revision requests for reports submitted by consultants, and the environmental consulting firms that prepare those reports. The overall goal of this research is to identify the following: a) the management and communication factors that contribute to the current levels of performance, and b) alternative strategies for improving existing patterns of communication with the consulting community.
The analysis focuses on the documentation for two key areas of OES operations. First, we examine Ecology studies, which provide an opportunity to assess communications on projects that have some of the most complex and detailed technical reporting requirements. Second, we examine documentation complying with the National Environmental Policy Act (NEPA), which allows the research team to observe communications associated with the most comprehensive reports integrating inputs from each of the OES subject areas.
The research design is built upon three inter-related tasks. First, we analyzed performance data from two GDOT databases: a) time durations of key OES activities for 560 engineering design projects completed over the last five years (2011-2015); and b) tracking data used by OES to monitor progress in environmental document review of 274 ecology studies completed in the last two years. This analysis provided an evidentiary foundation for developing protocols for the other two research tasks. Second, comparative case studies were developed to understand communication practices and their influence on performance. Six case studies of engineering design projects were developed. In three cases, consultants produced high-quality environmental documents. These were contrasted with three cases in which consultants generated low-quality documents. The case studies were also used as a foundation for designing the focus group protocols.
Third, as the centerpiece of this research, three focus groups were conducted with 22 representatives from firms within the OES consulting community. Information from the first two tasks was used to develop scenarios of performance by OES staff and consulting firms in producing environmental summaries. The scenarios were presented to the focus group participants as a means of grounding the discussion in different classes of problems encountered in the development and review of environmental documents.
We find that better communication strategies can assist in improving performance by both consultants and OES staff during the document review process. These strategies are outlined below. However, we also find that improved communication between OES staff and consultants is unlikely to be a panacea for current performance issues. Improved communications will need to be accompanied by process improvements in OES work flow and enhanced coordination with other units in GDOT (most notably the project managers and procurement staff). An engineering design project is a noisy setting for environmental consultants. OES is not the only actor communicating performance expectations regarding environmental projects during the life of an engineering design project. GDOT project managers and consultant project managers also convey performance criteria; however, their focus is upon project delivery rather than environmental compliance. More importantly, these expectations are communicated earlier in the life of a project (often with more urgency).
Figure 1 illustrates key findings across the research tasks. There was strong convergence of feedback from consultants in the case studies (Task 2) and the focus groups (Task 3) over topics where communications could be improved. The strongest findings are then synthesized into a set of strategic alternatives for OES to consider for improving the communications and the performance of the consulting community.
Key findings from the review of performance data (Task 1) include the following items:

- The average engineering design project took 1,210 days to complete (i.e., average duration of the 560 completed projects over the last 5 years). Within that envelope of time, the average environmental project took 769 days to complete.
- In contrast, the document review process for ecology studies takes a small proportion of time. On average, it took 86 days from document review assignment to transmittal (i.e., average duration of the 264 completed projects over the last two years). 64.4% of the ecology documents require 3 reviews or more before transmittal; only 8 documents (or 3.0% of the 267 documents) were transmitted after the first review.
- Several factors contribute to the time duration of the environmental summary process and the overall project design process. Environmental summary and project design durations are inextricably and reciprocally linked.
- The time devoted to the document review process is not significantly related to the duration of either the environmental summary or the overall project design.
The key findings derived solely from the comparative case studies contrasting projects that produce high-quality and low-quality environmental documents (Task 2) include the following:
- The most defining characteristic of projects that produce high-quality environmental documents is that the consultant takes the lead in organizing communication processes with OES.
- In projects that produce high-quality environmental documents, consultants prefer communications with OES that occur earlier in the project process (i.e., prior to document submission) and direct communications by phone or other interactive technologies. Consultants note an over-dependence by OES staff on passive, one-way electronic media for communications of quality expectations.

There are several topics identified in both the comparative case studies (Task 2) and the focus groups (Task 3), including the following:

- OES has initiated procedural innovations over the last two years, which consultants recognize as significant improvements to the communications process. These include the addition of communication channels through SharePoint, file transfer protocol (FTP) sites, and early intervention in the document review process with workshops for consultants needing additional guidance on document development (i.e., face-to-face meetings with OES staff).
- In general, OES reviewers raise substantive questions and provide comments that do yield improvements in the quality of consultant reports.
- Consultants across the case studies and the focus groups stressed the need for an updated Environmental Procedures Manual (EPM) and updated templates for environmental documents.
- Consultants across the case studies and the focus groups noted wide variability among OES reviewers in the types of feedback and the issues identified for improvement during document review. Consultants attribute this variability to a) high turnover rates among OES staff; b) the use of learning-by-doing strategies for the professional development of new OES staff; c) the hiring of replacement staff without significant experience in the transportation sector; d) the assignment of documents to OES staff late in the life of a project, frequently after the document has been submitted for review; and e) the heavy workloads of OES staff.
- Other factors influencing the consultants' performance include a) a lack of internal OES communications among reviewers in each area of specialization; b) design and schedule changes by project managers; and c) a lack of communications between OES and other project team members, particularly project managers.
Key findings that emerge from the focus groups (Task 3) include the following:

- Successful projects require consultants to proactively manage communications with all project team members, including procurement staff, GDOT project managers, consultant project managers, and OES staff. Focus group participants emphasized that the most effective communications are active and direct interactions with project team members, including OES staff.
- Improved communications need to be accompanied by process improvements in OES work flow and enhanced coordination with other units in GDOT, particularly project managers and the procurement staff.
- OES needs better feedback systems regarding poor performers. This feedback needs to be shared with procurement staff to inform award decisions for new projects. It also needs to be shared with the consulting companies to improve quality assurance and quality control (QAQC) operations.
- OES needs stronger standardization of reporting formats to reduce the number of points of data entry and the complexity of environmental documents. This will reduce the opportunities for errors in the reporting requirements.
- Consultants noted that high turnover and mobility among OES staff create the need for transition meetings where key decisions affecting document development are communicated as projects are handed off between OES reviewers. Such meetings are very rare in the experience of consultants, and their rarity can contribute to significant swings in the issues identified for modification and change during the document review process.
- Consultants noted the need for clearer guidance among OES reviewers regarding the standard of legal sufficiency as well as OES reviewers' role in the design process. In particular, the balance between the goals of meeting project delivery schedules and environmental compliance needs to be standardized across OES review staff.
- As OES moves to improve performance databases, the platforms for data entry and data integration need to be improved. Consultants noted that their role in data entry, coupled with staff turnover in OES, can yield incomplete performance datasets that present challenges for project monitoring and developing an accurate account of project history.
Focus groups were also asked to consider and prioritize alternative strategies for improving the communication of quality expectations to consultants that might yield improved performance in documentation for environmental summaries. The following strategies were prioritized by the consultants:

Strategy 1 - Updating Reporting Templates and the Environmental Procedures Manual: The most consistent recommendation provided across the focus groups and the case studies is that OES should devote greater managerial attention to updating the Environmental Procedures Manual as well as updating and introducing greater standardization of templates for generating environmental documents. OES staff acknowledge that the current manual and associated templates have not been updated in several years due to a lack of available personnel. As OES moves to improve and update templates, great emphasis should be given to streamlining OES documents. Consultants who have experience working with other public agencies stated that the current reports required by OES are cumbersome and overly detailed.
Strategy 2 - Consultant Performance Assessment: In our analysis, the incidence of high rates of document returns was not evenly distributed across the consulting community. For some time, OES has been developing a feedback form for assessing consultant performance. This is an important activity and should be fast-tracked into an operational procedure. This feedback can be an important resource for continuity of knowledge within OES and valuable information for the QAQC procedures of consulting firms.
Strategy 3 - Document Review Normalization: OES should consider developing procedures for normalizing the range and types of comments offered during the document review process. The edits and guidance provided during document review constitute the period of time in which OES is in most active communication with consultants. One of the strongest points of feedback provided by consultants during the focus groups is that there is wide variability in the types of issues that are identified for correction by OES reviewers. This is particularly noticeable in projects where consultants experience a change in the OES reviewer.
Strategy 4 - Communications with Project Managers: OES is not the only source of communications regarding quality and performance expectations for environmental activities during an engineering design project. Other sources include GDOT and consultant project managers who give strong weight to the importance of meeting project delivery schedules. OES should develop practices for communicating performance quality expectations to the project managers.
Strategy 5 - Data Management: The data systems currently maintained by OES and GDOT have proved useful in our understanding of both the durations of project activities and the performance of consultants in the document review process. However, there are areas where data limitations were encountered that, if improved, could provide a stronger foundation for monitoring and assessing performance. Consultants noted the challenges of maintaining accurate performance systems in light of the role that consultants play in data entry and the human capital constraints upon OES.
Findings and Recommendations

Task 1: Performance Data Review
-- The average engineering design project duration is 1,210 days. Within this envelope of time, the average environmental project takes 769 days to complete.
-- The average time for document review in ecological studies is 86 days from document review assignment to transmittal. 64.4% of documents require three reviews or more before transmittal. Only 3.0% (of 267 documents) are transmitted after the first review.
-- The document review process is not a significant factor explaining the duration of environmental summaries or projects.

Task 2: Comparative Case Studies
-- Consultants identified the need for updated templates for documents and an updated Environmental Procedures Manual.
-- Consultants report significant variability among OES reviewers in terms of the issues identified during document review.
-- In high-quality document cases consultants were more proactive in communicating with OES before and after the submission of documents as well as with project managers.
-- Consultants identify OES procedural innovations such as the use of SharePoint and FTP sites as well as workshops as effective improvements in communication.

Task 3: Focus Groups
-- OES communication recommendations by focus groups: 1) template and EPM update; 2) better information dissemination; 3) standardized, objective review criteria; and 4) early communication.
-- OES procedural recommendations by focus groups: 1) transition meetings and transition supervisors when reviewers are re-assigned on projects; 2) clearly defining roles and responsibilities of OES reviewers balancing between project delivery and environmental compliance; 3) shared understanding regarding the legal sufficiency standard among OES reviewers; and 4) improved strategies for maintenance of OES performance databases.

Alternative Strategies
Manual and Templates Update
Consultant Performance Assessment
Document Review Normalization
Communications with PMs
Database Management

Figure 1 Findings and Recommendations

EPM = Environmental Procedures Manual PM = Project Manager

Chapter 1 Introduction
1.1. Research Background
One of the persistent challenges in the delivery of on-time, high-quality pre-construction design projects is the successful identification and quantification of environmental risk factors. This work is often performed for the Georgia Department of Transportation (GDOT) by consulting firms serving both as prime consultants and as subcontractors within a larger design project. Currently, the GDOT Office of Environmental Services (OES) reports high rates of error in the initial submissions of reports by consultants. This study is designed to assess existing patterns of communication between OES and consultants regarding quality expectations and strategies for improving performance.
Few public reports or academic studies tackle the issue of communicating performance expectations and quality requirements to consultants. This is true of the general literature on consultant management as well as the specific literatures on the provision of environmental services. In the environmental arena, studies tend to focus on providing guidelines for the preparation of formal environmental documents (AASHTO 2006), the production of quality environmental programs (NCHRP 2014), or the role of environmental services in a particular type of contracting, such as design-build project delivery systems (Ashuri, Mostaan, and Hannon 2013). These studies recommend practices such as: 1) early and proactive coordination with relevant federal regulatory agencies either by state DOTs or consultants; 2) early involvement of legal counsel in order to anticipate problems and risks; 3) strategies for the effective use of environmental analysis in design projects; and 4) strategies for following up on problem resolution.
This study is designed to address this gap in the research literature. In doing so, we provide an assessment of the existing communication patterns between OES and environmental consultants. Additionally, we examine the relationship between existing communication patterns and the current performance patterns experienced by OES in the document review process.
1.2. Research Objectives
GDOT, like all state DOTs, is responsible for compliance with National Environmental Policy Act (NEPA) requirements for planning, analysis, permitting, and re-evaluations for transportation infrastructure assets. Included in this process are a variety of reports, organized by OES, prepared for each proposed site for infrastructure development. These reports demonstrate compliance with federal and state environmental laws and document studies that identify and evaluate impacts on the environment in a variety of subject areas, each requiring specific skill sets.1
Timely completion of these reports is an important part of the concept and preliminary design phases of project development. Delays can have a cascading effect throughout the Plan Development Process. Poor quality in the reports can generate significant cost and schedule delay in both the design and construction phases of projects. More importantly, the time devoted to reviewing documents multiple times represents a real opportunity cost for OES personnel in terms of time and attention to other duties. At the outset of this research project, OES staff estimated a 90% return rate on the initial submission of reports by consulting firms. In reviewing the documents of the Ecology Section over the past two years, we find that the actual rates of return are in fact higher: over 96% of documents were returned after the initial submission, and 62% were returned to consultants three or more times. A key question is whether improved communication of quality performance expectations is likely to produce a reduction in the return and error rates.

1 Areas of specialization include expertise in protected species and their habitats, air quality, noise volumes, archaeological sites, historic properties, protected water resources, community resources, environmental justice populations, and NEPA documentation including Categorical Exclusions, Environmental Assessments, and Environmental Impact Statements.
The chief goal of this research is to provide guidance to OES about improving existing patterns of communication with the consulting community. We focus on work related to the generation of documentation for NEPA reports and ecology studies. In doing so, we answer the following questions:
1) What are the existing patterns for communicating performance expectations to OES consultants?
2) Which communication practices are effective and which ones need improvement?
3) What are consultant perceptions regarding the relationship between communications of quality expectations and performance?
4) What strategies could improve communication practices in such a way as to yield improvements in consultant performance to a level acceptable to OES?

To answer these questions, we analyze OES performance data and comparative case studies to better understand existing patterns of communication and performance. We then conduct focus groups with current providers of environmental consulting services to gain a deeper insight into existing practices and to explore alternative strategies for improving communications and performance.
1.3. Research Literature
1.3.1. Research on Environmental Procedures and Design Delay
The environmental summary process has been commonly described within the research literature and the professional press as long and arduous, largely due to NEPA documentation requirements. These requirements have been criticized as being too cumbersome, too expensive to implement, taking too long to complete, and, ultimately, not accomplishing the goals and objectives of NEPA (Oppermann 2015).
Hansen, Wolff, and Melcher (2007) attribute problems with the NEPA process to external factors rather than the process itself. They find that procrastination in document preparation and unwarranted internal delays can lead to panic and frustration among managers, who then overcompensate by developing unachievable schedules. Additionally, the absence of timely and adequate coordination and communication between the lead agency and other agencies at the local, state, and federal levels can generate problems within the environmental summary process. Other external factors include insufficient knowledge of NEPA requirements among agency personnel, applicants, and contractors; inadequate personnel training; faulty project management; internal uncertainty; project complexity; and lack of funding.
Recent federal initiatives have sought to streamline the NEPA process. These efforts have tended to focus only on Environmental Impact Statements (EIS), neglecting other types of NEPA documentation including Environmental Assessments (EA), Findings of No Significant Impact (FONSI), and Categorical Exclusion (CE) determinations. These efforts, while useful, are incomplete considering that, according to data provided by the FHWA, 95% of NEPA decisions are based on CE determinations. In light of this metric, it seems useful to obtain more data regarding delays in the handling of other documents. Such analysis could explore topics such as the average number of pages for each document or the average amount of time required to complete each document (Trnka and Ellis 2014).
Lamb (2014) describes the essential components of the NEPA process through five categories: management, organizational, staffing, processes, and access. Senior management is, perhaps, the most critical of these elements. For the process to function properly, management must both value and support the NEPA process. As such, senior managers serve as key enablers for promoting a culture of effective engagement and participation of environmental staff within agency decision-making processes. One important indicator of a supportive culture is when senior management places NEPA staff in positions of authority within the agency. Another essential component is the employment of a sufficient number of qualified NEPA staff within the agency. Staffing levels may also be augmented with consultants who are qualified EA and EIS contractors. Staff also need up-to-date NEPA procedures, training modules on NEPA and impact assessments, sufficient NEPA decision support systems, and processes for responding to inquiries from internal agency staff as well as for identifying and responding to lessons learned. Finally, agency staff need access to a cadre of impact assessment specialists and quality data on existing environmental conditions.
NEPA is not the only source of delay in pre-construction engineering design projects. Yang and Wei (2010) identify a variety of factors contributing to design delay in construction projects that are within the control of project leadership. These factors include project complexity, inadequate selection of consultants, lack of adequate communication among project designers, and inadequate systems integration between various project subcomponents. Factors outside the purview of project staff and leadership include client modification requirements, the level of bureaucracy of the client, and client-initiated suspensions or delays, which induce project change orders. Yau and Yang (2012) describe causes of delay in turnkey projects, which include the following: the lack of detailed schedule planning, failure to adequately integrate interfaces of subcomponents, political interference and other public pressures, repetitive and/or slow government-required review processes, increased scrutiny of specification reviews and drawings caused by designer failures, and obstructions to land acquisition.
1.3.2. Research on Communications and Project Management
There has been growing political pressure to improve environmental performance within the transportation and construction sectors. The research literature has attempted to document the ways in which communication shapes current environmental management practices. Two types of studies have emerged describing the role of communications in environmental management.
First, there are studies that analyze the characteristics of communication practices. Gluch and Räisänen (2009) examine how project organization, practices, and contractual agreements are altered and maintained through the interaction of communication and action in environmental projects. They find that discrepancies between the communication practices used by different parties can lead to ineffective work across a variety of performance measures. They argue that the forms of communication need to be accounted for and planned around, rather than expecting information to be transmitted naturally and without effort. Proposing a "Communication-Mapping Model for Environmental Management" (CMEM), Tam et al. (2007) analyze environmental communication practices in the project development process. Their model provides a means for identifying communication gaps that occur between different parties in construction projects and limit overall efficiency and project success. Using Gadamer's hermeneutics, Klimova and Semradova (2012) examine sources of discontinuity in communication, such as differing education, professional cultures or practices, language, and technology, while providing practical suggestions for avoiding communication pitfalls in an academic setting. Tsai (2009) examines information flows between project participants and identifies the effect of communication barriers on the disruption of information exchanges. The author recommends strategies for overcoming barriers through technical means, specifically for on-site communication breakdowns. These studies provide detailed descriptions of communication practices as a means of understanding information exchanges.
Another class of study examines the relationship between communications and project success. Carvalho, Patah, and de Souza Bido (2015) analyze how project success is affected by project management and complexity. They find that "hard skills" such as documentation and project control are useful, but "soft skills" such as communication and managerial skills are essential to project outcomes. Focusing on project supply chains, Meng (2012) describes how project success declines as a result of ineffective supply chain management. In Meng's study, strengthening relationships and participant communication makes these chains more effective at minimizing delays and other problems. Identifying communication issues as one of the primary barriers to project success, Tran, Hallowell, and Molenaar (2015) examine the managerial challenges of rural construction projects. Their study identifies proactive communication and planning, as well as direct communications between the project site and other project participants, as key factors for success.
Sosa, Eppinger, and Rowles (2007) identify internal communication as a key factor influencing project-level performance among design engineers working for Pratt & Whitney on the development of the PW4098 commercial jet engine. They find that breakdowns in communication lead to unattended interfaces: areas where integration teams and cross-functional design teams fail to communicate. When an unattended interface coincides with a critical point in the design process, the consequences can be disastrous. Reasons cited for communication breakdowns include rigid organizational boundaries, which increase the likelihood of unattended cross-boundary interfaces; the presence of non-critical interfaces, whereby attention to components is sacrificed in favor of more complex and critical interfaces; and the use of informal means of communication in lieu of direct communication. Proposed solutions to these communication lapses include a review of organizational boundaries, forming teams to handle mismanaged interfaces, and choosing adequate communication support tools.
Perlow (1999) describes the influence of poor communication processes on work interruptions in engineering design teams that lead to time famines (i.e., "... a feeling of having too much to do and not enough time to do it"). Time famines became manifest as a result of repeated interactive activities and were sustained through an organizational culture that privileges individual heroics amid crisis. To perform their jobs effectively and efficiently, software engineers indicated that they required extended periods of continuous, uninterrupted time. While it was acknowledged that some interactive activities were necessary for successful job performance, the frequency with which they occurred, apparently due to a lack of advance planning, precluded the engineers from effectively and efficiently performing their jobs within the traditional work week. This required the engineers to devote additional time, including weekends, to complete their tasks. Organizational culture further promoted increased incidences of time famine, as management rewarded those who delivered work on time and maintained high-level visibility amidst such a chaotic environment. In contrast, those who engaged in activities to mitigate time famines were actually penalized. Time famines can occur in any organization, especially one where the level of staffing is insufficient, staff are inadequately trained or experienced, or both. Employees experiencing time famines will have difficulty maintaining productive communications with both internal and external points of contact or stakeholders.
1.3.3. Research on Relational Contracting
There is also a growing literature on relational contracting and partnering, which highlights cooperation strategies between agencies, prime contractors, and sub-contractors in construction projects (Chan and Yu 2005; Gransberg and Molenaar 2004). A common prescription is for contract documents to provide a high level of specification for work requirements. Others call for team-building arrangements and close-knit working relationships between actors in which adversarial relations are changed into ones based upon trust (Abudayyeh 1994). This is often accompanied by strategies for conflict resolution to facilitate relationship building. Another approach is to call for standardized rating systems that provide clear signals about desired performance levels of contractors (Minchin and Smith 2001).
A backdrop for all of this research is the body of literature on quality management and quality assurance. There is relatively little in this literature that specifically addresses the problem of communicating performance quality expectations for environmental projects. More generally, there is a lack of a unified framework for specifying performance requirements for environmental consulting services. However, many of the core concepts on quality assurance are still relevant for this study and serve as a foundation for modeling quality in practice.
While each of these literatures notes the importance of working closely with service providers, there remains a need for specific analysis of communicating expectations regarding quality performance. Reports and literature relevant to tackling the issue of communicating quality expectations in the transportation sector typically focus on the construction phase of projects rather than providing a comprehensive approach. Moreover, the means of communication and the potential use of information and communication technologies are not included in these studies. The applicability of findings from the management and construction literatures to a preconstruction and design environment is still to be determined.
1.4. Key Research Design Tasks and the Structure of this Report
The research design for this project encompasses three inter-related tasks. The centerpiece of the research consists of focus groups conducted with representatives from firms within the OES consulting community. In order for these focus groups to be effective, we conducted research on two related topics: a) current performance levels of environmental activities and impacts upon GDOT preconstruction design projects, and b) comparative case studies of existing practices of communication between OES and consultants regarding quality. This provided an evidentiary basis with which to ground the focus groups, specifically on topics related to communication and performance. In consultation with OES, we concentrated on two subgroups within the consulting community: those engaged in generating NEPA documentation and those who generate ecology reports.
1.4.1. Task 1: Performance Levels of OES Consultants on GDOT Design Projects
The findings for Task 1 are presented in Chapter 2 of this report. We constructed and analyzed a dataset of environmental and performance activities across a sample of GDOT projects. Over time, one would expect consultant performance to improve in terms of timeliness and quality as firms become more familiar with GDOT policies and procedures. The fact that this has not been observed by GDOT personnel indicates the presence of barriers to performance-based learning.
The sample includes firms that regularly provided NEPA and ecology services over the past five years (this period of time coincided with a significant expansion of OES services provided by the private sector). The analysis focuses on the following types of information:
- The number and types of studies conducted by a consultant for OES
- The number of studies returned as deficient
- The type of deficiency
- Whether there is a change in federal standards requiring the re-work of studies
- Whether deficient studies led to delay in the overall preconstruction design project
- The magnitude of the delay in the project
To compile this information, we accessed several databases maintained by GDOT and by OES. Data on the duration of environmental project activities and engineering project design activities were drawn from the T-Pro and P-6 databases. These provide information on the project schedule and the duration of key activities associated with GDOT's Plan Development Process. While these databases did not provide a true measure of time on task, they did capture the initiation and completion points of projects and specific project-related activities. In general, duration measures represent an envelope of time within which project-related tasks are completed.
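To make the structure of the integrated dataset concrete, the sketch below shows one hypothetical way a per-project record combining the fields listed above with the T-Pro/P-6 duration envelope might be represented. The field names are our own illustration and do not reflect GDOT's actual database schemas.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ProjectPerformanceRecord:
    """Illustrative per-project record; not the actual T-Pro/P-6 schema."""
    project_id: str
    consultant: str
    study_types: List[str]              # e.g., ["NEPA", "Ecology"]
    studies_returned: int               # submissions returned as deficient
    deficiency_types: List[str]         # categories of identified deficiencies
    federal_standard_changed: bool      # re-work triggered by a standards change
    delayed_design_project: bool        # deficient studies delayed the project
    delay_days: Optional[int]           # magnitude of the delay, if any
    env_summary_duration_days: int      # envelope: start to finish of summary
```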
The analysis also drew upon data maintained by OES to track the document review process. Document review accounts for a large portion of the work life of OES staff. However, document review takes place relatively late in the life of a design project, as it occurs after the OES consultant has completed a report and submitted it for review. It represents a small portion of the overall time devoted to environmental activities within a design project. It also represents the time period when OES staff are in their most active communication with consultants.
The availability of this type of data stemmed from a recent set of innovations by OES designed to improve tracking of documents and facilitation of the review process. The Ecology Section had the most complete set of tracking data covering the last two years of operation. Consequently, we focused our analysis on this dataset for information about errors, returns, and consultant performance on documents.
1.4.2. Task 2: Review of Existing Communication Practices
A comparative case study design was used as a means of understanding the degree to which communication practices influence performance outcomes. Six case studies were developed and compared based on the quality of the documents submitted to OES; three cases were considered examples of high performance and three cases were considered examples of low performance. The findings from the case comparisons are presented in Chapter 3.
The focus of each case was on the communications of performance expectations to consultants and sub-consultants and whether these communication practices contributed to the subsequent outcomes. In each case study, we examined the relationship between the source of the communication, the content of the message, the media used to communicate, and the receptivity by consultants.
1.4.3. Task 3: Identification of Strategies for Communications
Findings from three focus groups are presented in Chapter 4 of this report. Each focus group was composed of consultants who frequently provide services for OES in NEPA and/or ecology-related studies. The goal of each focus group was to identify challenges in the communication of performance expectations, the impact of the identified communication patterns on performance, and alternative strategies that might improve the performance levels that OES currently experiences.
Each focus group was asked to review summaries of information generated from Tasks 1 and 2. Alternative scenarios of communication and consultant performance were prepared and presented to each group prior to the beginning of each focus group session. Each member of the group was also given a pre-meeting survey to assess his/her initial perceptions concerning the topic of the focus group. Focus group members were also asked to review different types of performance issues and suggest alternative communication strategies that might assist in reducing error rates and improving performance. They were also asked to identify best practices based upon their experience working with industry and DOTs.
Chapter 5 provides the conclusions and recommendations associated with this research. The conclusions are organized into a set of alternative strategies for OES to consider as approaches for improving communications with consultants. These strategies fall into two broad categories of activities. First are strategies that require coordination and cooperation with other units within GDOT. While OES has responsibility for managing relations with environmental consultants, it is not the only unit communicating performance expectations. Other key actors include project managers from GDOT and the consulting community, whose priorities are driven by project delivery schedules. The second class of strategies is aimed at reducing the number of times environmental consultants must submit documents for review. These strategies involve activities where OES can act directly to improve operations.
Chapter 2 Performance Indicators in Environmental and Engineering Design Projects
2.1. Introduction
Efforts by OES to communicate quality expectations to environmental consulting firms take place in the context of engineering design projects. This is a complex setting comprised of several actors who communicate their own performance expectations to environmental consultants. Early in the life of a design project, the GDOT project manager communicates performance expectations regarding the work schedule for project delivery. These communications are reinforced by the project manager for the consulting firm responsible for the overall engineering design (roughly 75% of GDOT design projects are performed by consultants). Environmental consultants experience this as intra-firm communications when they are a unit within the consulting firm or inter-firm communications when they are a stand-alone company working as a subcontractor to the design firm. The expectations communicated by these actors emphasize project delivery.
In most projects, OES is the only actor communicating performance expectations with regard to environmental standards. The communications from OES are designed to provide an up-to-date synthesis of environmental standards from a variety of federal and state regulatory authorities. Environmental consultants are expected to satisfy both sets of expectations. When OES guidance is properly integrated into project planning, expectations can be coordinated. However, conflicts can arise when expectations are not coordinated or consultants give priority to one set of expectations over another.
In order to understand the context in which communications of quality expectations take place, we developed a dataset of OES and consultant performance on environmental projects. No one dataset maintained by GDOT contains all of the information related to consultant performance of interest to this study. We developed our sample by drawing from three separate data sets and integrating them as necessary for elements of the analysis. The sample focuses on firms that regularly provide NEPA and ecology services over the most recent five-year period (2011-2015).
The integrated database provides a rich platform for observing recent trends in consultant performance. Descriptive models are developed for observing differences in performance trends across the following:
Classes of environmental projects
GDOT-sponsored projects vs. projects sponsored by local governments
Projects performed in-house or by consultants
Document preparation by consultants
Changes in regulations and GDOT procedures
We use results from the analysis of consultant performance data to inform the protocols for case studies of OES projects as well as focus groups conducted with the environmental consulting community. Having a deeper understanding of the performance trends allowed us to target questions more directly to opportunities and challenges in existing operations.
2.2. Methods
2.2.1. Data collection
An integrated dataset of OES consultant performance was developed by building upon information contained in GDOT's T-Pro database and the associated P-6 modules. We selected 560 projects that completed the environmental summary process during the 2011 to 2015 time period.2 The environmental summary completion date was based on activity 18100 in the P-6 database. The data drawn from T-Pro and P-6 included project details, activity durations, information about GDOT staff associated with the project, and information about consultants. Activity durations provided the foundation for the performance data and were measured as the number of days from the starting date of the activity to the finish date of the activity.
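As an illustration, the duration measure reduces to a simple date difference over the activity records. The following is a minimal sketch assuming hypothetical field names; the actual T-Pro/P-6 schema is not reproduced here.

    import pandas as pd

    # Hypothetical extract of P-6 activity records; real field names may differ.
    activities = pd.DataFrame({
        "project_id": ["P001", "P001", "P002"],
        "activity_code": [18100, 10000, 18100],
        "start_date": pd.to_datetime(["2011-03-01", "2012-06-15", "2013-01-10"]),
        "finish_date": pd.to_datetime(["2013-04-30", "2013-02-01", "2014-05-20"]),
    })

    # Activity duration: number of days from the actual start to the actual finish.
    activities["duration_days"] = (
        activities["finish_date"] - activities["start_date"]
    ).dt.days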
OES has undertaken several initiatives in recent years aimed at improving communications with the environmental consulting community. Central to this strategy has been the use of SharePoint and FTP sites to improve access to project guidance and project-related documents as well as enhance transparency in the OES process of reviewing environmental reports submitted by consultants.
The Ecology Section has made the greatest strides in using these technologies to manage the workflow with consultants. The NEPA Section followed suit with the more recent development of a SharePoint site. However, the NEPA Section has not progressed to the point of having a detailed dataset monitoring document review as is maintained by the Ecology Section. This points to strengths and weaknesses in the current performance data systems that required us to rely upon different databases to analyze the NEPA and Ecology Sections in the performance data. For NEPA documentation, we could observe whether the NEPA studies were performed in-house or by consultants using the T-Pro/P-6 dataset. However, there are insufficient SharePoint data to do a detailed analysis of NEPA document review. For Ecology documentation, there were considerable missing data in the T-Pro/P-6 dataset regarding whether projects were performed in-house or by consultants. However, there is a rich database that monitors document review in SharePoint.
2 A total of 811 projects that completed the environmental summary process during the 2011-2015 time period are observed in the T-Pro/P-6 dataset. The sample of 560 was developed by excluding 1) maintenance projects, determined by Project Identification (PID) information; 2) duplicated projects, determined by T-Pro comments, baseline date, and county information; 3) intentionally delayed projects, determined by funding phase status; and 4) incorrectly coded projects, determined by calculated duration information.
The main purpose of the performance data analysis is to understand recent trends and to establish an evidentiary foundation for the other research tasks. We handled the Ecology and NEPA consultant data differently only in this chapter; in the case study and focus group data, we are able to compare NEPA and Ecology performance directly. For more information on data limitations, see Appendix A.1.
We integrated the following information from the Ecology Section's SharePoint site into the consultant performance dataset for portions of our analysis:
Descriptions of 274 ecology documents submitted for review to the SharePoint site from 2014 (when the site came on-line) through 2015.
Document review durations, which are calculated as the number of days from the reviewer-assigned date to the transmittal date of a document to a regulatory authority. The Ecology Section also employs a process by which consultants are invited in for a workshop with OES staff if a document is returned more than twice for additional corrections. The goal of each workshop is to create a completed document ready for approval. Based on this information, we determined the number of rounds of review conducted for each ecology document and whether a workshop was required.
Regulatory change announcements, which are distributed by the Ecology Section to consultants through the SharePoint site or through email. We identified 10 announcements indicating a regulatory change and one announcement indicating a procedural change (out of 96 announcements distributed over the 2012-2015 time period).3 The data included the announcement dates and indications of the counties impacted by the announcements. This information was used to calculate the reaction time consultants demonstrated in adapting to project changes (calculated as the number of days from the announcement date of a regulatory or procedural change to the starting date of the environmental summary).
3 The data from the SharePoint site related to announcements begin in 2012 and run through 2015. Project performance data from the T-Pro/P-6 database are observed from 2011-2015.
Figure 2-1 summarizes the data sources and the data collection process for this study. Descriptive statistics for the collected data can be found in Appendix A.

Project Information. Source: T-Pro and P-6. Sample: 560 projects that completed their environmental review in 2011-2015. Data: project details, activity durations, and GDOT staff and consultant information. Activity duration: the number of days from the actual starting date to the actual finish date.
Document Information. Source: Ecology Section SharePoint. Sample: 274 ecology documents submitted for review (2014 to 2015). Data: document details and review timeline information. Review duration: the number of days from the reviewer-assigned date to the document transmittal date.
Regulatory Change Information. Source: announcements on SharePoint and email blasts. Sample: 10 regulatory changes and 1 procedural change. Data: announcement content and date. Lead time: the number of days from the announcement date to the starting date of the environmental review.

Figure 2-1 Performance Data Sources

For other analyses, the data integration strategy was pursued in two steps. The first step was to combine project performance information with regulatory change information. Regulatory changes were coded as independent variables based on the announcement date and county information. This provided the basis for ordinary least squares (OLS) regression models that identified significant factors related to the environmental summary duration and the overall project duration (see Section 2.2.3).
Second, the project performance information and document type information were integrated for t-tests and ANOVA tests based on the Project Identification number (PID). If a document had multiple PIDs, the document was duplicated in the dataset to merge with project information. This provided the basis for analyzing the relationship of the document review information with the duration of the environmental summary and the overall duration of a preconstruction engineering design project (see Appendix A.1).
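The PID-based integration can be sketched as follows: documents listing multiple PIDs are first expanded into one row per PID so that each row joins to a project record. The column names here are illustrative assumptions, not the actual schema.

    import pandas as pd

    # Hypothetical document records; 'pids' holds one or more project IDs per document.
    documents = pd.DataFrame({
        "doc_id": ["D1", "D2"],
        "pids": [["P001"], ["P002", "P003"]],
        "transmitted_version": [2, 4],
    })
    projects = pd.DataFrame({
        "pid": ["P001", "P002", "P003"],
        "env_summary_days": [265, 893, 2592],
    })

    # Duplicate documents with multiple PIDs (one row per PID), then merge.
    doc_by_pid = documents.explode("pids").rename(columns={"pids": "pid"})
    merged = doc_by_pid.merge(projects, on="pid", how="left")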
2.2.2. Variables
Dependent Variables. In order to understand the performance of consultants on environmental projects, we observed four variables that capture the time devoted to key activities in the development of environmental summary documents that support the preconstruction design. The dependent variables used in this analysis were developed using two strategies. First, we developed envelope measures of the overall duration of the environmental project and the overall engineering design project. Second, we developed measures of distinctive phases of activity to better capture the types of interactions between consultants and OES. The following measures are used in the analysis:4
4 There are challenges associated with developing and interpreting these dependent variables. First, the time duration measures are not a true measure of time on task. They represent beginning and end points found in the official record keeping by GDOT. Ideally, a performance measure should capture the level of effort put into a set of tasks. Current data entry processes do not allow for this degree of certainty to be attributed to the time data. Second, environmental projects do not unfold in smooth, linear, sequential patterns. Environmental projects are reciprocal with other phases of an engineering design project. Activities associated with determining the right-of-way, utility placement, or geotechnical properties of a site can all impact the scope and timing of environmental studies. Third, within OES there are a variety of disciplines that contribute to the environmental summary. In this study, we focused solely on two of the most complicated ones: the Ecology Section (which generates numerous technical studies) and the NEPA Section (which integrates reports from all of the different OES disciplines contributing to a design project). However, the activities of these two disciplines have a reciprocal relationship with the other disciplines working within OES.
Environmental summary duration: This measure captures the number of days from the start date to the finish date of the environmental summary (activity 18100 or 10000).
Non-technical duration: This measure captures the number of days devoted to the environmental summary minus the days devoted to technical studies. This measure provides a means for sorting between different phases of environmental projects. It also provides a technical correction for regression Model 4, where technical studies are used as an independent variable.
Non-NEPA documentation duration: This measure captures the number of days associated with an environmental project minus the days devoted to NEPA document preparation (Categorical Exclusion, Draft Environmental Assessment, and Final Environmental Assessment). This is another approach to developing an indicator of the time in which more interactions occur between consultants, GDOT reviewers, and federal agencies.
Project design duration: This measure captures the number of days from the start of the design project to the actual let date for construction authorization. This measure gives us the ability to compare time devoted to environmental activities within the larger envelope of time devoted to overall project design.
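Given the envelope and component durations, the derived dependent variables are simple differences. A minimal sketch, assuming hypothetical field names:

    import pandas as pd

    # Hypothetical per-project durations, all in days; field names are assumptions.
    df = pd.DataFrame({
        "env_summary_days": [769, 893],   # activity 18100 or 10000 envelope
        "technical_days": [300, 410],     # consolidated technical studies
        "nepa_doc_days": [120, 200],      # CE/DEA/FEA documentation
        "design_days": [1210, 1551],      # project start to construction let date
    })

    df["non_technical_days"] = df["env_summary_days"] - df["technical_days"]
    df["non_nepa_doc_days"] = df["env_summary_days"] - df["nepa_doc_days"]
    # env_summary_days and design_days serve directly as Y1 and Y4.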
Independent Variables. Building on our review of the literature and interviews conducted with OES staff, we identified five distinct classes of independent variables that may influence the performance time of consultants on environmental projects in the regression models. We also analyzed a sixth class of independent variables, drawn from our review of the Ecology Section SharePoint site, in the ANOVA and t-tests.
Project Characteristics. The first class of independent variables indicates the type of work performed in an environmental project. The following indicators (see Table 2-1 and Appendix A for the descriptive statistics) were used in the analysis:
Funding: This is an indicator of whether the project is sponsored by GDOT or by a local government. The measure is a dummy variable where 1=state project sponsored by GDOT and 0=local project.
Document type: This is an indicator of whether the environmental project was designated a programmatic categorical exclusion (PCE), a categorical exclusion (CE), or an environmental assessment (EA) project. Each class carries different standards of environmental summary and different types of documentation for consultants to perform under the NEPA process. There are no environmental impact statement (EIS) documents in the sample. Each is indicated by a nominal measure (1 or 0).
Improvement type: This is an indicator of the type of engineering design work being performed for this project. Table 2-1 provides a list of the 17 different improvement types that engineering design projects might address. Each type is coded as a nominal measure (1 or 0), and 11 improvement types were included in the regression analysis.
Activity Duration. The second class of independent variables captures the durations of environmental activities performed during the pre-construction phase. We identified 13 different environmental activity durations and two envelope measures: the duration of the environmental summary and the duration of project design (see Appendix A for the descriptive statistics). The following measures were used in the regression analysis:
Technical duration: This measures the number of days devoted to technical studies, consolidating the preparation of all technical studies, state review, and federal review. This measure was also used to calculate non-technical duration, a dependent variable in Model 2 and an independent variable in Model 4.
NEPA documentation duration: This measures the number of days devoted to different classes of documentation activities such as categorical exclusion (CE) documentation, draft environmental assessment (DEA) documentation, and final environmental assessment (FEA) documentation. These activities were measured in days and used to calculate non-NEPA documentation duration, a dependent variable in Model 3.
Outsourcing. A third class of independent variables describes the variety of contractual relationships found in the sample. The following variables indicate whether tasks were performed in-house or on an outsourced basis (see Appendix A for the descriptive statistics and t-statistics):
In-house NEPA: This nominal measure indicates whether the NEPA analysis was performed by a consultant or by a GDOT NEPA analyst (1=in-house and 0=consultant).
Consultant ecology reviewer: This nominal measure indicates whether a consultant was used to review the ecology documents (1=consultant and 0=in-house).
GDOT design: This is a nominal measure indicating whether GDOT performed the engineering design work (1=GDOT design and 0=consultant designer). This measure is used as an independent variable in regression model 4.
GDOT Staff Experience. A fourth class of independent variables captures the level of experience of GDOT officials working on environmental and engineering design projects. The following variables were observed:
Project Manager (PM) experience (N=559, M=15.60 projects, S.D.=14.54)5: This measures the number of projects in the sample that the GDOT project manager leads, providing an indication of the project manager's level of experience and workload.
NEPA analyst experience (N=544, M=35.8 projects, S.D.=21.99): This measures the number of projects in the sample that the GDOT NEPA Analyst leads.
Ecologist experience (N=490, M=32.7, S.D.=21.25): This measures the number of projects in the sample that the GDOT ecologist leads.
Regulatory Relations. A fifth set of independent variables was included to measure the influence of federal regulatory changes on project and consultant performance. Many environmental regulations prohibit grandfather clauses, which would enable on-going projects to proceed under legacy rule regimes. This led us to measure both the lead time needed to adapt to new regulations and the level of disruption of new regulations on on-going projects. The following independent variables (see Appendix A for t-statistics) were used in the analysis:
Bat change lead time (N=560, M=198.2 days, S.D.=324.23): In 2012, both Indiana and gray bats (Myotis sodalis and Myotis grisescens) were found south of their previously established ranges in the U.S. Southeast. This discovery prompted federal regulations for these endangered species to be extended into areas in which they were not previously required. The lead time was calculated as the number of days from the announcement date of the regulatory change to the starting date of the environmental summary. If the change was announced after project completion or during the project period, the lead time was coded as 0 (see the sketch following this list). This indicator measures the amount of time in days available for incorporating the bat change.
5 N = number of observations, M = mean, and S.D. = standard deviation

Procedural change lead time (N=560, M=70.1 days, S.D.=175.75): In 2013, GDOT initiated a procedural change under FHWA's Every Day Counts Initiative. The change allows Preliminary Field Plan Reviews (PFPRs) to proceed without full environmental document review approvals. PFPRs can proceed either once the draft EA has been signed by FHWA or once all technical studies have been completed on CE documents. This lead time was calculated in the same way as the bat change lead time and measures the amount of time in days available for incorporating this procedural change.
Total regulatory changes (N=522, M=0.75, S.D.=1.02): By observing emails and announcements from the Ecology Section, we identified 10 regulatory changes that were initiated during the time period of this sample. This measures the total number of regulatory changes during an environmental project.
Bat change intervention (N=143): This is a nominal measure for observing whether the change in bat regulations occurred during an environmental project.
Sturgeon change intervention (N=145): The National Marine Fisheries Service listed the Atlantic Sturgeon as endangered. This anadromous species makes its way up Georgia's Atlantic Ocean-draining rivers to spawn during the winter months. A biological effect determination should be proposed for any project that crosses these waters. This is a nominal measure of whether the change in sturgeon regulation occurred during an environmental project. This measure is not used in the regression due to a collinearity problem, but is used for t-statistics (see Appendix A for t-statistics).
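The lead-time coding described above (zero when the announcement falls during or after the project) can be expressed as a conditional date difference, as in the following sketch; the announcement date and field names are placeholders, not actual values from the dataset.

    import numpy as np
    import pandas as pd

    announce = pd.Timestamp("2012-10-01")  # hypothetical announcement date

    projects = pd.DataFrame({
        "env_start": pd.to_datetime(["2013-06-01", "2012-05-01"]),
        "env_finish": pd.to_datetime(["2014-06-01", "2013-01-15"]),
    })

    raw_lead = (projects["env_start"] - announce).dt.days
    # Lead time is positive only when the announcement precedes the project start;
    # announcements made during or after the project are coded as 0.
    projects["bat_change_lead_time"] = np.where(raw_lead > 0, raw_lead, 0)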
Document Review. A final class of independent variables was incorporated using data from the SharePoint site developed by the OES Ecology Section. This site is used to facilitate communications with consultants and to track progress toward the review and approval of all document submissions by OES ecology consultants. This is a relatively new resource for communications with the consulting community, having come on-line in 2014. As such, the data from this source were not included in the regression analysis but were used for separate analyses. The sample analyzed was derived from the review processes associated with 274 documents (these documents are work products in support of 205 engineering design projects). The following variables (see Appendix A for the descriptive statistics) were used to describe the environmental document review process:
Document review duration: This measures the number of days from the date of document assignment to a reviewer to the date of document transmittal to FHWA.
Document transmitted version: This measures the number of times a document was returned to the consultant for corrections and clarifications prior to submission to FHWA.
Deficiency type: This measures the significance of errors made by the consultant, determined by GDOT ecologists as a non-substantial or substantial error.
Levels of complexity: This is a four-point ordinal ranking of the level of document complexity based on pre-set criteria for each document type.
Ecology document types: There are 15 different types of documents submitted by consultants.
Ecology consulting firms: There are 25 consulting firms in the dataset.
2.2.3. Data Analysis
The performance data tracking project activity duration times from T-Pro and P-6 were analyzed using an ordinary least squares (OLS) regression. The performance data from SharePoint that tracks environmental document review times were analyzed using t-statistics and analysis of variance (ANOVA). The goals of this phase of the analysis are as follows:
1. Develop a description of OES projects and engineering design projects that provides a rough estimate of the relationship between the overall duration of each.
2. Develop a stronger understanding of the key contingent factors that shape the time durations of performance as they relate to project type and document type.
3. Develop a stronger understanding of the key managerial factors that are connected to the three-tier review system.
4. Create a performance-based foundation for the development of protocols for the focus groups with representatives of environmental consulting organizations.
OLS regression analysis was adopted to identify the factors that affect the durations of the environmental summary and engineering project design. We adapted the standard approach to OLS regression by suppressing the constant in the model. The suppressed constant reflects the assumption that all projects start at a baseline of 0 days, which forces the models to count up from day 0. The suppressed constant facilitates a comparison of the coefficients of variables between models with different dependent variables.
The general model for the OLS regression is as follows:
Y [time duration]= [project and activity conditions] + [consultant management] + [internal management] + [regulatory management]
Project and Activity Conditions: Project Characteristics (funding: sponsored by GDOT; document type: PCE/EA; improvement type: 11 different types) and Activity Duration (technical/non-technical durations; CE/DEA/FEA documentation durations).
Consultant Management: Outsourcing (in-house NEPA; consultant ecology reviewer; GDOT design).
Internal Management: GDOT Staff Experience (PM/NEPA analyst/ecologist experience).
Regulatory Management: Regulatory Relations (bat change lead time; procedural change lead time; bat change intervention; total regulatory changes).

Figure 2-2 Independent Variables in OLS Regression Analysis

We refined the general model to analyze the following duration time periods:
Y1 = environmental summary duration
Y2 = non-technical duration
Y3 = non-NEPA documentation duration
Y4 = project design duration
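For illustration, a suppressed-constant model of this form can be fit with standard tools. The sketch below uses the Python statsmodels package, where the "- 1" term in the model formula removes the intercept so estimated durations count up from a baseline of 0 days. The file name and regressor names are placeholders standing in for the variable classes in Figure 2-2, not the actual dataset fields.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical per-project extract of the integrated performance dataset.
    df = pd.read_csv("performance_sample.csv")

    # Model 1 analogue: environmental summary duration, constant suppressed ('- 1')
    # so that coefficients are comparable across the four duration models.
    model1 = smf.ols(
        "env_summary_days ~ sponsored_by_gdot + pce + ea + technical_days"
        " + in_house_nepa + ecologist_experience + bat_change_lead_time - 1",
        data=df,
    ).fit()
    print(model1.summary())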
T-tests and ANOVA were applied to the SharePoint data to examine the durations of document review, environmental summary, and project design with regard to document review information (deficiency types and transmitted versions). The SharePoint data are only available for the last two years and are organized based on document submissions by consultants. While projects that completed the environmental summary in 2011-2015 were included in the dataset for the regression, many of the projects in SharePoint are ongoing. This created challenges for linking SharePoint performance data to the T-Pro and P-6 data. We attempted to make this linkage but experienced large numbers of missing observations in the overall dataset; only 68 projects are found in both datasets. Therefore, we analyzed the SharePoint data separately.
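The separate SharePoint analysis amounts to standard two-sample and multi-group comparisons of review durations. A minimal sketch using scipy, with hypothetical column names (the reported results appear in Tables 2-5 and 2-6):

    import pandas as pd
    from scipy import stats

    # Hypothetical extract of the Ecology Section SharePoint review log.
    docs = pd.read_csv("sharepoint_reviews.csv")

    # T-test of review duration by deficiency type (cf. Table 2-5).
    non_sub = docs.loc[docs["deficiency"] == "non-substantial", "review_days"]
    sub = docs.loc[docs["deficiency"] == "substantial", "review_days"]
    t_stat, p_value = stats.ttest_ind(non_sub, sub)

    # One-way ANOVA of review duration by transmitted version (cf. Table 2-6).
    samples = [g["review_days"] for _, g in docs.groupby("version_group")]
    f_stat, p_anova = stats.f_oneway(*samples)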
2.3. Results
2.3.1. Analysis of T-Pro and P-6 Data and Regulatory Information
Table 2-1 shows the average number of days that environmental projects and engineering design projects took to complete. The average engineering design project had a duration of 1,210 days. Within that envelope of time, the average environmental project took 769 days to complete. However, there was considerable variance in the duration of projects. For example, projects sponsored by local governments took 139 more days to complete the engineering design work and 92 more days to complete the environmental work than the average GDOT-sponsored project.
The type of work performed in the engineering design project influenced the number of days associated with completion. Major improvements such as relocation with added capacity, bridge replacement with added capacity, major widening, reconstruction with added capacity, and construction of new bridges took more than 2,000 days for the environmental summary. Meanwhile, restoration, safety improvement, and traffic management projects took less than 400 days for the environmental summary.
The durations associated with different environmental activities were closely associated with the overall length of the engineering design projects. Projects requiring an environmental assessment (EA) took 2,592 days (7.1 years) to complete the environmental summary, which took place during an engineering design project that lasted 3,168 days (8.7 years). In contrast, projects subject to categorical exclusion (CE) standards took 893 days (2.4 years) in the environmental summary, which took place during the 1,551 days (4.2 years) of the engineering design project. Programmatic categorical exclusion (PCE) projects took 265 days (0.7 years) in the environmental summary during the 547 days (1.5 years) needed to complete the engineering design project.

Table 2-1 Duration of Environmental Summary and Project Design

Classification                                             Environmental Summary             Project Design
                                                           N    %      M (Days)  S.D.       N    M (Days)  S.D.
Total                                                      560  100.0  769       1034.2     452  1210      1213.7
Funding            Local                                   197  35.2   829       1023.2     153  1302      1010.4
                   State                                   363  64.8   737       1040.0     299  1163      1304.6
Document type      PCE                                     259  46.1   265       439.3      218  547       559.2
                   CE                                      246  44.1   893       869.2      194  1551      1047.3
                   EA                                      55   9.8    2592      1430.6     40   3168      1702.1
Improvement type   Bridge Rehabilitation (Added Capacity)  1    0.2    991       -          1    937       -
                   Bridge Rehabilitation                   3    0.5    1850      1322.8     3    2328      1823.8
                   Bridge Replacement (Added Capacity)     8    1.4    2799      1853.4     5    2661      1988.5
                   Bridge Replacement                      81   14.5   1185      1039.5     60   1744      1185.1
                   Construction of New Bridges             7    1.3    2092      1284.0     6    2595      1702.1
                   Construction of New Roads               4    0.7    1837      247.8      3    2387      51.4
                   Environmental Improvements              1    0.2    2065      -          1    2300      -
                   Major Widening                          26   4.6    2771      1393.7     18   3875      1403.0
                   Minor Widening                          22   3.9    1003      1101.5     20   1561      1341.5
                   Other Enhancements                      115  20.5   528       484.3      93   1124      654.4
                   Reconstruction (Added Capacity)         5    0.9    2093      1746.9     3    2954      2402.9
                   Reconstruction                          3    0.5    1734      2153.1     3    2608      1924.6
                   Relocation (Added Capacity)             2    0.4    4107      0.0        2    5221      0.0
                   Relocation                              1    0.2    1961      -          1    2657      -
                   Restoration & Resurfacing               7    1.3    364       169.6      6    889       435.9
                   Safety Improvements                     220  39.3   350       599.6      192  646       763.1
                   Traffic Management & Engineering        54   9.6    399       413.3      35   892       611.0
Table 2-2 provides the results of four alternative models describing the influence of project characteristics, contract management characteristics, experience, and regulatory relations on performance, as measured by the alternative approaches to measuring the time durations of engineering design projects and environmental activities.6 The results indicate that the models provide a good description of the factors that influence performance, as indicated by the overall fit of the models (adjusted R squared = 0.90, 0.66, 0.82, and 0.98, respectively).
6 For the interpretation of the regression models, R-squared is an overall measure of the strength of the models. The higher the R-squared value, the stronger the explanatory power of the model. To interpret the results of the models, the coefficients (B in Table 2-3) of the variables are important. The coefficient predicts or describes the relationship between the independent variable and the dependent variable. The p-value shows whether the coefficient is statistically significant at a certain level. In general, when the p-value is less than 0.05, the coefficient of the independent variable is considered significant.
Adjusted R squared (r2_a): R squared measures the improvement of the proposed model over an elementary model using the mean of the dependent variable as its estimator. It represents the percentage of reduction of the variance of the error of the former model over the latter. Adjusted R squared penalizes models with a larger number of variables because increasing the number of variables reduces the error even when these variables are not significant.
F-statistic (F): The F-statistic is the ratio of the Mean Square Error (sum of squares of residuals) of the proposed model to the Mean Square Error of the null model (estimating the dependent variable with its mean). This ratio has an F distribution and tests whether the proposed model has a greater likelihood of being an estimator of the relationship in the data than one with no coefficients at all.
Degrees of freedom (df_r, df_m): The degrees of freedom indicate how many independent sources of information are left in the data once the estimating equations are taken into account. Each independent variable requires one equation to estimate its coefficient using all the data. So the error will have as many degrees of freedom as the data minus one (for the mean of the dependent variable) minus the degrees of freedom of the model, which depends on the number of estimated coefficients. The model degrees of freedom (df_m) correspond to the number of coefficients estimated minus 1, and the residual degrees of freedom (df_r) are the total degrees of freedom minus the model degrees of freedom.
Coefficient (B): The coefficients represent the strength of the relationship the dependent variable has with each independent variable. The dependent variable would increase (decrease if negative) by the amount indicated in the coefficient for each unit increase of its independent variable, holding all other independent variables constant.
Standard Errors (Std. Error): The standard errors of the coefficients are the estimates of the standard deviation of the sampling distribution of the coefficients. That is to say, if the sample from the population were taken many times and the model estimated each time, the standard deviation of the set of those estimates would be the standard error.
p-value (p): The p-value associated with each coefficient is the probability that the coefficient is zero rather than the non-zero value given by the estimate. In other words, it is the probability of making a mistake when affirming that the coefficient is different from zero when, in reality, it is zero. This sort of mistake is called a Type I error (affirming something to be the case when it is not).

Table 2-2 OLS Regression Model Summary

                Model 1               Model 2             Model 3                  Model 4
                Environmental         Non-Technical       Non-NEPA Documentation   Project Design
                Summary Duration      Duration            Duration                 Duration
Observations    323                   317                 323                      219
r2_a            0.903                 0.656               0.817                    0.983
F               104.9                 21.84               50.58                    411.6
df_r            294                   288                 294                      188
df_m            29                    29                  29                       31

Models 1, 2, and 3 describe the factors that influence the duration of various aspects of the environmental projects conducted as part of an engineering design project. Model 1 observed the broadest measure of performance as it was designed to explain factors influencing overall Environmental Summary Duration (i.e., the envelope of time devoted to conducting an OES project). The fit of Model 1 was good with an adjusted R2 of 0.90.
Models 2 and 3 were designed to describe specific components of work for environmental projects. Model 2 separates out the time on an environmental project devoted to conducting the technical studies (i.e., Non-Technical Duration). The technical studies phase of work happens early in the life of an engineering design project and serves as input into the design phase. The Non-Technical Duration model describes the factors that influence the phases of work later in the project, coordinating between design and the submission of final environmental summary documents. As with Model 1, factors associated with each of the classes of independent variables influenced the time devoted to environmental projects outside the technical studies portion of the work. However, the fit of the model, while still good, was lower, with an adjusted R2 of 0.66.
Model 3 separated out the portion of time on an environmental project devoted to conducting NEPA analysis (i.e., Non-NEPA Documentation Duration). This model describes the amount of time devoted to generating the documentation from all the different specialty areas within OES inclusive of all technical studies. As with Model 1, factors associated with each of the classes of independent variables influenced the time devoted to duration of non-NEPA document preparation activities. The fit of Model 3 was good with an adjusted R2 of 0.82.
By comparing the fit of the Models 1, 2, and 3, we see that the regression models of the environmental portion of work provide a robust description of the relationship between project and managerial characteristics and the duration of environmental activities. While there are differences between the fit of the overall duration and specific subsets of environmental activity, the measures of fit are sufficiently strong to indicate a good representation of the processes of conducting environmental work of OES.
Model 4 describes the relationship between project and managerial characteristics and the overall duration of an engineering design project (i.e., Project Design Duration). The dependent variable, Project Design Duration, was used in Model 4, and an additional independent variable, GDOT Design, was used to examine the differences in time duration between projects designed by consultants and projects designed by GDOT. The overall fit of the model of project design duration is even stronger than those for environmental duration, with an adjusted R2 of 0.98.
While we remain cautious about making predictive claims, the descriptive properties of these models provide us a basis for observing key contingencies in the development of environmental activities within an engineering design project. This, in turn, can provide OES with a framework for managerial decisions regarding the allocation of resources and time across a portfolio of projects.
Four classes of independent variables are included in the analysis for each of the dependent variables: a) project and activity conditions; b) consultant management; c) internal management; and d) regulatory management (see Figure 2-2). We found factors within each class of independent variables influential in each of the models.
Project and Activity Conditions. Project characteristics and technical duration describe the influence of factors related to governance and the type of work performed on the time durations of environmental projects and the overall engineering design project in the following ways:
Funding: Whether projects were sponsored by GDOT or a local government is a significant factor influencing the overall time duration of a project (Project Design Duration, Model 4). However, this was not a significant factor in the models describing the duration of environmental projects (Models 1, 2, and 3). In the descriptive statistics, engineering design projects sponsored by local governments took longer to complete on average. The regression analysis indicates a contrary finding: in Model 4, projects sponsored by GDOT take 113 more days to complete project design.
Environmental document type: PCE-type projects take 157 fewer days in project design duration (Model 4), relative to CE projects. EA-type projects take much more time to complete for all dependent variables (Models 1-4), holding other factors constant. For the models of the duration of environmental projects (Models 1, 2, and 3), PCE-type projects take fewer days relative to CE projects (73 days for Models 1 and 3 and 93 days for Model 2), but the differences are not statistically significant. EA projects account for a larger proportion of the duration of an environmental project as compared to the overall engineering design project.
Improvement type: These factors were measured as dummy variables with resurfacing and restoration projects serving as the reference improvement type. There is considerable variation in the amount of time devoted to different improvement types, and this is reflected in the length of time for environmental activities and project design. For example, bridge rehabilitation projects take around 1,250 more days in environmental activity durations than resurfacing and restoration projects, but are not significantly different in project design duration. Reconstruction projects with added capacity take around 1,400 more days in environmental summary and around 600 more days in project design than resurfacing and restoration projects.7
Activity duration: Technical duration is highly correlated with all dependent variables. With each additional day required for technical studies, the environmental summary and project design durations increase 0.83 and 1.07 days, respectively. Similarly, non-technical duration is also a significant factor in the duration of project design (Model 4). When the time devoted to environmental activities outside the technical studies increases by one day, the project design duration increases 1.05 days.
Consultant Management: The following consultant management factors influence the duration of the environmental project and the overall engineering design project:
7 Unlike other improvement types that take a longer time than the reference group, new road construction projects show an odd result in that the sign for the coefficient is negative. However, only three cases belong to this category, so the small number of cases might lead to this uninterpretable result.
NEPA documentation duration: Currently, 68.4% of NEPA documentation is completed by consultants. The length of time that consultants take to prepare CE and DEA documents significantly affects the environmental summary duration and non-technical duration. With each additional day that CE documentation requires, the environmental summary duration and non-technical duration increase 0.25 and 0.23 days, respectively. With each additional day devoted to DEA documentation, the environmental summary duration and non-technical duration increase 0.96 and 1.01 days, respectively. Unlike CE and DEA documentation, FEA documentation is correlated with the project design duration. With each additional day devoted to FEA documentation, the project design duration increases 0.27 days.
Outsourcing: The regression models include three variables related to outsourcing: ecology document review, NEPA documentation, and project design. Projects with consultant reviewers, on average, take less time to complete the environmental summary. However, these same projects take longer to complete the entire project design process. Projects with in-house NEPA documentation take more time to complete project design (Model 4). In contrast, projects with GDOT design teams take less time to complete project design. Outsourcing a task can help reduce the time for that task but increase the duration of other tasks that need the collaboration of in-house staff. Outsourced environmental tasks speed up the environmental summary process but lead to later delays in the form of longer total design processes.
Internal Management: The following internal management factors influence the duration of the environmental project and the overall engineering design project:
GDOT staff experience: The workload of the OES staff and project managers in GDOT can significantly increase the durations of environmental activities and project design. Ecologist experience (as measured by the number of projects assigned to the ecologist in our sample) is significantly related to the duration of the environmental project review. In contrast, PM experience and NEPA analyst experience are significant factors influencing the duration of the overall project design.
Regulatory Management: These are factors associated with the management of external relationships with environmental agencies setting standards and procedures for compliance. The following factors influence the duration of the environmental project and the overall engineering design project:
Regulatory relations: We observed the amount of time between when changes were made in federal regulations and/or procedures and the beginning of a project. Bat change lead time captures the time period associated with a change in federal regulations on endangered bat species; projects that had greater lead time take less time on the environmental summary and the overall project design. Procedural change lead time captures the time period associated with the PFPR change; in contrast, this lead time is associated with a slight increase in the duration of the environmental summary. Projects experiencing regulatory change during the life of the project did not report a significant increase in time duration. While projects experiencing multiple regulatory interventions during the life of the project do not show significant differences in environmental activity durations, they took less time to complete the overall project design.
Table 2-3 shows the results of the four OLS regression analyses. For consultant and regulatory management, t-tests with different sets of outsourcing and regulatory intervention variables were conducted. The results are attached in Appendix A.
Table 2-3 OLS Regression Results

Classification     Variables                             [1] Environmental     [2] Non-Technical    [3] Non-NEPA Doc.    [4] Project Design
                                                         Summary Duration      Duration             Duration             Duration
                                                         B (Std. Error)        B (Std. Error)       B (Std. Error)       B (Std. Error)
Funding            Sponsored by GDOT                     -26.45 (72.29)        -2.310 (71.52)       -26.45 (72.29)       112.8** (53.43)
Document type      PCE                                   -73.47 (73.32)        -92.57 (72.86)       -73.47 (73.32)       -157.3*** (53.57)
                   EA                                    572.8*** (180.4)      400.1** (182.9)      572.8*** (180.4)     284.0** (139.3)
Improvement type   Bridge rehabilitation                 1,254*** (295.6)      1,248*** (290.6)     1,254*** (295.6)     101.7 (187.7)
                   Bridge replacement (capacity added)   463.4* (249.6)        507.2** (245.6)      463.4* (249.6)       465.1** (178.7)
                   Bridge replacement                    390.7*** (113.4)      404.8*** (112.4)     390.7*** (113.4)     400.9*** (83.97)
                   New bridge construction               400.8* (233.5)        481.2** (230.3)      400.8* (233.5)       222.3 (147.9)
                   New road construction                 -872.9*** (327.6)     -749.9** (323.5)     -872.9*** (327.6)    70.75 (227.4)
                   Major widening                        515.8*** (194.7)      621.3*** (193.2)     515.8*** (194.7)     505.6*** (149.2)
                   Minor widening                        331.2** (158.5)       335.1** (155.8)      331.2** (158.5)      344.4*** (107.6)
                   Reconstruction (capacity added)       1,415*** (278.4)      2,069*** (322.9)     1,415*** (278.4)     637.6** (254.8)
                   Reconstruction                        812.0** (372.3)       889.5** (366.4)      812.0** (372.3)      -8.585 (234.9)
                   Relocation (capacity added)           296.2 (412.6)         442.2 (407.3)        296.2 (412.6)        287.7 (266.2)
                   Safety improvements                   297.6*** (109.4)      304.7*** (107.6)     297.6*** (109.4)     304.2*** (77.49)
                   Traffic management                    301.3** (126.6)       303.3** (125.6)      301.3** (126.6)      400.3*** (97.59)
                   Other enhancements                    340.8*** (116.0)      364.7*** (114.4)     340.8*** (116.0)     467.6*** (87.13)
Activity duration  Technical duration                    0.825*** (0.0419)     -0.166*** (0.0413)   0.825*** (0.0419)    1.069*** (0.0302)
                   Non-technical duration                -                     -                    -                    1.047*** (0.0399)
NEPA documentation CE documentation duration             0.249** (0.105)       0.232** (0.104)      -0.751*** (0.105)    0.105 (0.0691)
duration           DEA documentation duration            0.961*** (0.129)      1.012*** (0.128)     -0.0393 (0.129)      -0.238 (0.193)
                   FEA documentation duration            -0.128 (0.169)        -0.113 (0.166)       -1.128*** (0.169)    0.270** (0.137)
Outsourcing        In-house NEPA                         -22.18 (66.19)        -33.84 (65.68)       -22.18 (66.19)       99.83* (56.47)
                   Consultant ecology reviewer           -170.4** (81.94)      -170.9** (80.67)     -170.4** (81.94)     133.4** (58.83)
                   GDOT design                           -                     -                    -                    -119.0** (57.38)
GDOT staff         PM experience                         -1.581 (2.979)        -1.790 (2.937)       -1.581 (2.979)       6.401* (3.583)
experience         NEPA analyst experience               1.202 (1.365)         0.924 (1.351)        1.202 (1.365)        2.140** (1.035)
                   Ecologist experience                  3.114** (1.363)       2.792** (1.346)      3.114** (1.363)      0.400 (1.017)
Regulatory         Bat change lead time                  -0.708*** (0.244)     -0.634*** (0.241)    -0.708*** (0.244)    -0.634*** (0.224)
relations          Procedural change lead time           0.780* (0.463)        0.820* (0.461)       0.780* (0.463)       0.938 (0.578)
                   Bat change intervention               -20.58 (145.0)        9.105 (142.8)        -20.58 (145.0)       72.57 (105.1)
                   Total regulatory changes              -26.73 (60.60)        -38.71 (59.76)       -26.73 (60.60)       -89.00** (44.88)
Standard errors in parentheses; *** p<0.01, ** p<0.05, * p<0.1

Other findings regarding project and activity information are included in Appendix A: Environmental Activity Duration, Environmental Activity Duration of the Document Types, Task Outsourcing Based on Environmental Document Type, T-tests with a Set of Task Outsourcing, and T-tests with a Set of Regulatory Changes.
2.3.2. Analysis of P-6 and SharePoint Data
Estimates of the overall duration of environmental projects do not provide an accurate assessment of the level of performance associated with OES staff. Environmental projects have increasingly been outsourced and performed by consulting firms. The work of OES staff has become focused on the review of the work product of consulting firms. Examination of data from the Ecology Section SharePoint site provides a more accurate view of performance in the review of consultant work product and the evaluations provided by OES staff.
Table 2-4 provides summary statistics for data drawn from the SharePoint site. Perhaps the most telling indicator of the magnitude of the problem that OES confronts regarding consultant performance is that 64% of the documents had to be returned to consultants three or more times for revisions and corrections. Cutting down the number of documents that require multiple touches by OES staff represents an opportunity for substantial time savings by GDOT. GDOT staff report that they do not employ particularly stringent criteria during the review process; they are required to accept reports that meet the standard of "legally sufficient." However, even by this standard, over 57% of the consultant reports contained numerous and substantial errors.
On average, it took 86 days from document review assignment to transmittal to a federal agency for approval. Documents returned to consultants multiple times were particularly problematic. Documents returned to consultants four times or more required twice as much time to reach a stage of readiness for transmission to federal agencies in comparison to documents receiving approval on the first or second version. Similarly, documents containing substantial errors required 46 more days to complete review than documents with non-substantial errors.

Table 2-4 Ecology Document Review Durations

Classification                                                       N    M (Days)  S.D.
Total                                                                267  85.66     70.765
Transmitted version    1st version                                   8    14.88     14.015
                       2nd version                                   87   62.94     57.570
                       3rd version                                   111  91.70     67.731
                       4th or higher version                         61   116.34    81.091
Deficiency type        Non-Substantial Error                         98   59.92     47.359
                       Substantial Error                             153  106.14    77.589
Levels of complexity   1                                             71   71.04     60.816
                       2                                             64   84.58     71.980
                       3                                             96   94.23     74.852
                       4                                             36   93.56     73.662
Ecology document type  Addendum (ADDM)                               53   79.64     70.812
                       Aquatic Species Survey Report (ASR)           15   138.80    95.053
                       Biological Assessment (BA)                    4    93.50     64.697
                       Buffer Variance Exemption (BVE)               1    13.00     -
                       Buffer Variance Modification (BVM)            1    0.00      -
                       Buffer Variance Application (BVA)             23   66.17     50.818
                       Ecology Assessment of Effects Report (EAOER)  5    129.60    130.422
                       Ecology Resource Survey-Assessment of
                       Effects Report (ERS-AOER)                     57   98.49     74.781
                       Ecology Resource Survey Report (ERSR)         16   103.00    48.022
                       Individual Permit Application (IPA)           5    108.60    58.748
                       Memo                                          26   38.31     34.690
                       Practical Alternatives Review (PAR)           9    70.89     50.792
                       Pre-Construction Notification (PCN)           24   61.75     51.310
                       Permit Modification                           2    22.50     16.263
                       Protected Species Survey Report (PSSR)        26   117.12    73.281

Not surprisingly, more complex ecology reports required greater time to transmission. Document review duration also varied by ecology document type. For example, survey reports such as the Aquatic Species Survey Report and the Protected Species Survey Report took more time for review because they often required the completion of other document reviews for concurrent transmittal. More complex documents, such as the Ecology Assessment of Effects Report and Individual Permit Applications, also required more than 100 days to document transmittal on average.
The SharePoint data provide insight into the document review process conducted by OES ecology staff. While document reviews occupy a large portion of the work life of all OES staff regardless of area of expertise, they also occupy a relatively short amount of time in the overall life of an environmental summary project or an engineering design project. For example, the average review time for an ecology document with non-substantial errors was 59 days, out of an average environmental summary process of 788 days and an overall project design duration of 1,442 days (see Table 2-5).
Table 2-5 provides the results of a t-test of a sample of documents based on the deficiency types of the ecology documents.8 The difference in total review duration between non-substantial error documents (N = 111, M = 58.88) and substantial error documents (N = 183, M = 106.53) was statistically significant. The differences in the durations of environmental summary and project design were not statistically significant, but the substantial error documents showed longer durations than the non-substantial error documents.
8 For the interpretation of the t-tests and ANOVA, the p-value shows whether the differences between groups are statistically significant at a certain level. In general, when the p-value is less than 0.05, the difference is considered significant.
t-statistic (t): The t-statistic is computed under the assumption that the sample (or the residuals, in the case of a regression model) has a normal distribution, to test whether the value of the difference is sufficiently different from zero (approximately twice the standard deviation of the sampling distribution) to be considered statistically significant.
Standard Error Difference (S.E.D.): The standard error is the estimated standard deviation of the mean for each level of the independent variable, and the standard deviation of the sample means is expected to be close to the standard error. The standard error difference is the difference in the standard error of two groups.
95% Confidence Interval (95% CI): These are the lower and upper bounds of the confidence interval for the mean. A confidence interval for the mean specifies a range of values within which the unknown population parameter may lie, with 5% or less probability that the true value lies outside.
Sum of Squares: Sums of squares are applied as an overall measure of the variance of the error of an estimate. Each deviation is taken and squared, and all of them are summed. This sum is a measure of the variance, an estimate of which is obtained by dividing the sum of squares by the number of observations minus 1. The sums of squares are used in analyses of variance that test the ratio of two variances to check whether they are equal or not. Since the constant (number of observations minus 1) cancels out in the ratio, the sums of squares are sufficient.

Table 2-5 Document Deficiency Types and Task Duration

                         Deficiency type        N    Mean     S.D.     t (df)        p     S.E.D.  95% CI
Total Review Duration    Non-Substantial Error  111  58.88*   46.09    -6.745 (292)  .000  7.06    [-61.55, -33.74]
                         Substantial Error      183  106.53*  75.03
Environmental Summary    Non-Substantial Error  59   788.29   856.37   -1.861 (134)  .065  163.70  [-628.45, 19.08]
Duration                 Substantial Error      77   1092.97  1009.27
Project Design Duration  Non-Substantial Error  23   1442.39  1357.06  -1.616 (59)   .112  410.62  [-1485.04, 158.24]
                         Substantial Error      38   2105.79  1660.47
* p < 0.05

Table 2-6 provides the results of a one-way ANOVA based on document transmitted versions, which correspond to the number of times a document was returned to consultants. These differences are broken down between documents transmitted early in the review process (i.e., transmission after the 1st or 2nd review), those taking 3 reviews, and those taking 4 or more reviews to complete.
The difference in total review duration was statistically significant. Documents transmitted after the 1st or 2nd review took an average of 61 days in the review process (N = 102). Documents transmitted after the 3rd review took an average of 89 days to complete (N=136).


Documents requiring 4 or more reviews took an average of 116 days to be transmitted (N=74).9 Two years ago, in response to the high number of reviews, OES instituted a workshop system in which OES staff have the authority to bring in consultants at the stage of a 3rd review to correct deficiencies and pre-empt additional reviews prior to transmission. As was the case with the deficiency types, the differences in the durations of environmental summary and project design were not statistically significant.

Table 2-6 Document Returns and Task Duration

                         Transmitted Version  N    Mean    S.D.     F (df1, df2)     p     Sum of Squares
Total Review Duration    1st or 2nd           102  60.6*   61.10    15.394 (2, 309)  .000  1491988.872
                         3rd                  136  88.9*   64.54
                         4th or higher        74   116.3*  75.62
                         Total                312  86.1    69.26
Environmental Summary    1st or 2nd           54   932.1   878.71   0.784 (2, 136)   .459  129172504.489
Duration                 3rd                  57   919.5   932.39
                         4th or higher        28   1181.9  1187.80
                         Total                139  977.3   967.49
Project Design Duration  1st or 2nd           24   1504.3  1284.36  1.395 (2, 60)    .256  153219339.714
                         3rd                  26   1794.9  1419.36
                         4th or higher        13   2402.2  2200.54
                         Total                63   1809.5  1572.03
* p < 0.05

Additional findings in relation to ecology document review information are included in Appendix A: Document Review Duration of Ecology Consulting Firm, Document Review Durations of Project Improvement Types, Ecology Document Review Round Duration, Returns of Ecology Documents, Returns of Ecology Documents Based on Consulting Firms, and Returns of Ecology Documents Based on Improvement Types.
9 Scheffe tests also showed that the differences among the three groups are statistically significant. The Scheffe test is used generally in the context of one-way analysis of variance to test differences of means for more than two populations in a sample. It uses a constrained optimization method to test the statistical significance of differences of means in multiple groups with respect to all groups simultaneously rather than pairs of groups at a time. It provides narrower confidence intervals for each difference of means since it uses the information in the entire data set for each rather than the population in each pair at a time (for more information: https://en.wikipedia.org/wiki/Post_hoc_analysis#Scheff.C3.A9.27s_method).
2.4. Integration with Case Studies and Focus Groups
The results of the performance data analysis provided a foundation for the development of the comparative case analysis and the focus groups. First, the results were used to understand environmental work and document review processes and provided a reference for examining the status quo of environmental tasks in the context of PCE, CE, and EA projects. The performance data also provided a better understanding of the reciprocal nature of environmental project work and the document review process.
Our review of the performance data helped us identify several factors that contribute to the duration of the environmental summary review process and the overall project design process. We also identified several activities that shape the overall time devoted to the environmental summary and project design. Environmental summary and project design durations are inextricably and reciprocally linked: delays in one lead to delays in the other.
However, this relationship does not extend to the document review process. Multiple rounds of review were not a significant factor in the overall time duration of projects (or even the overall time duration of the environmental summary). Document review occurs at a time-critical juncture between the environmental work and the schedule for the project design. However, the factors that contribute to multiple rounds of review and poor document performance are not entirely attributable to the complexity of the project design. Thus, we examine the relationship between project communications and the quality of consultant performance more deeply in the comparative case studies and the focus groups.
Chapter 3 Comparative Review of Case Studies
3.1. Introduction
The case studies served two purposes for the overall research objectives of this project. First, they facilitated a review of the existing communication practices between GDOT and the OES consulting community. A comparative study design was used in order to gain understanding about the degree to which these communication practices influence performance outcomes. Second, the case studies provided an evidentiary basis to ground the focus groups around communication and performance topics. They clarified the types of communication practices in use, the current performance levels of OES consultants, and the impact their actions have on GDOT projects. This information facilitated the creation of project scenarios used to stimulate conversation among focus group participants.
This chapter provides a summary of the data across the cases. The individual cases have value as studies of management practices associated with environmental summaries and can be accessed in Appendix B. However, the overall design of this study focuses on understanding the relationship between existing communication patterns with consultants and the resulting performance levels experienced by GDOT managers during document review. The best lens for this topic is a comparison between cases performed at an acceptable level of quality and cases that did not meet this standard.

3.2. Methods
3.2.1. Case Selection
Using a multiple comparative case study design, we examined the communication patterns that occurred in the environmental summary work associated with six engineering design projects. Table 3-1 provides a summary description of the cases.
Cases were chosen from consulting communities involved in generating NEPA documents and ecology reports. The selected cases represent a cross section of the portfolio of work performed by OES in terms of the improvement type, the source of funds, and the type of environmental work performed. The primary criterion in grouping the cases was whether the performance was perceived positively or negatively during document review. Three high-quality document cases were matched with three low-quality document cases. The selection strategy further called for the inclusion of specific types of projects matched one-to-one between the high-quality document group and the low-quality document group, including the following: a) projects funded by local governments, b) projects performed by a single firm, and c) projects of the same improvement type.
We created a list of potential cases based on the case selection criteria and document review data provided by OES. The final selection of three high-quality document projects and three low-quality document projects was made in consultation with OES. While the cases were perceived by OES as good or bad performances with regard to document quality, this is not an assessment of overall project quality.

3.2.2. Data Collection
A semi-structured interview protocol (see Appendix C) was developed to explore specific aspects of communication practices between GDOT and the OES consulting community. The protocol was designed to investigate how communications of performance expectations are being transmitted to environmental consultants and sub-consultants and whether or not these communication practices contribute to the subsequent performance outcomes in terms of document quality. Questions examined the relationship between the source of the communication, the content of the message, the media used to communicate, and the receptivity by consultants.

Table 3-1 Case Overview

High-quality Document Case 1
  Project Sponsor: Local Government
  Improvement Type: Major widening
  Environmental Summary: Mar 2014 - Jun 2015 (15 months)
  Document Review Example: [ERS-AOER & PSSR] 54 days, 1 round of review
  Review Comment Summary: Perceived as a good document
  Design Consultant: Prime Contractor
  NEPA Consultant: Subcontractor I
  Ecology Consultant: Subcontractor II
  Interviewees (Consultants): Ecology Consultant, NEPA Consultant (also for low-quality document case 1)
  Interviewees (GDOT): GDOT Ecologist, NEPA Analyst

High-quality Document Case 2
  Project Sponsor: State Government
  Improvement Type: Bridge Replacement with No Added Capacity
  Environmental Summary: Mar 2014 - Dec 2015 (21 months)
  Document Review Example: [ERS-AOER] 71 days, 2 rounds of review
  Review Comment Summary: Perceived as a good document
  Design Consultant: GDOT In-house Design
  NEPA Consultant: Prime Contractor
  Ecology Consultant: Prime Contractor
  Interviewees (Consultants): Ecology and NEPA Consultant (also high-quality document case 3), Ecology Consultant (also high-quality document case 3)
  Interviewees (GDOT): GDOT Ecologist (also low-quality document case 2), NEPA Analyst

High-quality Document Case 3
  Project Sponsor: State Government
  Improvement Type: Bridge Replacement with No Added Capacity
  Environmental Summary: Jul 2013 - Jul 2014 (12 months)
  Document Review Example: [ADDM] 40 days, 2 rounds of review
  Review Comment Summary: Perceived as a good document
  Design Consultant: GDOT In-house Design
  NEPA Consultant: Prime Contractor
  Ecology Consultant: Prime Contractor
  Interviewees (Consultants): Ecology and NEPA Consultant (also high-quality document case 2), Ecology Consultant (also high-quality document case 2)
  Interviewees (GDOT): GDOT Ecologist, NEPA Analyst

Low-quality Document Case 1
  Project Sponsor: State Government
  Improvement Type: Bridge Replacement with No Added Capacity
  Environmental Summary: Apr 2012 - Jan 2014 (21 months)
  Document Review Example: [ADDM] 28 days, 2 rounds of review
  Review Comment Summary: Teleconference and workshop held
  Design Consultant: GDOT In-house Design
  NEPA Consultant: Subcontractor
  Ecology Consultant: Prime Contractor
  Interviewees (Consultants): Ecology Consultant, NEPA Consultant (also high-quality document case 1)
  Interviewees (GDOT): GDOT Ecologist

Low-quality Document Case 2
  Project Sponsor: State Government
  Improvement Type: Safety Improvements
  Environmental Summary: Aug 2014 - Mar 2016 (20 months)
  Document Review Example: [ERS-AOER & PSSR] 107 days, 3 rounds of review
  Review Comment Summary: Workshop held and 100+ comments
  Design Consultant: Prime Contractor
  NEPA Consultant: Prime Contractor
  Ecology Consultant: Subcontractor I
  Interviewees (Consultants): Ecology and NEPA Consultant, Ecology Consultant
  Interviewees (GDOT): GDOT Ecologist (also high-quality document case 2), NEPA Analyst

Low-quality Document Case 3
  Project Sponsor: Local Government
  Improvement Type: Construction of New Bridges
  Environmental Summary: May 2011 - Apr 2014 (36 months)
  Document Review Example: [ADDM] 104 days, 3 rounds of review
  Review Comment Summary: Returned without a complete review due to errors
  Design Consultant: Prime Contractor
  NEPA Consultant: Prime Contractor
  Ecology Consultant: Subcontractor I
  Interviewees (Consultants): Ecology Consultant, Ecology and Air/Noise Consultant
  Interviewees (GDOT): GDOT Ecologist, NEPA Analyst


The protocol provides a standard framework of topics addressed in every interview, including the following:
Professional background of the interviewee, including the level of experience, education, and training;
Project history, including processes and communications during pre-award, post-award, and post-submission;
Comparisons to other projects, including communication patterns with other public sector clients;
Recommendations, including how communication practices and work relationships can be improved;
Firm experience, including the firm profile, clients, and competencies.
3.2.3. Data Analysis
All interviews were recorded, transcribed, and coded by three members of the research team as a means of identifying the key factors and relationships. The use of three coders helps avoid inter-rater bias and ensures exhaustive coverage of categories. In the first cycle of qualitative analysis, descriptive coding was used to understand the processes discussed by respondents and the context of the project. In the second cycle of analysis, pattern coding was used to explain the relationships between key concepts. Data from the checklist questionnaire were also analyzed for systematic differences in responses between high-quality document cases and low-quality document cases. We also explored the questionnaire responses to determine whether there are differences between GDOT reviewers and consultants.
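Where multiple coders code the same segments, inter-rater agreement can be quantified. The sketch below is illustrative only: the report does not state that an agreement statistic was computed, and the coder labels and code categories shown are hypothetical.

```python
# Hedged sketch: pairwise Cohen's kappa as one way to check agreement
# among three coders. Labels and codes below are HYPOTHETICAL examples.
from collections import Counter
from itertools import combinations

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length sequences of code labels."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # Chance agreement: product of each coder's marginal label frequencies.
    expected = sum(ca[label] * cb[label] for label in set(a) | set(b)) / n**2
    return (observed - expected) / (1 - expected)

coders = {
    "coder1": ["process", "context", "process", "outcome", "context"],
    "coder2": ["process", "context", "outcome", "outcome", "context"],
    "coder3": ["process", "process", "process", "outcome", "context"],
}
for (name1, seq1), (name2, seq2) in combinations(coders.items(), 2):
    print(f"{name1} vs {name2}: kappa = {cohens_kappa(seq1, seq2):.2f}")
```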
3.3. Results
One of the primary goals of the case studies was to identify key points of communication between reviewers and consultants. We sought to understand the main communication patterns and dynamics in each case and to see if there are consistencies between cases. However, the goal was not to develop a detailed chronology of communications for each case.
As a first step, we assess the content of communication and compile an inventory of communication resources that provide knowledge, guidance and instructions to NEPA and ecology consultants. These resources create a framework for communication between reviewers and consultants. Furthermore, we investigate the amount of change in content that occurred over the lifetime of a case. In doing so, we examine the variety of signal channels and the amount of change in signal channels used by reviewers and consultants to communicate performance expectations during the case. We also explore the relationship between communication practices and performance outcomes with both OES reviewers and consultants.
3.3.1. Communication Content and Stability
a. Communication Content
Communication content refers to task-related information that flows vertically from OES (and other agencies) to NEPA and ecology consultants. GDOT provides communication content to its consulting community in a number of ways. Table 3-2 gives an overview of the different communication instruments that GDOT uses. With these instruments, GDOT creates the architecture for communication, which can be classified into two main categories:
Policy, Procedural and Regulatory Guidance and Instructions;
Document Preparation Guidance and Instructions.

Table 3-2 Overview of GDOT Communication Content and Instruments

Environmental Procedures Manual (EPM): The EPM is a 262-page document outlining how to successfully complete environmental projects. It is intended to be a detailed resource advising how each section of a project ought to be completed before being submitted to OES for review. It gives instructions on how to prepare documents that comply with both state and federal regulations in the preparation of both NEPA and Georgia Environmental Policy Act (GEPA) documents and includes OES expectations for the following:
  Quality control/assurance
  Public involvement
  Technical document preparation
  4(f) provisions prohibiting DOT use of land in significant natural/historical areas
  NEPA and GEPA document preparation
  Reevaluations
  Plan Development Process/scheduling
  Commitments (Green Sheets)
  Required early environmental activities
  Environmental certifications
  Responsibilities to local governments
  Environmental studies

SharePoint Site: The OES SharePoint Site is a web-based platform used to make up-to-date reference material available to both the consulting community and in-house OES staff. It supplies a shared space in which consultants and OES staff can access time-sensitive material. It contains contacts/addresses, templates for necessary documents, and a board listing announcements on a variety of relevant subjects, including methodological suggestions, procedural changes, and regulatory updates. Only consultants who request access can use it.

FTP Site: The FTP site is a secure site used for exchanging sensitive documents between consultants and OES staff. This site is employed by consultants and their reviewers during the review process to exchange comments, and responses to those comments, on submitted documents for environmental projects. All documents contained on the site are deleted after an interim period for security purposes.

GPTQ Meetings: Georgia Partnership for Transportation Quality (GPTQ) meetings are held at GDOT by OES and are open to anyone in the consulting community who chooses to attend. These meetings are held quarterly and revolve around relevant topics for environmental projects. Of special interest are topics and questions that have proven troublesome or of concern to OES staff. Consultants who attend are encouraged to participate and discuss how best to address these issues after the presenter has spoken on the topic.

Email Blasts: Email blasts are a supplement to the SharePoint site. They are used to disseminate up-to-date reference material and time-sensitive materials to consultants on the OES mailing list. This list is different from the list of consultants signed up to use the SharePoint site. Email blasts include information on template alterations, methodological suggestions, procedural changes, and regulatory updates.

Across the case studies, we observed communications of these contents occurring at multiple points during the process: prior to the project assignment, during the document preparation, and after the submission of the documents. Consultants are expected to have the knowledge and skills to produce quality ecology and NEPA documents prior to taking the project assignment. We observed a variety of reasons for the necessity of content communication during the life of a project. In some cases, document preparation standards changed over time, requiring communication prior to the submission of documents. However, in other cases, consultants were poorly prepared or had trouble keeping up with announced changes in content.
Policy, Procedural and Regulatory Guidance and Instructions
Consultants in both high-quality document and low-quality document case studies report accessing the EPM and attending quarterly GPTQ meetings. There was less evidence in the interviews that consultants receive email blasts from GDOT about procedures and regulatory guidelines. Only two ecology consultants (high-quality document case 2 and low-quality document case 2) mentioned that they would occasionally receive email blasts containing procedural and/or regulatory updates.
In addition to GDOT sources, there are other potential federal sources that inform consultants about policy and regulatory changes. The NEPA and ecology consultants in high-quality document case 1 mentioned that they actively look for non-GDOT sources of policy and regulatory information, such as the U.S. Army Corps of Engineers and the U.S. Fish and Wildlife Service. The ecology consultant noted that it accesses these resources because it serves a lot of clients other than GDOT.
Document Preparation Guidance and Instructions
Across the cases, consultants stress the importance of having access to up-to-date templates and formats for improving the consistency and quality of documents and ensuring compliance with federal standards. OES publicizes report templates through two channels: the SharePoint site and the GDOT website.10 One challenge noted by both consultants and reviewers is that not every consultant has access to SharePoint. For example, one NEPA consultant (high-quality document case 1) reported relying upon email blasts for template updates as he did not have access to the SharePoint site. There is little evidence across the case studies that OES provides content on preparing quality documents prior to submission or that consultants rely on other sources.
10 The section on "Receptivity" provides more detailed information on SharePoint.
b. Communication Stability
Communication stability refers to changes in the communication content impacting projects and changes in the structure and types of channels used in the communication. Changes may come from federal regulatory authorities, professional standards, or changes in GDOT procedures and discretion.
Policy, Procedural and Regulatory Guidance and Instructions
There was no indication of a regulatory change in any of the six cases after the project award. However, one of the six cases (high-quality document case 1) faced a regulatory change before project award. The Federal Highway Administration (FHWA) changed one regulation that made it possible to shorten the project schedule by turning an Environmental Assessment (EA) into a Categorical Exclusion (CE). Since this new regulation had not been tested in Georgia, the environmental consultant conducted substantial research and investigated whether the new regulation applied to his project. He presented his ideas to OES staff, GDOT design staff, and one FHWA reviewer. FHWA finally approved this approach, allowing the consultants and OES to complete a CE instead of an EA.
Consultants identified the EPM as the most important source of communication instability for GDOT projects. In every case study, consultants noted that the EPM is largely outdated and does not reflect current rules and regulations. Environmental regulations and rules have changed over time, but the EPM has not been updated to reflect those changes. The current EPM does not provide sufficient policy, procedural and regulatory guidance for consultants.
A second, related source of communication instability stems from frequent regulatory changes. For example, the NEPA consultant in high-quality document case 2 acknowledged, "there's no good way of sharing all this information. ... ecology is probably the most dynamic. It's changing all the time." Further, the GDOT ecologist in low-quality document case 3 described a limited capacity to monitor regulatory changes and communicate them to consultants: "there is the potential for a bit of a knowledge gap. [...] it's a struggle for me to just keep up with the current laws. [...] Sometimes that can be a bit of a beast for me to make sure I'm current and for everybody else." Thus, the interviews raised some concerns about whether information about regulatory changes is disseminated in a way that reviewers and consultants understand and that contributes to their work.
While we did not observe regulatory changes that occurred during the lifetime of a case, interviewees reported that OES implemented new standards for the electronic submission and revision of documents, along with standardized review timelines, roughly two years ago. Both consultants and reviewers in the high-quality document cases and low-quality document cases reported that these new standards made the submission of reports and the subsequent review clearer and easier.
Document Preparation Guidance and Instructions
Interviewees in both the high-quality document cases and the low-quality document cases repeatedly pointed out that the templates provided on the SharePoint site are not kept up to date (see high-quality document cases 2 and 3; low-quality document cases 2 and 3). The GDOT ecologist in high-quality document case 3 explained that keeping templates and the SharePoint site up to date is a challenge because of limited OES staff resources. GDOT ecologists (high-quality document cases 2 and 3; low-quality document case 3) noted that consultants are not always aware of updated templates and use old ones instead. For example, the GDOT ecologist in low-quality document case 3 commented: "... we don't necessarily tell them we updated the template. I've gotten submissions where they're using a three year old template, and I send them an e-mail, `Do you have access to the SharePoint site because we have new templates'." Consultants also noted (high-quality document cases 2 and 3; low-quality document case 3) that templates do not always provide clear guidance because they are "vague" and do not "necessarily have all the wording like they [OES] would want it" (ecology consultant, low-quality document case 3). For example, one ecology consultant noted: "the guidance isn't very clear and you get different guidance from different people on every other project." Another challenge noted by the same consultant was the timeliness of guidance: "we get comments back from GDOT [after document submittal] where they will then let us know about the guidance. But we didn't know about [the guidance] before submitting, even if the guidance came out before we did submit it."
One of the main themes in both high-quality document and low-quality document cases was reviewer variability, which refers to inconsistencies across OES reviewers in terms of the wording and formats that they prefer, as well as OES reviewers who do not stick to the templates when reviewing documents. For example, one consultant noted that the comments "differ among reviewers" (low-quality document case 3) and requested that reviewers follow "the template and [review] based on the template versus their own personal, grammatical expertise." Similarly, the consultant in high-quality document case 2 noted that reviewers are "a little bit different with how they like to review things. So depending on who you're working with, you get different comments." Thus, there is evidence within both high-quality document and low-quality document cases suggesting that missing, false, or inconsistent guidance caused uncertainty during the preparation of the initial draft, led to frustrations during the review process, and caused longer reviews.
Around the same time, GDOT also introduced a workshop procedure, bringing in consultants for a meeting with OES staff to resolve document deficiency issues during the document review. All three low-quality document cases reported that they attended one workshop.11 Interviews produced ambiguous results as to when exactly a workshop takes place. One ecology consultant (low-quality document case 1) reported that OES hosts workshops after the second draft. One GDOT ecologist (low-quality document case 2) said, "If we get a first draft that's not very good, we do a workshop really quick." Another GDOT ecologist (low-quality document case 3) pointed out that the workshop will take place after the third draft.
11 Low-quality document case 1: The reviewer is certain that he organized a workshop in order to resolve document deficiencies, but it is not clear which document he was referring to. Low-quality document case 3 had a workshop for noise only.
3.3.2. Signal Form, Stability, and Receptivity
This section focuses on signal forms that reviewers and consultants generally use for communications and investigates the actual points of communication in three high-quality document and three low-quality document cases. One of the central questions we investigated is how the communication architecture that GDOT provides is utilized and how the different communication instruments are accessed.
Signal Form refers to the variety of signal channels used by GDOT and consultants to communicate performance expectations during the case.
Receptivity refers to the degree to which consultants monitor and employ the channels of communication created by OES.
Signal Stability refers to the amount of change in signal channels that occur during the conduct of the case.
It is worth noting that the initial scope of this research project was to focus on the communication between reviewers and consultants. However, the interviews revealed that there are other stakeholders, in addition to the reviewer and the consultant, who a) have an impact on the communication of performance expectations during a case and/or b) are involved in the conduct of a project and have an influence on the consultant's performance. Thus, the analysis has to take these other stakeholders, such as public and private project managers, into account. Excluding them from the analysis would result in an incomplete picture of the communications of performance expectations to consultants and would not fully explain whether or not these communication practices contributed to the subsequent outcomes. These aspects are considered in the sub-section "Alternative Explanations."

a. Signal Form
There are a variety of signal forms and/or communication channels available to reviewers and consultants during the lifetime of a project. These forms (Table 3-3) can be clustered as follows:
Static and one-way communication, including the EPM, SharePoint, and email blasts;
Dynamic and two-way communication, including IT-based channels such as email and FTP, and conversation-based channels such as phone, meetings, and workshops.

Table 3-3 Signal Forms
Text-based/IT-based mediation
  Static and one-way: EPM, SharePoint, Email Blast
  Dynamic and two-way: Email, FTP
Conversation-based mediation
  Dynamic and two-way: Phone, In-person Meeting, GPTQ Meeting, Workshop
Some of the signal forms presented here are used for sharing general information on procedural/regulatory and document preparation aspects, including the EPM, GPTQ, SharePoint, and email blasts. Other signal forms are used for project-specific purposes, including phone, in-person meetings, emails, workshops, and the FTP site for the document review.
b. Receptivity
Receptivity refers to the degree to which consultants use the different signal forms available to them. We assessed receptivity with a questionnaire as well as during interviews. Overall, there are no significant differences between the questionnaire and interview results. We did observe significant differences between consultants and reviewers. Consultants use more dynamic two-way signals, including phone, email, and meetings, resulting in more active communication with reviewers. OES reviewers, on the other hand, rely more on static one-way signal forms such as the EPM and SharePoint, which may lead to a more passive communication style.

We also observed significant differences between high-quality document and low-quality document cases. High-quality document case participants use more diverse signal forms than low-quality document case participants. The most distinctive pattern was that consultants in high-quality document cases perceive conversation-based, two-way signal forms (telephone, in-person meetings, and GPTQ meetings) as more useful and clearer, and prefer to use them more often. Table 3-4 gives a detailed overview of the perceptions and usage of the different signal forms. Interview participants were asked to rate the usefulness, clarity, accessibility, and frequency of use for each signal form.
Overall, questionnaire results indicate that the text-based one-way channels were perceived as less useful and clear, while the conversation-based, two-way channels were perceived as more useful and clearer. The EPM was perceived as the least useful signal form. In terms of accessibility, the text-based channels that do not require secured access were perceived as more accessible. SharePoint, which requires secured access, and the quarterly meetings and workshops, which are available only to targeted groups, were perceived as less accessible channels. Among the various signal forms available to reviewers and consultants, email was utilized most frequently, while workshops were used least frequently.
There are significant differences between the high-quality document and low-quality document case participants in terms of their perceptions and usage of the different signal forms. Questionnaire results suggest that low-quality document case participants consider emails a more useful, more accessible, and clearer way of communicating with each other. However, they seem to use emails less frequently than high-quality document case participants. Except for emails, high-quality document case participants perceive all other signal forms (EPM, SharePoint, telephone, in-person meetings, GPTQ, and workshops) as more useful than low-quality document case participants do. However, this observation only holds true for usefulness. When looking at accessibility and clarity, the picture is less clear. Low-quality document case participants perceive emails, in-person meetings, and GPTQ meetings as clearer and more accessible, but the EPM as less accessible. In comparison, high-quality document case participants perceive SharePoint as clearer and more accessible, but telephone and workshops as less accessible. In terms of usage, high-quality document case participants use more diverse signal forms, including SharePoint, email blasts, emails, telephone, in-person meetings, and GPTQ meetings.
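Summaries like those in Table 3-4 can be produced by averaging Likert-style ratings per signal form and case group. The sketch below uses pandas with entirely hypothetical ratings; the column names and the 1-5 scale are assumptions for illustration, not the study's actual instrument.

```python
# Hedged sketch (HYPOTHETICAL data): mean perception and usage ratings
# of each signal form, split by case group, in the spirit of Table 3-4.
import pandas as pd

responses = pd.DataFrame({
    "signal_form": ["Email", "Email", "Telephone", "Telephone", "EPM", "EPM"],
    "case_group":  ["high-quality", "low-quality"] * 3,
    "usefulness":  [4, 5, 5, 3, 2, 2],   # 1 = not useful ... 5 = very useful
    "frequency":   [5, 4, 4, 3, 2, 3],   # 1 = never ... 5 = very often
})

# Mean usefulness and frequency of use per signal form and case group.
summary = (responses
           .groupby(["signal_form", "case_group"])[["usefulness", "frequency"]]
           .mean()
           .round(2))
print(summary)
```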

Table 3-4 Questionnaire Results, Perception and Usage of Signal Forms

EPM
  Perception: Least useful and clear, but (along with email) the most accessible form. High-quality document case participants rate the EPM slightly more useful and accessible, but less clear.
  Usage: Not used frequently. Low-quality document case participants use the EPM slightly more frequently.

SharePoint
  Perception: Less accessible than other forms; consultants perceive it as less accessible. High-quality document case participants perceive it as more useful, more accessible, and clearer.
  Usage: Frequently used. High-quality document case participants use it more frequently.

Email blast
  Perception: Less useful and clear than other forms, but more accessible. High-quality document case participants perceive it as more useful and accessible.
  Usage: Not used much. High-quality document case participants use email blasts more frequently.

Email
  Perception: Considered more accessible, but not less clear than in-person meetings or telephone. Reviewers rate emails as more useful and clearer. Low-quality document case participants rate emails as more useful, more accessible, and clearer.
  Usage: Most frequently used signal form. High-quality document case participants use emails more frequently.

Telephone
  Perception: More useful, clearer, and more accessible than other signal forms. Consultants perceive it as more useful and clearer, but less accessible. High-quality document case participants perceive it as more useful and clearer, but less accessible.
  Usage: Frequently used. Consultants use it more frequently. High-quality document case participants use the phone more frequently.

In-person meeting
  Perception: Most useful and clearest signal form, but less accessible than other forms. High-quality document case participants rate meetings as more useful and accessible, but less clear.
  Usage: Not used much. Consultants use it more frequently. High-quality document case participants use meetings more frequently.

GPTQ meeting
  Perception: About average in clarity, usefulness, and accessibility compared to all other signal forms. High-quality document case participants rate it as more useful, but less clear and accessible.
  Usage: High-quality document case participants use it more frequently.

Workshop
  Perception: More useful and clearer, but less accessible than other signal forms. High-quality document case participants rate it as slightly more useful and clearer, but less accessible.
  Usage: Least frequently used signal form. Low-quality document case participants attend workshops more frequently.


Interview Results
In order to investigate whether the questionnaire results align with the results of the interviews, we compared both datasets. Further, we used interview data to understand and explain why reviewers and consultants favor one signal form over another.
The interviews confirm the questionnaire results that the EPM is not a useful tool. While consultants in both the high-quality document and low-quality document cases pointed out that they use the EPM as a reference during the document preparation and revision, they also complained that the EPM is outdated and therefore not useful.
Interviews as well as the questionnaire revealed that OES reviewers and consultants in both high-quality document and low-quality document cases communicate with each other most frequently via email. Reviewers and consultants appreciate the many advantages of email, namely that it is a convenient, fast, and easy-to-use communication tool. One NEPA analyst (high-quality document case 3) also pointed out that he prefers emails to other signal channels because they serve as proof of communication: documentation that becomes part of the official record of the project, which is important for legal reasons. This might explain why reviewers rated emails as more useful in the questionnaire.
Consultants in the high-quality document cases pointed out that they particularly prefer phone over email during the document review phase because it saves time in clarifying comments and increases mutual understanding. For example, the ecology consultant in high-quality document case 1 noted, "Whenever I get comments back from them, I call the reviewer and find something to talk about on there. ... We can just talk on the phone and come to an understanding that renders that comment unnecessary, so I do a lot of that."
Further, the interviews verified the impression gained from the questionnaire that in-person meetings are rare, particularly because they are perceived to be less accessible due to scheduling challenges and GDOT's new security protocols. This is why consultants aim for alternative means of meeting participation, such as video conferences and conference calls. Furthermore, interviewees in both the high-quality document and low-quality document cases pointed out that meetings vary depending on project size and complexity, with more meetings for more complex projects. In both the high-quality document and low-quality document cases, it is either the project manager or the consultant, not the OES reviewer, driving the planning and organization of meetings. OES staff attend depending on their time availability, the subject matter, and the project complexity. For example, one reviewer (low-quality document case 2) noted that he only sees the necessity to sit in with consultants if the project is ecologically complex. Also, we heard from one NEPA analyst (high-quality document case 2) that he expects consultants to set up meetings and invite OES staff reviewers. We did observe significant differences between high-quality document and low-quality document cases in terms of the frequency of meetings: the interviewees of two high-quality document cases (2 and 3) reported that they had a large number of cross-office team meetings.
Interviews verified the questionnaire finding that workshops are the least used signal channel available to consultants and reviewers. Because the intent of a workshop is to resolve document deficiency issues during the document review, it only occurs if deemed necessary by OES staff. Workshops, as designed, only occurred in the low-quality document cases. Consultants in the low-quality document cases considered the workshops beneficial and reported very positive experiences. However, the consultants in low-quality document cases 1 and 2 complained about their negative stigma: "I have found that more to be, we get called in the Principal's office" (ecology consultant, low-quality document case 2) and "I think the negative stigma of the workshop doesn't necessarily need to be there because like I said, I had great experience with it, and I don't think it needs to be something that's seen as, `Your quality is so bad'" (ecology consultant, low-quality document case 1). Similarly, the GDOT ecologist in high-quality document case 3 noted: "it's in a sense almost like detention for-- when the teacher gives detention, she has to go also sit with that student at the end of the day."

c. Signal Stability
We examine signal stability across the flow of communication during the three main project stages, namely pre-award, pre-submission, and post-submission and transmittal to FHWA. One of the main objectives is to point out the main similarities and differences between high-quality document cases and low-quality document cases.
Across all cases, we do not see a continuous signal stream (i.e., continuous flow of communication) between the OES reviewers and consultants during the project. Rather, there is a passive initial process where information is made available to consultants and an active process following the submission of the document. In between, there is a lot of activity associated with doing the work, but not a lot of communication of quality expectations.
High-quality document cases tend to have more streams of interaction earlier in the process, before the document is submitted, and more proactive communication facilitated by the consultant during the document review. Moreover, interviews suggest that the amount and intensity of communication in the high-quality document cases is determined by the consultants, meaning that consultants are driving communication processes. Thus, there is a class of consultants in the high-quality document cases that is not only receptive to communication, but that takes ownership of the communication process.
Pre-Award: Across all cases, there is little interaction between OES reviewers and consultants prior to contract award. Most firms have a long partnership history with GDOT, and consultants were informed about the project opportunity and task order by the GDOT project manager or prime consultant. Thus, most cases reported that there would be some interaction between the prime consultant and the GDOT project manager before the project is awarded.
Pre-Submission: While GDOT provides resources that facilitate the document preparation (EPM, SharePoint) and communicate quality expectations, we did not observe communication between OES reviewers and consultants regarding those materials. Thus, across the cases, we observed relatively little communication about quality expectations between the consultants and OES reviewers during the ecology and NEPA work.
If there was communication, it was mostly in the high-quality document cases. Only high-quality document case 2 shows a substantial amount of communication between all parties, including email correspondence, phone conversations, recurring constructability review meetings, and avoidance minimization meetings with both NEPA and ecology stakeholders.12 In contrast, OES reviewers and consultants of the low-quality document cases 1 and 2 reported no communication before the document submission. In low-quality document case 3, there was some email correspondence between the NEPA consultant and OES.
If early communication between OES reviewers and consultants took place during the lifetime of the project, it was mostly driven by the consultant, not the reviewer. This observation holds true for both high-quality document and low-quality document cases. OES reviewers acknowledged that, generally, they would reach out to consultants before the document submission: "If possible, there is a little bit of pre-coordination" (GDOT ecologist, low-quality document case 3). Only in one instance do interviews clearly show that the GDOT ecologist reached out to the consultant and inquired about the status of the project (high-quality document case 3).
We do, however, observe communication between the consultants and GDOT project managers in all three high-quality document cases and in one low-quality document case. For example, the consultant ecologist in low-quality document case 2 reported no communication with OES reviewers, but did report coordination meetings with the project managers at GDOT before the document submission. Further, in high-quality document cases 1 and 2, we observed extensive communication between the consultant and the GDOT project manager, including in-person meetings, phone calls, and emails. Again, interviews suggest that the communication is mostly driven by the consultant, not the project managers.
12 We observed that reviewers experienced difficulties in remembering the project history and the actual points of communication with the consultant. Those times they could remember, they were referring back to emails. Another problem is that the reviewer we interviewed took over from another reviewer after the document submission. We do not know if and to what extent his/her predecessor was involved in the project.
Interviewees pointed out that the amount of pre-submission communication increases with the complexity of projects. Thus, more complex projects require a stronger signal stream and a greater variety of signal channels. This can be illustrated by the following quote: "the more complicated the projects are, the more long-term planning and communication is involved, reaching small milestones in between and checking in with consultants" (reviewer, low-quality document case 3).
Post-Submission and Transmittal to FHWA: Across the cases, the majority of communication between OES reviewers and consultants took place during the document review process. In particular, consultants in the low-quality document cases acknowledged that this communication practice would be standard procedure in most projects. One consultant noted "a majority of the communication happens at the back end" (low-quality document case 2) and another consultant mentioned that the communication "just came down to submitting reports and getting comments" (low-quality document case 3). Some OES reviewers in the high-quality document and low-quality document cases shared this opinion. For example, one NEPA analyst (high-quality document case 3) commented that communication is "by and large passing the paper back and forth."
There are significant differences between the high-quality document cases and the low-quality document cases in terms of communication patterns during the review process. Consultants of all the high-quality document cases were pro-active during the review process and called the reviewers immediately after they received comments. In the low-quality document cases, consultants were prompt to respond to the comments electronically, but not pro-active in addressing review comments by telephone or email.

After transmittal, there was not a lot of interaction between OES reviewers and consultants because most reports were approved by FHWA. Interviewees reported that OES reviewers generally address the comments of FHWA, if applicable, and send a revised document to the consultant for the consultant's review. In only one case (low-quality document case 2) was there more communication between OES, the consultant, FHWA, and the Natural Resources Conservation Service (NRCS).
3.3.3. Performance
Performance refers to the relationship between communication practices and performance outcomes with both OES managers and consultants. There are two important themes in this line of inquiry: 1) what is working and what is not; and 2) are there alternative explanations besides communications that may explain challenges confronted by consultants in meeting OES performance expectations?
a. Lessons Learned: What Works, and What Does Not?
The six cases revealed a number of communication practices that positively impact the performance of consultants. First, consultants and reviewers in both the high-quality document and low-quality document cases considered SharePoint and FTP important tools that facilitate their preparation and submission of documents. They acknowledged that these IT-based channels facilitate processes and communication, and thus have a direct impact on their performance. That being said, consultants in high-quality document and low-quality document cases highlighted that these sites need to be maintained and updated on a regular basis.
Second, consultants highlight the importance of document revisions and appreciate the feedback of OES reviewers. Across the cases, many consultants noted that reviewers do raise substantive questions and provide helpful comments, resulting in better quality documents and better consultant performance. Further, consultants in the low-quality document cases welcomed the opportunity to attend workshops with the reviewers because they facilitate mutual understanding and improve the quality of reports.

Third, interviews revealed that it is usually the consultant who drives project communication and determines which communication channels and patterns are to be used. Ultimately, it comes down to the consultant's project management and communication skills and efforts. In the high-quality document cases, we saw more communication and interaction between reviewers and consultants before and after the submission of documents. This finding suggests that early communication can have an impact on the quality and timeliness of submissions.
At the same time, the six cases revealed communication practices that negatively impact the performance of consultants. First, dissemination of project-specific information as well as information about regulatory and procedural changes is critical for the timely and quality preparation of environmental documents. However, existing materials (EPM, templates on SharePoint) are outdated and therefore not helpful for consultants and reviewers. Both high-quality document and low-quality document case respondents reported instances in which the use of old templates led to confusion, miscommunication and longer reviews.
Second, while FTP facilitates the submission and revision of documents, it also puts up a barrier for more active (i.e., conversation-based) communication. One consultant noted: "in theory, we respond to all of those comments electronically [...] and nobody ever talks to anyone" (high-quality document case 1). Interviews suggest that this communication practice does not facilitate relationship-building and may lead to extended reviews. The same consultant commented as follows: "We can just talk on the phone and come to an understanding that renders that comment unnecessary."
Third, while most consultants in the high-quality document and low-quality document cases appreciated feedback and comments from OES reviewers, they also raised concerns about the kind of comments they often receive, dismissing them as repetitive, painful, and not addressing the actual issues in the report. Similarly, some OES reviewers in the low-quality document cases complained about receiving documents that lack robust descriptions and explanations and contain too many grammatical errors. Further, some consultants in the high-quality document and low-quality document cases raised concerns about wide variability among reviewers, meaning that revisions are not consistent and reviewers point out different things. For example, one ecology consultant (low-quality document case 1) noted: "I think the quality of the documents we prepared was relatively uniform across, we would just get highly different reviews dependent on which reviewer was looking at it." Thus, inconsistent reviews may lead to confusion, misunderstandings, and prolonged reviews.
Fourth, there is little early communication between the OES reviewer and the consultant. The reviewer comes in relatively late in the process, namely after the submission of the documents. If there was early communication, it was mostly in the high-quality document cases. Further, early communication in the high-quality document cases was mostly driven by the consultant. These findings demonstrate the relationship between proactive communication by consultants, project performance, and project outcomes. Better communications may not directly result in better outcomes, but can lead to better performance by boosting cooperation and increasing mutual understanding.
b. Alternative Explanations
The case studies highlighted other variables that may impact the consultants' performance. These variables are not a function of communication, but instead are functions of structural and process-related factors which shape communications and influence the consultant's performance. Further, some of these factors are within OES's realm of control while others are not.
Assignment: The case studies revealed that OES reviewers come in relatively late in the process. This is largely due to the late assignment of reviewers. Interviewees pointed out that usually projects would not be assigned to a reviewer when the project was awarded, but rather shortly before or after the document was submitted. A respondent explained the system: "managers were waiting until we had something to do to assign a project, rather than overwhelming everybody and being like, keep tabs on all these things that we have nothing to do yet on." Thus, the late assignment hinders early communication between consultants and reviewers. Instead, GDOT project managers seem to fill the communication vacuum arising from the absence of the reviewer.
Workload: Interviews suggest that the late assignment of OES reviewers and the lack of early communication may be largely due to the heavy workload of reviewers. The heavy workload of reviewers and GDOT project managers was a theme across the cases. However, in most cases, reviewers did not give a specific number of how many documents they have to review. GDOT ecologists reported having 20 to 60 projects, while NEPA analysts have 40 to 100 projects.
Turnover: Interviews suggest that OES reviewer turnover is the norm rather than the exception. Each of the six cases experienced a change in the GDOT ecologist, the NEPA analyst, or both. One consultant noted: "And those get switched out so frequently that we have multiple reviewers for different projects" (low-quality document case 3). The high turnover imposes significant communication challenges for both sides. Interviewees across the cases highlighted that every turnover causes a loss of knowledge and potentially slows down projects.
Internal OES Communication: To minimize the challenge of turnover, transition meetings where the project is handed off from one reviewer to the next become imperative. However, transition meetings occurred in only two cases (ecology in high-quality document case 1 and high-quality document case 2).13 One GDOT ecologist noted that staff meetings and briefings after a project assignment would be "a luxury" (low-quality document case 3). Across the cases, we observed little internal communication between the NEPA analyst and the GDOT ecologist.
13 The NEPA analyst in low-quality document case 3 mentioned that there would be transition meetings if projects were re-assigned. However, it cannot be verified whether a transition meeting occurred in this case.
Training and Experience: One of the common themes in the cases was the low level of training and experience of new OES reviewers. Consultants in both the high-quality document and low-quality document cases noted that they have encountered reviewers who lack the knowledge and experience to provide quality feedback and comments on documents. For example, the ecology consultant in low-quality document case 2 noted, the "majority of the people need the understanding and the experience, and when you don't have that, you have people that are making comments that are irrelevant, or making you change things that really did not benefit the project, and this is all wasted time, wasted effort." Further, one NEPA consultant (high-quality document case 1 and low-quality document case 1) commented: "the concern I have is [...] do they really internally see where they need to mentor those younger, newer staff more?" Similarly, several OES reviewers reported that they did not have formal training when they started their job at GDOT. OES reviewers learn their trade on the job, gaining experience from simpler projects to more complex projects. Often, their review is revised by more seasoned staff, including their managers. One GDOT ecologist explained: "when I started three-and-a-half years ago [...] it was just you take on light projects and then [...] work [my] way up. And your manager slowly gives you projects that have more nuance to them. [...] There's no training that's really going to get you there other than what you're picking up along the way."
Role Definition and Role Clarity: Interviews raised questions as to how roles and communication responsibilities are defined and laid out, meaning who is supposed to communicate what, when, and with whom. Prior to our interviews, the assumption was that OES reviewers and consultants communicate with each other throughout the lifetime of a case. However, the interviews showed that OES reviewers in both the high-quality document and low-quality document cases seem to define their work as document review only and mainly communicate with the consultant after the document submission. Most OES reviewers in both the high-quality document and low-quality document cases spent the majority of the interview time describing processes and communication patterns during the document review and were rarely knowledgeable about other project phases. If they did communicate before the document submission, that communication was mostly initiated by the consultant. Thus, the cases produced enough evidence to suggest that the role of the reviewer in the overall process is very narrowly defined and communication is very limited.
Communication Guidelines: OES reviewers in the high-quality document and low-quality document cases expect consultants to initiate and lead communications and, if necessary, organize meetings. However, there is no evidence that this expectation is made explicit in contracts or other agreements. Further, interviews suggest that there is no requirement for OES reviewers to be involved before the submission of the documents, and thus to communicate with consultants during the document preparation. This allows the consultant to shape communication patterns as desired. Some consultants who are aware of the benefits of early communications and follow-ups will engage in proactive communication, but others will not.
Design and Schedule Changes: Communication of design changes is one of the biggest themes and concerns among interviewees in both the high-quality document and low-quality document cases. Interviewees suggested that some project managers share project-related information such as design changes and schedule changes with consultants upfront, while others do not. Some consultants (low-quality document case 1, low-quality document case 2, and high-quality document case 2) reported that sometimes they are informed of changes too late, which may impact their internal schedules and budgets significantly. This may be due in part to a lack of knowledge and understanding on the part of design and district offices, and in part to a lack of communication between GDOT offices. Interestingly, we observed more communication between the design team and consultants in the high-quality document cases and hardly any communication in the low-quality document cases (except some communication in low-quality document case 1).
Other Actors Communicating Quality Expectations: While both consultants and reviewers play a critical role in shaping communications, there are other stakeholders involved in the Plan Development Process that can have an impact on communications and the consultant's performance, particularly public and private project managers. In order to understand the relationship between the environmental procedure and project design delay, communication and interaction among these other stakeholders need to be investigated.
3.4. Input for Focus Groups
The case studies yielded valuable information for designing and organizing the focus groups held with consultants. To initiate conversation within the focus groups, we designed scenarios representing possible environmental projects, which we presented to the consultants at the meeting for discussion. The scenarios were designed based on information we received during the case study interviews and, though entirely fictitious, incorporated elements from real-life projects that we wanted to discuss.
The case studies made it clear that the actual structure of GDOT environmental projects is nonlinear, with many different processes happening simultaneously. They also brought up questions about how each project's scoping takes place and how the body of consultants views GDOT contracting and scoping for projects. They also raised questions about how consultants are perceived within GDOT, how that affects the review process, and the requirements necessary for individual consultants to be called in to a workshop in OES. GDOT turnover and reviewer variance were also topics of high concern in the case studies and were developed into conversation subjects for the focus groups.
In terms of communication, the case studies informed the questioning for the focus groups by serving as a guideline for how consultants typically communicate with GDOT staff, how frequently they are expected to do so, and how the communication is driven during the life of the project. Though the types of communication used with GDOT were similar across the case studies, the way communication took place varied widely. Different consultants communicated with different groups, some contacting many parties within GDOT and some only talking to their direct reviewers. Further, consultants engaged in communication very differently, some being passive while others were active initiators of conversation with GDOT. In the focus groups, we included talking points about whom the consultants talk with, whether they talk only to OES reviewers or also engage in communication with the project manager, design team, and others, and who initiated those conversations.

Chapter 4 Focus Groups and the Identification of Alternative Strategies
4.1. Introduction
Three focus groups were conducted with environmental professionals drawn from firms that provide OES consulting services. Focus groups provide a means for observing natural discussions among professionals regarding a shared experience. This qualitative research method is valuable for understanding common values and patterns of behavior and for reflecting upon alternative courses of action. In this research, focus groups are used to better understand patterns of communication with OES and other actors influencing the development of environmental documents. The focus group participants were also asked to reflect upon alternative strategies that could improve performance and reduce the number of times that documents are reviewed by OES staff.
Findings from the performance data and comparative case studies were used in developing pre-focus-group surveys and alternative performance scenarios based on existing patterns of behavior experienced by OES staff. These instruments were used to set the stage for the conversations regarding patterns of communication and strategies for improving performance.
4.2. Methods
Forty consultants were invited to participate in the focus groups. The list of forty was representative of a range of firms in terms of size, years of experience working with OES, and consulting services provided. The original plan was to hold two focus groups, but respondent interest was so high that we added a third. Two sessions were conducted on June 12, 2016, and one session was conducted on June 14, 2016. The two sessions on June 12 had 7 participants each, and the session on June 14 had 8 participants. In total, we had 22 participants (55% of those invited), consisting of ecology consultants, NEPA consultants, and consultant managers (see Appendix E Focus Group Participating Firms).
The duration of each session was about 100 minutes. The focus group activities were carried out in two phases. During the first phase, the research team had participants react to scenarios centered on the environmental document preparation and review process (see below for further information on the scenario assessment survey). The purpose of this phase was to foster a dialogue among the group participants regarding their experiences working with GDOT and to observe whether, and in what ways, their experiences were similar to or different from the scenarios.
In the second phase, consultants were asked to reflect on their experiences working with OES on environmental projects and for the public sector in general. Consultants were also asked to reflect on strategies by which communications and/or consultant performance could be improved on GDOT projects. Table 4-1 lists the key concepts and themes addressed in the focus group protocol (see Appendix F for the complete Focus Group Protocol). The topics and questions do not reflect a specific order used in each focus group. Instead, we let the discussion evolve among the participating consultants while ensuring that each topic was addressed at some point in the conversation.

Table 4-1 Key Concepts and Themes in the Focus Group Protocol.

Phase 1: Assessment of Scenarios
    Similarity of Scenarios with Experience
    Effective Use of Communication Channels
    Consultant's Reports and GDOT Responses

Phase 2:
    Communications of the Consultants: Typical Communications; Normal Job and Communications; Knowledge Flow Between Firm and GDOT
    Challenges: Technological Uncertainty of Environmental Work; Technical Complexity; Contractual Specifications and Obligations; Challenges for High Quality Environmental Documents
    Areas of Improvement: GDOT Compared to Other Organizations; Improvements in GDOT's Processes

The pre-meeting survey was designed as a means of understanding the initial perceptions of respondents prior to participation in the focus groups. The questionnaires (see Appendix G Pre-Meeting Survey Questions and Results) were sent to all 40 of the consultants invited to participate in the focus groups. There were 24 survey responses (60%). Focusing on consultants' perceptions of challenges in current OES projects and areas for improvement, the survey questions were designed based on the case study findings and the performance data. The main questions were as follows (a computational sketch of how such scale items are summarized appears after the list):
How critical the following factors [identified challenges listed] are to timely submission of high quality environmental documents (1: very uncritical to 5: very critical)
How effective the following approaches [identified areas of improvement listed] are likely to be on communications and timely submission of high-quality environmental documents (1: very ineffective to 5: very effective)
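The survey items use a 5-point scale, and Tables 4-2 and 4-4 summarize each item by its mean, standard deviation, N, minimum, and maximum. The following is a minimal sketch of that summary computation; the response values are hypothetical stand-ins, not the study's data, and the report does not state whether a population or sample standard deviation was used.

```python
from statistics import mean, pstdev

# Hypothetical 5-point responses for one survey item (1 = very uncritical,
# 5 = very critical). The study's actual responses are in Appendix G;
# these values are illustrative only.
responses = [5, 4, 5, 3, 4, 5, 4, 4, 5, 3, 4, 5,
             5, 4, 4, 3, 5, 5, 4, 4, 5, 5, 4, 5]

n = len(responses)         # N column (24 respondents)
avg = mean(responses)      # Mean column
sd = pstdev(responses)     # S.D. column (population SD assumed here)
print(f"N = {n}, Mean = {avg:.2f}, S.D. = {sd:.2f}, "
      f"Min = {min(responses)}, Max = {max(responses)}")
```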
In addition to the pre-meeting survey, we asked all participants to review two scenarios based on the experiences of the OES consulting community in preparation for each focus group. The scenario assessment survey was designed to help us understand whether a consultant's experience is similar to or different from the scenarios we compiled and to facilitate the focus group discussions in an organized manner.
Based on ten constructs of project conditions and ten constructs of performance conditions identified in the case studies and performance data, we developed five scenarios reflecting a range of performance across the consulting community, including some contrasting possibilities. The scenarios were composites of our collective findings and did not represent any one specific project. We included two scenarios with the survey: Scenario 1 assumed poor quality of reviewer comments on a simple improvement project, and Scenario 2 assumed poor consultant performance in responding to simple project changes.
The scenario assessment survey questionnaires (see Appendix H Scenario Assessment Survey) were sent to the 30 focus group participants who confirmed their participation before their focus group session, with 19 responses (63%) received. The questions focused on consultants' experience in communicating with OES and performing OES projects.
As with the data analysis for the case studies, all focus group audio recordings were transcribed, reviewed, and subsequently coded by three members of the research team as part of the process of investigating the experiences of the OES consulting community. The use of three coders helps avoid bias by any one data analyst. During the first cycle of qualitative analysis, descriptive coding was used to understand the processes discussed by respondents and the context of the project. In the second cycle, axial coding was used to identify the relationships between categories and codes.
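As an illustration of the two-cycle analysis, a minimal sketch follows; the excerpt identifiers, descriptive codes, and axial categories are hypothetical stand-ins, not the study's actual codebook.

```python
# Hypothetical first-cycle descriptive coding: each transcript excerpt
# is tagged with descriptive codes (the study's real codebook differs).
descriptive_codes = {
    "fg1_excerpt_07": ["outdated_templates", "sharepoint_gaps"],
    "fg2_excerpt_13": ["reviewer_inconsistency", "nonsubstantive_comments"],
    "fg3_excerpt_02": ["oes_workload", "unresponsiveness"],
}

# Hypothetical second-cycle axial coding: descriptive codes are related
# to broader categories, such as communication vs. non-communication.
axial_categories = {
    "communication_challenges": {"outdated_templates", "sharepoint_gaps",
                                 "reviewer_inconsistency",
                                 "nonsubstantive_comments", "unresponsiveness"},
    "non_communication_challenges": {"oes_workload"},
}

# Roll excerpts up into categories to see which themes each one supports.
for excerpt, codes in descriptive_codes.items():
    cats = {cat for cat, members in axial_categories.items()
            if members.intersection(codes)}
    print(excerpt, "->", sorted(cats))
```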
4.3. Results
4.3.1. Current communications and performance
We asked consultants about their experiences in communicating with GDOT and how these shaped their document quality expectations and performance. Focus group participants emphasized that the most effective communications are active and direct interactions with OES reviewers, which is consistent with findings from the case studies. The following excerpts represent the participants' perceptions of active communications:
"If you're trying to reduce the amount of time that it takes back and forth, you pick up the phone and you call the person, and you answer their [his/her] question." (from focus group 3)
"If you know who's reviewing your project, and you can get into the meeting with them [him/her] up front, I think it definitely helps." (from focus group 1)
Successful projects require that consultants proactively manage communications with all project team members, including other GDOT offices, consultants at different firms, and federal agencies. This is similar to the patterns illustrated in the high-quality document cases of our case studies. A participant in focus group 2 explained, "I probably sent 30 e-mails to that prime design firm. 'If anything changes with these culvert extensions, you have got to tell me because the ecology has to be updated.' [...] I must have sent 30 e-mails out to the Project Manager. And Federal Highways signed the document."
The importance of proactive communication as a management strategy was also emphasized in consultant reactions to the scenarios we provided prior to the focus groups. Perhaps the strongest reaction was that the scenarios did not capture the large amount of communication consultants have early in the life of a project or with actors outside OES. These communications include the following: 1) initial meetings for project scoping, budgeting, and scheduling; 2) initial meetings with the project manager and the procurement office; 3) coordination with consultant project managers and subconsultants; and 4) status updates and coordination with OES by the consultant task supervisor.
Many consultants pointed to the critical role played by project managers in environmental work. It is particularly helpful when the project manager has a good level of understanding of the importance of environmental work for design: "it's so nice when your engineer kind of knows a little bit" (from focus group 2). In general, good teamwork was stressed across the focus groups: "if you have the right mix of people and personalities on a team, it can really work as a team" (from focus group 2).
GDOT project managers can also be important in facilitating the OES review process. The following excerpts represent the participants' perceptions of the roles GDOT project managers play:
"The project manager at GDOT was phenomenal. I mean, he was over there at OES's office every single day when something critical had to be pushed through" (from focus group 2).
"When OES is sitting on my document review for one reason or another, and I say, ping [the project manager], if you want to move this project, would you please go ding my OES reviewer and tell them [him/her] to step on the gas?" (from focus group 1).
Consultants highlighted relationship-building to facilitate the review process. Across the focus groups, consultants noted the important role that OES reviewers can play in information sharing. An example of this perspective was shared in focus group 2: "NEPA has [a] more detailed response than the PM [project manager] does and actually fills the PM back in." A quality working relationship can also lead to more productive communications with the OES reviewer. One participant in focus group 1 noted the following: "Like for a reviewer that I've worked with for six years, he's not going to give me comments that are downplaying."
Other factors contributing to poor performance by consultants are low levels of consultant experience on state-level projects and weak internal QAQC processes. The following are examples:
"That's a problem of having people that don't have the level of experience doing the work that they need. [...] if the management hierarchy knew the level of quality of document that was being submitted in some cases with their company's name on it, they would be embarrassed and ashamed." (from focus group 1)
"QAQC is where the improvement really needs to happen." (from focus group 2) "How few times a consultant will say, `I don't know.' [...] It's like there was a fear of [not knowing
the answer], `I'm a consultant, I'm supposed to be the expert in this area,' [...] I don't think GDOT creates that culture where we're afraid, but I still see it." (from focus group 3) "Some [do] bad stuff over and over again, yet they're still getting work." (from focus group 1)
In the scenarios we provided prior to the focus groups, some of the factors that consultants most strongly identified with corresponded to points of disagreement or tension in communications with OES reviewers. Consultants also identified with the scenarios describing problems that arise from working with consulting firms that have limited experience in the environmental arena and recognized this as a potential source of poor document quality. Consultants strongly identified with scenarios describing problems arising from the following: 1) lack of early communication on projects; 2) the use of old versions of templates and late notification of changes in templates; 3) inconsistency between the first and second rounds of reviewer comments; 4) the number of non-substantive comments in reviews; 5) the difficulties in relationship building with OES reviewers; and 6) the potential for power struggles in judgment calls and interpretation of rules.
4.3.2. Challenges in OES projects
This section reviews challenges in OES projects identified in the pre-meeting survey and the focus group discussions. Table 4-2 shows the pre-meeting survey results on conditions critical to communications and the timely submission of high-quality documents. Consultants perceived design changes and regulatory and procedural changes as the most critical (4.42 and 4.08).14 The complexity of the environment of the project site and GDOT staff turnover also scored high (4.04 each), as did the overall complexity of the transportation project and the consulting firm's experience working with GDOT (3.83 each).
In contrast, projects sponsored by a local government and the consulting firm's role as a subcontractor were perceived as less critical (2.79 and 2.38). Compared to other conditions, the type of project sponsor and the terms of the contract showed wider variances (standard deviations of 1.22 and 1.31).
14 These are the average scores on a 5-point scale with 5=I completely agree and 1=I completely disagree.
Table 4-2 Critical Conditions to Timely Submission of High Quality Documents

Condition                                           Mean   S.D.   N    Min   Max
Design Changes                                      4.42   0.76   24   3     5
Regulatory/Procedural Changes                       4.08   0.70   24   3     5
Complexity of the Environment of the Project Site   4.04   0.93   24   2     5
GDOT Staff Turnover                                 4.04   0.98   24   2     5
Overall Complexity of the Transportation Project    3.83   0.99   24   2     5
Consulting Firm's Experience Working with GDOT      3.83   1.11   24   1     5
Seniority of GDOT Reviewer                          3.75   0.88   24   2     5
Miscommunication with Other GDOT Offices            3.33   1.14   24   1     5
Consultant Turnover                                 3.17   0.94   24   1     5
Terms of the Contract with GDOT                     3.04   1.31   24   1     5
Project Sponsored by a Local Government             2.79   1.22   24   1     5
Consulting Firm as a Subcontractor                  2.38   1.03   24   1     5

Twelve participants suggested other factors critical to OES project performance in an open-ended question, and many factors were common across responses. The following factors were emphasized (number of respondents in parentheses):
Comments not focusing on technical accuracy and legal sufficiency (6)
Reviewer inconsistency (5)
Evolving standards and report requirements (3)
Lack of communications between internal GDOT offices (3)
Double standard for in-house work (2)
Project scope change (2)
Lack of supervision by senior GDOT staff (2)
GDOT's impact determination ability
Over-documentation
Multiple rounds of full review that do not focus on tracking changes
Tight schedule for internal QAQC

Unlike the common responses that focused on challenges in interacting with GDOT, one respondent emphasized that the quality of the initial document depends solely on the consultant team: "document quality is driven by the consultant's field experience, regulatory knowledge, experience and knowledge of GDOT requirements and expectations, writing ability, and QAQC process."
Focus group participants also identified a number of non-communication challenges that relate to structural and process factors of working with GDOT. Structural factors describe the working environments within which OES pursues project-related tasks that require engagement with the consulting community; these include the procurement process, human resource management, and the federal regulatory environment. Process factors describe the formal and informal processes that prescribe the actions and contributions of OES to preconstruction design projects; these shape the overall relationship between GDOT, OES, and the consultant community and include 1) turnover in personnel on projects, 2) project management, 3) the workload of reviewers, and 4) inter-agency communication. Table 4-3 summarizes the communication and non-communication challenges. Notably, these challenges generated little contradiction or disagreement among focus group participants; nearly all participants endorsed them.

Table 4-3 Challenges in OES Projects (challenges identified by focus group participants as having a significant impact on the timely submission of high-quality documents)

Communication Challenges:
    Outdated EPM and templates
    No centralized information dissemination infrastructure
    No communications prior to regulatory and procedural changes
    Low quality of comments
    Inconsistency across reviewers
    Lack of standardized, objective review criteria
    Unresponsiveness of OES staff
    Impractical review timelines

Non-communication Challenges:
    Workload of OES staff
    High OES turnover
    Frequent reviewer re-assignments
    Role definition and role clarity
    Cultural barriers
    Frequent design changes and unexpected scope changes
    Procurement system rehiring low-performance consultants
    Unrealistic schedules

Communication Challenges
Focus group participants pointed out a number of communication challenges that they regarded as major hurdles to their performance. Some of these challenges arise at multiple points during projects, while others are particularly apparent during document review.
Outdated EPM and templates: In all three focus groups, consultants noted that much of GDOT's communication content, including the EPM and templates, is outdated. One consultant from focus group 1 commented, "it's been updated 100 times virtually but not physically on the system."
The concerns expressed over the manual and templates point to a larger problem of knowledge management and knowledge transfer. Consultants in focus groups 2 and 3 noted that although SharePoint was established with the intention of serving as a centralized information platform, it is not effectively used in that way. When template changes occur, they are not updated on SharePoint; when regulatory or procedural changes are introduced, that information is not always available on SharePoint either. Instead, GDOT uses various channels to communicate changes and updates, including emails and review comments. A consultant in focus group 2 described the issue in this way: "[...] the department has been trying to accommodate regulatory considerations and be proactive about that. But then its applications are piecemeal; and so it [new regulatory information] isn't in the SharePoint site and it's not necessarily something they [the department] want to institutionalize yet."
Consultants described how they now maintain large binders or electronic folders containing all of the email blasts and SharePoint announcements describing changes in rules, formats, and procedures. Sorting through this material to discern the current template is challenging for even the most experienced consultant (from focus group 3). Several consultants noted that they reuse the most recent document approved by OES in the hope that it will continue to be sufficient: "A lot of times, we will get notice that they [OES] changed their procedures by a review coming back to us" (from focus group 2). Thus, GDOT's current knowledge transfer management ultimately impacts consultant performance and can lead to additional and/or prolonged review processes.
Quality of comments and reviews: Across the three focus groups, participants criticized the art of reviewing, including the types and number of comments. A number of consultants noted that they often receive editorial comments (on punctuation, grammar, and wording) but few comments that address the technical quality of the report. These editorial comments were regarded as non-substantive as well as very subjective. For example, one consultant from focus group 3 commented, "they give comments dealing with grammar and things that don't really matter, non-substantive comments." Another consultant from focus group 2 explained, "They're always tweaking the wording." A consultant from focus group 1 described the issue this way: "If it doesn't change the intent of what the sentence said, then why change it."
Participants in all three focus groups criticized how the personal preferences of OES reviewers lead to inconsistencies in the number and type of comments provided. One consultant from focus group 1 explained, "even for the firms that are [...] experienced [...] and have a rigorous internal QA process, and deliver documents to GDOT with the initial delivery that are high-quality documents, you're still going to have two to three rounds of review from OES even if there's not some type of major, glaring technical error. [...] Just with personal preference." From the consultants' perspective, OES reviewers have wide-ranging differences in expectations as to what constitutes a high-quality document. Such inconsistencies create uncertainty for consultants and hinder their ability to prepare and deliver high-quality documents, which ultimately impacts consultant performance.
Lack of objective review criteria: Across all three focus groups, participants offered their own theories as to why inconsistencies across reviewers exist. A number of consultants attributed the inconsistencies to an absence of standardized, objective review criteria and guidance for OES reviewers. Reviewers are given considerable discretion to make judgments. One consultant in the first focus group explained, "I see the problem is they don't have a standardized guidance that they follow themselves internally. If everybody basically has to follow the same template, then if you change the person off you lose some institutional knowledge certainly, but you're not encountering problems like this where somebody's judgment was radically different from somebody else's judgment and therefore you're having to go back out and do additional fieldwork that you didn't have to do."
The absence of standardized, objective review criteria also makes it difficult for consultants to prepare a high-quality document because expectations are unclear and non-transparent. One consultant in the second focus group noted, "there is nothing in guidance that really tells you what you're doing. You're always having to constantly sort of modify what you're putting in." In addition, consultants in the first focus group acknowledged that the EPM is supposed to provide "a set of procedures." The problem, however, is its implementation: "different analysts [...] interpret the EPM differently and apply it differently," resulting in inconsistent feedback and comments. One consultant described the following situation: "I'll be in a NEPA review with a person who will remain nameless. And it sounds like they're developing policy on the fly on the phone while addressing the comments. [...] And I'm like, 'Whoa man. This sounds totally different than anything I've ever done before, but okay if it will get over to Federal Highway, you know, we'll do it.'"
Review timelines: Participants in focus groups 1 and 3 also discussed the review timeline, which allows 30 days for the initial review and a set schedule of days for the second and third reviews. Participants in focus group 1 acknowledged that document review has improved since GDOT implemented new procedures. However, they also noted that the 7-day revision window provides too little time to receive information from other project stakeholders, such as district engineers, and incorporate that information in the document. Participants in focus group 3 mentioned that the initial 30-day review window can be too long for certain documents: "Some projects are very standard and you're still taking a really, really long time, when there's really nothing substantial or nothing-- there's no real problem to solve. It's just getting through the process is taking way too long." Further, consultants in focus groups 1 and 3 criticized OES reviewers for not holding themselves accountable to the timeline. For example, one consultant in focus group 1 pointed out, "I'm complying with my end of that bargain, but I don't really see, especially, after I submit comments. [...] they're not complying with the 15 days [of the second review]."
Compounding the problem of review timeliness is the perceived unresponsiveness of OES reviewers and GDOT project managers, a topic brought up and discussed in all three focus groups. A large number of participants referred to the difficulties they encounter in communicating with GDOT OES and described situations in which the unresponsiveness of OES staff impacted their performance. For example, one participant in focus group 1 commented, "If I email in a question [...] and you don't write me back and I contact you again two weeks later, and in two weeks later. That's another month that we've lost. And a month is a lot and for what we do." Another participant in focus group 3 noted, "Even on projects where we're in a highly urgent scheduling situation, responsiveness is problematic." Finally, one participant in focus group 2 reported, "I have an environmental PM [project manager] that I don't believe exists, even though we've submitted a lot of stuff."
Non-Communication Challenges
Focus group participants pointed out a number of non-communication challenges that impact their performance. These challenges relate to structural and process factors within OES. Although not directly related to communication, they affect communications between OES and consultants. Moreover, we noticed that communication and non-communication challenges reinforce each other.
Workload of reviewers: Participants in all three focus groups were concerned about the high-volume workload that OES staff must manage, which creates an environment of constant stress and leaves limited resources for engaging in communications with consultants and updating materials such as the templates and the EPM. One consultant in focus group 1 noted, "That's the only way they operate, is always under crisis. It's never a non-crisis operation." Further, heavy workloads impair OES's ability to effectively manage projects and trigger late project assignments, which in turn impede early project communication between reviewers and consultants. In fact, consultants suspect that heavy workloads are one of the main reasons OES staff are unresponsive. One consultant in focus group 1 explained, "That's my problem with communication at OES [...] I don't even know that this can be fixed with the workload that they have. [...] I have had a project for 18 months. I can't get a response for anything for 17 months. And then its due for right of way [...] And I'm working on other stuff. I tried to get an answer five months ago on this, and now I've got to get all of this stuff done in one month so we can hit the right of way." Thus, the heavy workload of OES staff becomes a challenge for consultants as well because it affects their ability to effectively manage communications and complete projects on schedule.
OES turnover: Participants in all three focus groups were concerned about high staff turnover at OES. The turnover creates two different challenges for the consultant. First, many of the more experienced OES staff leave, and new people come in who lack the knowledge and experience to perform their work, which can impact the quality of reviews. For example, one consultant in focus group 3 who formerly worked at OES reported, "I have been gone three years. I bet 80% of that office has turned over. That means 80% of those folks are less than three years. That is junior, junior, junior, junior, junior. I don't care if you got a senior position staff and you have got to promote and now they are senior. They are not senior." Second, the high turnover increases the workload of senior-level personnel, who must train newly hired people and review their work, which in turn limits the time they can devote to project management. One consultant in focus group 2 noted, "It is always them [the managers] teaching the new people, they hand that up to their bosses for review. That's the bottleneck effect because it is two or three people handing to one, and they have to sit there and put their own comments on it, give it back to you to give to consultants. It's a hard chain there, because most of the time those managers are actually having to help the people."
Reviewer re-assignments: Participants in focus groups 1 and 2 discussed the frequent re-assignment of reviewers to projects, which creates two challenges for consultants. First, participants explained that every time a project is re-assigned, project knowledge is lost because there is insufficient internal communication and knowledge transfer. One consultant in focus group 1 referenced a recent situation he had encountered: "I guarantee you in that five minutes between [the re-assignment of] the project, there was no communication of the history of the project or where it was." Another participant in focus group 1 noted that the negative effects of re-assignment would be mitigated if transition meetings existed: "But the thing is, if you had one person that was the task supervisor, coordinator, it would not matter so much as the other groups or the specialty people are rotating around because you'd have somebody to get them up to speed as well as yourself, and you had also have some knowledge that it's happened." Second, frequent re-assignments aggravate existing communication challenges with OES, particularly the problem of inconsistencies across reviewers. For example, two consultants in focus group 2 explained, "you will have just a few comments, you get that addressed and you get this worked out, and then another reviewer comes in [who] finds a whole new [view on the document]."
Role definition and role clarity: Participants in all three focus groups spent a substantial amount of time discussing how reviewers understand their role and responsibilities within GDOT OES. Consultants noted that OES reviewers struggle to find the right balance between the goals of environmental protection and project delivery. A large number of consultants introduced the theory that many of the younger reviewers want not only to minimize impacts on the environment but to protect and save the environment. For example, one consultant in focus group 3 commented, "I think what you're struggling with is, you want to protect everything. That's not necessarily the department's mission. The department's mission is to build roads." OES reviewers who lean more heavily towards environmental protection may apply a very rigorous review that includes asking for additional, unnecessary work. Thus, the lack of role clarity can impact the document review, and thereby consultant performance. One consultant in focus group 1 explained, "And while they have to comply with the environmental laws and follow the implementation process that FHWA wants them to follow, it seems like they [younger reviewers] go above and beyond what they necessarily have to do." Similarly, one consultant in focus group 2 reported, "Organizationally, their [young reviewers'] goals aren't geared to support the management goals when it comes to those -- and that's where all the environmental pressure comes in, on meeting those initial deadlines and initially the lead schedule. It's going to create that conflict constantly." Another consultant explained: "there definitely is not ... enough higher-ups ... in this organization telling the folks and the new folks at OES, 'Our number one job is to get projects built. [...] We are here to make sure we follow every law and that we minimize and avoid impacts where possible.'"
Focus group participants further acknowledged that GDOT managers would have to make an effort to ensure role clarity and communicate GDOT's mission and goals to OES reviewers: "I think there is a lack of clear communication from GDOT management all the way down on [...] what are the basic goals of Georgia DOT - for them to understand, day one, it's about providing safe, efficient transportation" (from focus group 2).
Focus group participants perceive the role of OES as poorly defined within the current GDOT structure, such that OES is not a full partner in the project team. For example, consultants in focus group 2 described situations in which they saw a disconnect between OES and their projects: "It's completely silent and you don't have one person in OES who knows all about this project and knows when things all have to come together." Another consultant in focus group 2 asked, "Are they there to review everybody else's documents or are they there to be a resource?" Participants in focus group 1 identified the same challenge as follows: "It feels like OPD [Office of Program Delivery] has ownership of the project, OES does not. [...] OES can kind of pull back from the process. They are at the table but not part of the team. And that's where you get your disconnect [...] there's an effort to involve and meet the specialists and the T-Pro process. And at least copy them so they know what's going on. But at the end of the day, it's OPD [that] owns it with the PM being held accountable professionally, and the consultant. But OES isn't--."
Cultural barriers: Participants in focus groups 1 and 2 described instances in which they experienced a cultural divide between themselves and OES staff. One participant in focus group 2 defined the problem as follows: "They're hiring us to help them deliver a project, yet there's more of an adversarial relationship, as opposed to a team relationship." Interestingly, consultants reported that their projects were most successful when all project stakeholders worked together as a team. Cultural barriers can hinder effective communication between OES and consultants and ultimately jeopardize timely completion of a project.
Design and project changes: The communication of design changes was another topic brought up by participants in all three focus groups. Due to the interdependence of tasks, communications with other offices and design consultants are critical. Often, OES consultants are not informed of project changes in a timely manner, which has significant impacts on their schedules and leads to re-work. For example, one consultant in focus group 3 noted, "during the process of a project, there are things happening with design, and design needs to move forward so that we can get the information we need to prepare the environmental studies, and we don't get what we need from design when we need it [...] the project manager doesn't necessarily push that envelope very much, but once it's done, it's now all ecology's fault."
Project changes are partly caused by the hierarchical relationship with authorizing agencies, whose unexpected instructions can be hurdles in environmental work: "[Currently] they're obsessed with the buffer between the bike lane and the travel lane. [...] You can foresee bikes, but next day it's going to be sidewalks" (from focus group 2). Consultants assumed that requirements for excessive work beyond compliance and extra regulatory hurdles come from authorizing agencies. For example, one consultant in focus group 2 reported, "It's easier for GDOT to just not be controversial and just give in."
Another consultant in focus group 3 explained, "We're doing a lot of picking on the staff within OES, but I think they're victimized, overpowered by Federal Highway."
Procurement: Consultants in focus groups 1 and 2 emphasized that the consultant's level of experience is critical for performance and pointed to drawbacks of the current procurement system. For example, one consultant in focus group 1 noted, "Performance doesn't factor into who gets work awarded next time." The following issues are examples of how procurement can hinder consultant performance (respondents referred mostly to local projects): 1) expectations that re-work due to project changes is costless; 2) the absence of a budgeting standard; 3) the lack of detailed task descriptions; 4) the lack of understanding of scoping; 5) double standards between consultant projects and work performed in-house; and 6) delays in the procurement process.
Schedules: In all three focus groups, unrealistic schedules were considered another challenge hindering a consultant's ability to submit documents on time. Focus group participants reported that they often did not receive a feasible schedule. For example, one consultant in focus group 3 explained, "And even from the beginning of a project, you are asked to comment on a schedule, and you provide comments back and say that this schedule does not fit within your program [...] but, you provide those comments and nothing changes."
4.3.3. Areas for improvement
Consultants offered suggestions for areas of improvement and strategies that OES might pursue. Table 4-4 lists the approaches for achieving the timely submission of high-quality documents recommended by consultants in the pre-focus-group survey. Consultants perceived expanded use and modification of templates and an update of the EPM as likely to be effective (4.13 and 4.00). Regular team meetings and active project coordination by NEPA analysts were also perceived as effective (4.08 each).
Table 4-4 Effective Approaches to Timely Submission of High Quality Documents

Approach                                                                       Mean   S.D.   N    Min   Max
Expanded use and modification of templates                                     4.13   0.78   24   3     5
Regular meetings with PM, designer, and reviewers after the kick-off meeting   4.08   0.81   24   2     5
Active coordination by NEPA analysts                                           4.08   0.95   24   2     5
Environmental Procedural Manual update                                         4.00   1.08   24   2     5
Use of deliverable checklist                                                   3.71   1.14   24   2     5
Avoidance of GDOT staff turnover                                               3.67   0.90   24   1     5
Easy access to T-Pro and SharePoint                                            3.50   1.12   24   1     5
Early workshop                                                                 3.42   1.04   24   1     5
Trainings for District/Design Offices                                          3.38   1.07   24   1     5
On-board training for firms new to GDOT projects                               3.33   0.80   24   2     5
Hiring consultant reviewers                                                    3.29   1.06   24   1     5
Dedicated GDOT staff for information dissemination and T-Pro/SharePoint update 3.25   1.16   24   1     5
Expanded use of online tools (T-Pro, SharePoint, FTP)                          3.17   0.94   24   1     5
Making a pre-submission review step                                            2.79   1.29   24   1     5
Consultant evaluation system by GDOT                                           2.79   1.15   24   1     5
Expanded project information in T-Pro comments                                 2.71   0.98   24   1     5
Flexible review timeline at reviewer's discretion                              2.71   1.27   24   1     5
Incentives for timely submission of no-return documents                        2.46   1.22   24   1     5
Penalty for delayed submission of incomplete documents                         1.88   0.97   24   1     5

In addition to those prescribed approaches, twelve focus group participants suggested additional strategies to improve communications and document submission in an open-ended question about effective approaches. The suggested strategies include the following (number of respondents in parentheses):
Streamlining redundancies (revising requirements, eliminating narrative, etc.) (4)
Consistent reviewer comments (through training, well-defined guidelines, etc.) (3)
Changes in OES staff attitude in communicating with consultants
Reducing workloads of OES staff
PM training
Timely dissemination of information, including internal GDOT communications
Analyzing communication breaks
Allowing supplemental agreements to contracts due to project changes
Disclosure of both documents and review comments for quality control
One consultant provided a specific recommendation with regard to improving the information available in GDOT tracking systems: "[The] T-Pro database should be amended to include: milestone completions throughout the NEPA process (i.e., air completed, noise completed, HRSR [Historic Resources Survey Report] completed, PIOH [Public Information Open House] completed, etc., Ecology pending because.....), the schedule of design (PFPR [Preliminary Field Plan Review] completed, ROW [Right of Way] delayed because......), etc. This will allow the entire team (consultant, GDOT Design PM, OES, GDOT management) to view over the life of the project the stage at which NEPA is complete to date and provide everyone with a snap shot of what is left to do. It will also assist new and junior staff (GDOT and consultants) involved in the NEPA process [to] understand the overall process and appreciate the length of time to complete all the tasks involved. This also could be used as a future tool by GDOT in understanding historically why NEPA documents are on schedule or not (where are the delays historically)."
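A minimal sketch of the kind of amended milestone log this consultant describes follows. The field names, project identifier, and notes are illustrative assumptions; the actual T-Pro schema is not documented here.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MilestoneEntry:
    """One NEPA/design milestone as it might appear in an amended T-Pro log.
    Field names are illustrative, not the actual T-Pro schema."""
    name: str                    # e.g., "Air", "Noise", "HRSR", "PIOH", "PFPR", "ROW"
    completed: bool = False
    note: Optional[str] = None   # e.g., "Ecology pending because ..."

@dataclass
class ProjectLog:
    project_id: str
    milestones: List[MilestoneEntry] = field(default_factory=list)

    def snapshot(self) -> str:
        """Return the 'what is left to do' view the consultant recommends."""
        done = [m.name for m in self.milestones if m.completed]
        pending = [f"{m.name} ({m.note or 'in progress'})"
                   for m in self.milestones if not m.completed]
        return f"{self.project_id}: complete={done}; remaining={pending}"

# Hypothetical project ID and entries, for illustration only.
log = ProjectLog("PI-0001234", [
    MilestoneEntry("Air", completed=True),
    MilestoneEntry("Noise", completed=True),
    MilestoneEntry("Ecology", note="pending because field survey window missed"),
    MilestoneEntry("ROW", note="delayed because design changes under review"),
])
print(log.snapshot())
```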
Two consultants expressed specific opposition to any incentive or penalty system for performance because many of the causes of delay are outside the control of the subject matter experts. One consultant specifically opposed the development and use of a checklist, arguing that "any one-size-fits all solution, like a 'deliverable checklist' just creates additional paperwork that will slow down submittals." In contrast, another consultant explained that some report types would benefit from a checklist approach.
Participants in all three focus groups discussed a number of communication and non-communication strategies and introduced several areas for improvement. It is worth noting that every strategy presented in this section was suggested by focus group participants. Table 4-5 summarizes the suggested communication and non-communication strategies. While some strategies could be implemented by OES immediately, others are outside of OES's control.
Table 4-5 Areas of Improvement in OES Projects

OES Intervention
    Communication Strategies:
        Template and EPM update
        Centralized information dissemination (e.g., dedicated staff for SharePoint maintenance)
        Use of checklist
        Standardized, objective review criteria
        Early communication
    Non-communication Strategies:
        Outsourcing review tasks (divergent views)
        Designating a task supervisor (e.g., Florida DOT's decentralized system)
        Transition meetings
        Clearly defining roles and responsibilities

Non-OES Intervention
    Communication Strategies:
        Interactions with engineers and designers and regular team meetings
        Technology options for information sharing (e.g., project log in T-Pro, subscription to Department of Natural Resources database)
    Non-communication Strategies:
        One-point project management (e.g., project manager)
        Overall procedure reform/streamlining (e.g., clear resource identification and detailed engineering studies before awarding environmental task)

There are several communication strategies that fall within the OES sphere of responsibilities and are amenable to implementation. First, focus group participants identified the need for updated materials, including the EPM and templates. Second, although OES has already established a centralized information system, the SharePoint site, this resource could benefit from greater attention to updates and knowledge management practices; further, the site is not currently accessed by all consultants. Consultants suggested designating one OES staff member as a knowledge management specialist to keep SharePoint up to date and to inform consultants about changes. Third, to ensure better quality of comments and consistency across reviewers, standardized, objective review criteria are needed. Consultants pointed out that fewer rounds of review can lead to lighter workloads for reviewers. Fourth, some consultants suggested the use of checklists for certain project types: "we could simplify all these studies with less writing, all the narratives you're talking about. That's going to have [the] potential to improve so many of these [review variance] factors." Finally, consultants suggested early assignment of reviewers to enable early communications and upfront team meetings.

A variety of additional strategies were suggested by consultants during the focus groups. In many of these suggestions, the consultants were drawing upon their experience working with other state DOT programs on environmental projects. Additional strategies include the following:
1. Consultants stressed that OES has to make sure that roles and responsibilities are clearly defined as well as aligned with GDOT's overarching goal and mission. OES senior managers have to communicate GDOT's mission more clearly to OES reviewers and ensure a common understanding of the main responsibilities of reviewers. This will reduce the variability experienced by consultants in OES reviewer comments.
2. Consultants across all focus groups emphasized the importance of teamwork between environmental specialists, engineers, and designers: "through the whole entire process, the environmental people and the engineers truly have to work closely together" (from focus group 2); "The collaboration between design and environmental is essential." (from focus group 3)
3. Technology solutions for information sharing and access were also suggested by consultants: "A log section to [T-pro], it would also help Office of Program Delivery understand why the project's not on schedule, why NEPA's not on schedule." (from focus group 2) Another consultant suggested that OES should emulate South Carolina where the DOT provides other types of information to consultants: "in the state of South Carolina [...] you basically get a subscription to their [DNR] database." (from focus group 3)
4. Consultants suggested that OES consider appointing a project leader to coordinate information across OES sections. An alternative approach suggested by consultants is as follows: "If you were to put the NEPA project managers for GDOT [...] managerially under the project manager, it'd be a lot better because they'd have skin in the game." (from focus group 2) However, other consultants noted that this may require project managers to have a better understanding of environmental procedure.
5. Other consultants stressed the importance of holding transition meetings when a project is changing hands between OES reviewers: "a good example would be when OPD changes project managers, they have a project management transition meeting and both the project managers are there and the consultants are there and whoever else is there." (from focus group 1)
6. Each of the focus groups noted that, in comparison with other states, OES requires more narrative and more points of data entry and has less standardization in reporting documents; streamlining the overall process was highly recommended. The following are examples: o In the current practice of GDOT, the resource identification and risk assessment that is supposed to be conducted during concept development in the PDP is often conducted simultaneously with effect assessments. This step needs to be streamlined. One focus group participant made this point while contrasting the GDOT system with the Florida DOT (FDOT) system: "Basically, we wait until we're into the project to know our risk. Whereas Florida assesses the risk and identifies the resources prior to the project going towards design. The way we do it, the consequence is ... that if you identify it [risk and resources] while you are going through design you open yourself to more risk and schedule delays." (from focus group 3) o In the current practice of GDOT, many project changes, including design changes, occur during environmental review. Several consultants indicated that this needs to be avoided, citing the example of FDOT: "they [FDOT] are requiring very, very detailed engineering studies to be done before they will approve even just the very first phase of the environmental document." (from focus group 3) o In the current practice of GDOT, permit applications are prepared after all environmental review tasks, which sometimes requires re-work of those tasks. Consultants (from focus group 3) suggested ways to streamline the process through the following comparison: "North Carolina [DOT] has a good approach with their [NEPA] merger process ... that you have concurrence points ... so you don't step back." Concurrence points are defined steps in the project development process at which participating federal and state agencies sign off and pledge to abide by decisions made unless there is some fundamental change in the conditions of the project.
Chapter 5 Conclusion and Recommendations
5.1. Summary and Conclusion
This study was designed to generate alternative strategies that OES may use to promote the communication of quality expectations to consulting firms providing environmental services. In this section, we develop these strategies based upon the key findings from our three research tasks:
Task 1: A review of performance data from recent GDOT engineering design projects.
Task 2: A comparative review of high-quality and low-quality document case studies.
Task 3: Three focus groups with representatives from OES consulting firms.
Our analysis has focused on consultant performance in two of the subject areas within the OES sphere of responsibilities: ecology studies and NEPA analysis.15 In each phase of the research, we observed consultant performance in two important ways: 1) the duration of key activities in the development of the engineering design projects that OES activities support; and 2) the number of errors and rounds of OES review associated with documents submitted by consultants.
We find that better communication strategies can assist in addressing both of these problems. However, we also find that improved communication between OES staff and consultants is unlikely to be a panacea for either problem. Improved communication strategies will need to be accompanied by process improvements in OES work flow and enhanced coordination with other units in GDOT (most notably project managers and procurement staff) in order to attain significant improvement in performance.
15 Other subject areas within OES include air and noise pollution, archaeological and historic properties, community resources and environmental justice.
The Need for Improvement in the Quality of Environmental Documents
The magnitude of the problem confronting OES with regard to the existing quality of consultant documents is significant. The experience with Ecology documents provides the clearest indication that a problem exists. In Table 2-4, of the 267 Ecology documents in the SharePoint database, 172 (64.4%) required three or more reviews before being transmitted to federal regulators for review. On average, Ecology consultant reports are returned for correction and revision 2.8 times16. We note that this is probably an undercount: in our examination of the number of document reviews prior to transmission to the federal regulator, the upper bound on the range was four, but the actual measure is four or more reviews (61 of the 267 documents are in this category). Perhaps more telling, only 8 documents (3.0% of the 267) were transmitted after the first review.
We also reviewed a small sample of nine documents, drawn from randomly selected projects, that had been returned to Ecology consultants three or more times. On average, these nine documents comprised 134 pages of reviewable information and received 24 substantive comments (i.e., errors in the environmental analysis) and 91 non-substantive comments (i.e., grammatical and formatting errors) during the first review. Given that two or more reviews trigger an in-house workshop between consultants and OES staff, we estimate that roughly 3,440 GDOT/OES man-hours were dedicated to the 172 projects that required three or more reviews over the last two years.17 Here again, our estimate may be an undercount of the actual hours spent on documents requiring multiple rounds of review. The arithmetic behind both estimates is reproduced in the sketch following the footnotes.
16 This average is based on the information in Table 2-4: [(1 review × 8 documents) + (2 reviews × 87 documents) + (3 reviews × 111 documents) + (4 reviews × 61 documents)] / 267 documents = 2.84.
17 This is based on respondent interviews that estimate 8 hours of work following the second or third review triggering a workshop, 8 hours devoted to organizing and conducting the workshop, and 4 hours in final review leading to transmittal.
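The figures above follow from the Table 2-4 counts and the hour assumptions in footnote 17. A brief verification sketch (treating the open-ended "four or more" bin as exactly four reviews, which is why these are undercounts):

```python
# Ecology document counts by number of reviews before transmittal (Table 2-4);
# the last bin is "four or more" reviews, treated here as exactly 4.
docs_by_reviews = {1: 8, 2: 87, 3: 111, 4: 61}

total = sum(docs_by_reviews.values())                          # 267 documents
avg = sum(r * n for r, n in docs_by_reviews.items()) / total
print(f"Average reviews per document: {avg:.2f}")              # 2.84

three_plus = docs_by_reviews[3] + docs_by_reviews[4]           # 172 documents
print(f"Three or more reviews: {three_plus} ({three_plus / total:.1%})")  # 64.4%

# Footnote 17 assumptions: 8 h post-review work + 8 h workshop + 4 h final
# review for each of the 172 multi-review documents.
print(f"Estimated GDOT/OES man-hours: {three_plus * (8 + 8 + 4):,}")      # 3,440
```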
The experience of the Ecology Section is not unique amongst the different specialty areas within OES, according to senior OES staff. Ecology documents tend to be longer and more complex in terms of the number of technical elements, but the pattern of multiple reviews of consultant reports is common across the specialty areas. This suggests that the savings to GDOT and the OES workload from a reduction of even a single round of review would be considerable.
In interviews, OES staff shared views reported in earlier studies (Ashuri et al. 2016) that environmental factors are a major risk for schedule delay. We find that the duration devoted to environmental activities in project design is significantly related to the overall duration of projects (see Table 2-1). However, the time devoted to the document review process is not significantly related to the duration of either the environmental work on a project or the overall project design process (see Tables 2-5 and 2-6).
A better way to understand the impact of the document review process is through the opportunity cost to OES of the man-hours devoted to multiple reviews. In leadership interviews, case studies, and focus groups, there is consensus that the dominant task shaping the work of OES staff is document review. GDOT ecologists and NEPA analysts report working with a steady flow of a large number of documents to review throughout the year. The flow of documents is so great that OES has hired consultants to serve as reviewers of documents. Senior managers report that large amounts of their time are devoted to checking the reviews of subordinates, serving as reviewers on particularly complex or time-sensitive projects, and troubleshooting problematic document reviews.
The pattern of work reported by OES staff and consultants is indicative of an office that has recently struggled with a time famine. Perlow (1999) observed time famines in engineering teams where work must be performed under tight time frames but is subject to interruptions and changing demands from the external environment. This context creates conditions where workers do not have sufficient uninterrupted time to complete tasks, and a culture emerges in which tasks are completed under intense pressure and crisis. Under such conditions, workers tend to identify key choke points in their areas of responsibility where they can get maximum production capability and define their work around that point of activity. Interview data with OES staff and focus group data indicate that OES has achieved this level of focus around document review. However, other areas of responsibility further upstream have not received the same levels of concentrated activity.
5.2. Recommendations
The proposed strategies are designed to improve communications with consultants. The key goal driving the design of the strategies is to achieve a more efficient management process for the document flow through OES.
Strategies for Improvement in the Quality of Environmental Documents
Strategy 1: The most consistent recommendation across all the focus groups is that OES should devote greater managerial attention to the EPM and the templates for generating environmental documents. OES staff acknowledge that the current EPM and associated templates are out of date and have not been updated in several years due to a lack of available personnel.
OES has attempted to compensate by using email blasts, GPTQ presentations, and storage of new guidance on SharePoint to inform the consultant community of changes in procedures and reporting formats. Consultants who have worked with OES for several years report maintaining large binders or electronic folders of material to reflect the variety of forms of guidance that have emerged. This has now reached a stage where it is difficult to keep abreast of the currently preferred document formats. Consultants who have recently begun working with OES report that it is difficult to establish a clear starting point for performance expectations. There are also consultants who are not SharePoint users and have poor access to updated guidance.
The absence of up-to-date templates also creates challenges for OES staff. In the case studies, we interviewed OES staff who had recently joined GDOT. They report that their training in reviewing consultants' work products occurs on the job and through feedback provided by senior staff. Knowledge that should exist in a standardized form is being disseminated tacitly via word-of-mouth and guidance from senior staff. This adds uncertainty to the work of new OES staff and increases the workload of senior managers.
Strategy 1a: If OES does elect to update the EPM and associated templates, there is a strong consensus amongst the consultants that updates should focus on streamlining OES documents. Consultants with experience working with other public agencies stated that the current reports required by OES are cumbersome and overly detailed, particularly for purposes of a CE assessment. They note that the templates they use in working with other state DOTs or other federal agencies are shorter and more standardized, are clearer about what the agency expects for meeting the standard of legal sufficiency, and have fewer points in the report where the same data must be entered. According to consultants, the closest agency to GDOT in terms of document requirements (as well as providing detailed document review) is the National Park Service.
There was a consensus amongst consultants that OES does expect high quality documents. But the perspective was also shared that OES makes life hard for itself by achieving quality through a process that requires multiple rounds of review. Consultants would prefer a process where the up-front guidance from OES is up to date, as standardized as possible, and clear about what will meet the standard of legal sufficiency, and where they can engage with document reviewers earlier in the process to address points of ambiguity.
Recommendation 1.1: Consider conducting a best practice review of templates and communication of guidance amongst state DOTs and any other exemplars in the public transportation policy domain. While consultants understood the OES process to be challenging, they did not have a ready exemplar in the region. Advantages were noted in working with the state DOTs of North Carolina, South Carolina, Florida, and Texas, but the consultants also acknowledged important differences or recent changes in those states that might make them less suitable exemplars.
Recommendation 1.2: If human capital constraints remain a concern, OES should consider hiring a consultant to assist with developing an update of the EPM and templates. There is some risk associated with this approach if it is not done in collaboration with both OES staff and consultants. A few OES staff and consultants expressed concern that consulting firms might not be responsive to template updates or might not be informed about updates in a timely manner. One approach to overcoming both consultant resistance and late dissemination would be to augment the consultant's work with input from a standing subcommittee, within the OES GPTQ process, devoted to reviewing templates. Within the consultant community, there are high quality professionals with a deep understanding of state-level and OES operations. Some are former OES employees, some have experience serving as document reviewers for OES, and others have extensive experience working with other states.
Strategy 2: For some time, OES has been developing a feedback form for assessing consultant performance. This is an important activity and should be fast-tracked to an operational procedure. In our analysis, documents that had to be returned three or more times to consultants for correction were not evenly distributed across the consulting community; they cluster within a subset of firms and consultants.
Both OES staff and consultants acknowledge that at present there is little quality control in the hiring of environmental consultants. OES staff describe approaching documents from particular consultants, and even some consulting firms, with caution. There is a known subset of consultants that consistently produce poor quality documents.
Ideally, feedback from OES would provide information to procurement staff that might be used to weigh decisions for awarding contracts. However, even if this goal is not achieved, feedback forms can be used in two important ways. First, they can provide a record of the firms and consultants who consistently perform at a poor level. This can be an important resource for continuity of knowledge within OES. One of the key challenges facing OES (like many public agencies) is a high level of turnover amongst employees. At present, knowledge of low-performing consultants is disseminated in a tacit fashion and may not be widely shared. This leads to OES staff learning on the job about consultant capabilities. Earlier awareness of past performance levels can give OES staff the ability to budget their time more effectively.
A second possible application of feedback forms is to provide a record of performance to the consulting firm. A comment during the focus groups from consultants who have provided document review services to OES was that senior managers in a consulting firm might not be aware of low levels of performance. This limits the ability of senior management to engage in professional development or to discipline low-performing workers.
Recommendation 2.1: OES should implement plans for a feedback form that provides performance assessments to the procurement office. The information provided should be in a form that procurement can use as a weighting factor in making project awards. There should be a pilot test for a selected set of contract awards to determine how the feedback information can be used in awarding contracts. There should also be a follow-up assessment with procurement staff to determine if modifications are needed in the form.
Recommendation 2.2: OES should consider targeted on-boarding for new environmental consulting firms associated with the workshop process. OES already gives staff the discretion to declare the need for an early workshop. Consulting firms that have gone through the workshop report it to be a positive experience and are strongly supportive of this innovation by OES. During the focus groups, consultants were quick to note that they do not want a return to mandatory training requirements. However, there is room for targeted on-boarding activities for firms new to OES contracts. New firms might be flagged in the OES schedule as candidates for an early workshop if the OES reviewer sees sufficient errors.
Strategy 3: OES should consider developing procedures for normalizing the range and types of comments offered during the document review process. The edits and guidance provided during document review constitute the period in which OES is in most active communication with consultants. One of the strongest points of feedback provided by consultants during the focus groups is that there is wide variability in the types of issues identified for correction by OES reviewers. Formats and writing styles that pass muster with one reviewer will be inadequate for another. This level of inconsistency across document reviewers makes it difficult for consultants to predict OES expectations for quality performance. Many simply use the format of the most recent report that passed the review process and hope that it continues to meet expectations.
There are several dimensions to this problem raised by consultants. First, consultants do not perceive a consistent standard from OES reviewers regarding legal sufficiency. Second, there is a perception amongst consultants that OES reviewers do not give consistent weight to the goal of project delivery. Respondents in all three focus groups indicated that, on a scale of priorities between environmental protection and project delivery, some OES reviewers do not strike the appropriate balance and tilt too heavily towards environmental concerns. Third, consultants report that turnover amongst OES staff is common even within the life of a single project. One source of frustration is that when there is turnover, there is rarely a hand-off process between OES reviewers to facilitate continuity, which can lead to wide swings in performance expectations.
In addition to varying standards of legal sufficiency, consultants report wide variability in the level of decorum used by OES reviewers in expressing criticism of documents. OES reviewers get involved in projects late in the overall design process. Oftentimes their first communication with the consultant occurs after the document has been submitted. The preferred mode of communication for OES staff is through electronic media. This means that the notes and comments provided in the document can seem curt or rude. More important, from the consultants' perspective, there is no indication of which comments are priorities to the reviewer. Several consultants reported in both the case studies and the focus groups preferring to call OES reviewers after receipt of comments to gauge which comments need to be addressed for legal sufficiency and to determine if the reviewer is, in fact, as irritated as the comments seem to indicate (frequently, the reviewer is not).
Recommendation 3.1: OES should consider creating a community of practice for new staff to aid in their training and acculturation to OES standards. Topics of particular importance to the consulting community include: 1) normalization of standards of legal sufficiency across reviewers, 2) best practices in communicating with consultants, 3) being up-to-date on templates, and 4) GDOT's goals and mission.
Communities of practice are informal learning venues anchored by an information technology (IT) hub and meetings for sharing tacit knowledge regarding document review and examples of good and/or challenging practices. The existing SharePoint sites for the Ecology Section and NEPA Section can be ideal vehicles for this type of information exchange. Institutionalizing currently informal communications, such as transition meetings, within communities of practice can also help transfer project-based knowledge and ensure internal communication after project turnover. Communities of practice have been set up within GDOT before with some success. They are not designed to be a permanent fixture within an organization but can be formed around topics that need the sharing of professional knowledge. An example of such a community is the GDOT RAID group (Roundabouts and Alternative Intersection Design), which started as a community of practice and helped create more formalized processes of information sharing within design processes.
Strategy 4: OES should develop practices for communicating performance quality expectations to the project managers (both within GDOT and in consulting firms). Data from the case studies and the focus groups demonstrate that OES is not the only source of communication regarding quality and performance expectations for environmental activities in the PDP. GDOT project managers and consultant project managers have a strong voice in shaping the performance of OES consultants. The dominant goal shaping these communications is project delivery.
OES consultants note that project managers with high levels of awareness of environmental issues are easier to work with and are effective at integrating environmental information into design processes in a timely way. However, they also report high levels of variability in the level of environmental awareness amongst project managers in both the public and private sector.
Project managers can also be actors in the document review process. When the project schedule is very tight, OES consultants keep project managers in the loop whenever there are delays or high demands by OES staff for corrections to environmental documents. In both the case studies and in the focus groups, we observed narratives of project managers being called in to troubleshoot a document "held up" in OES.
Coordination and communication with project managers become particularly sensitive when regulatory and procedural changes are introduced by federal agencies. OES is responsible for compiling and coordinating the guidance from environmentally-related authorities at the federal and state levels of government. Within the performance data, an example of this was evident with the application of regulations associated with an endangered bat species whose habitat extended into northern Georgia counties.
Recommendation 4.1: Pilot test targeted communication experiments with project managers and consultants designed to ascertain whether early communication of quality expectations to consultants and project managers has an impact on the duration of projects and the quality of the environmental documents produced. Such an approach would randomly assign a set of projects of similar complexity to a treatment and a control group. The treatment group would receive targeted communications prior to the concept meeting regarding OES performance expectations. The control group would experience the existing patterns of work between OES and project managers. In order to get timely responses, the experiment should focus on a subset of shorter-term PCE or CE projects. The goal of the experiment is to see if early communications improve performance by consultants. A sketch of how such an assignment could be implemented follows.
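For illustration only, the following minimal sketch shows one way the random assignment in Recommendation 4.1 could be carried out. The file name, column names, and the stratification by document type are assumptions for the sketch, not features of GDOT's systems.

```python
# Hypothetical sketch: randomly assign eligible PCE/CE projects to a
# treatment arm (early communication) or a control arm.
import pandas as pd

def assign_pilot_groups(projects: pd.DataFrame, seed: int = 42) -> pd.DataFrame:
    """Stratify by document type so the two arms stay comparable."""
    def label(group: pd.DataFrame) -> pd.DataFrame:
        shuffled = group.sample(frac=1.0, random_state=seed)  # shuffle rows
        shuffled["arm"] = ["treatment" if i % 2 == 0 else "control"
                           for i in range(len(shuffled))]
        return shuffled

    eligible = projects[projects["doc_type"].isin(["PCE", "CE"])]
    return eligible.groupby("doc_type", group_keys=False).apply(label)

# Example usage with a hypothetical project list:
# projects = pd.read_csv("projects.csv")   # assumed columns: pid, doc_type
# print(assign_pilot_groups(projects)["arm"].value_counts())
```

Alternating assignments within each shuffled stratum keeps the arms balanced in size and document-type mix, which matters for a pilot with a small number of projects.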
Recommendation 4.2: Develop communication strategies that are targeted to project managers. Current practices, such as email blasts, are targeted to the consulting community. Project managers are also informed of changes in federal rules and procedures, but a key question is whether this information is being transmitted at a time and in a form that is likely to be useful to them. In interviews, we learned that there have been high levels of turnover amongst GDOT project managers over the last few years. Information transmission should be adapted to the additional needs that project managers may face when new to the role.
Strategy 5: The data systems currently maintained by OES and GDOT have proved useful for understanding both the durations of project activities and the performance of consultants in the document review process. However, we encountered data limitations that, if addressed, could provide a stronger foundation for monitoring and assessing performance. We offer the following recommendations based on our observations of the T-Pro, P-6, and SharePoint data:
Recommendation 5.1: T-Pro and P-6. Maintain accurate historical date information for all projects. T-Pro and P-6 are designed to give an accurate schedule for project delivery. Currently, the baseline and actual date information is not maintained properly. For example, due to missing information, we could not calculate time overruns for the environmental summary and project design of 238 and 347 projects, respectively. It would be useful to maintain project schedules properly and to have access to accurate baseline dates and actual dates of performance. It is difficult to get an accurate understanding of the history of a project given the existing data practices. For example, T-Pro does not have a log file of project information. When updates to the schedule are made due to design changes or other factors, the schedule is reset, and historical information about the project is either lost or inaccessible to managers. This makes it difficult to construct an accurate historical record of changes in project activities (responses in the pre-meeting survey and the focus group discussions suggest specific ideas for utilizing T-Pro more effectively). Certain conditions in a project, such as funding, have led to a reorganization of project identifications (PIDs). It would be useful to have a means of linking old and new PIDs so that they can be managed as one project. Currently, large numbers of projects lack consultant information.
Recommendation 5.2: SharePoint. As the use of the SharePoint site grows for the Ecology Section and expands into other OES service areas, an approach should be developed for linking the T-Pro and SharePoint data in order to produce a complete record of OES activities (a sketch of such a linkage follows below). The current SharePoint website could add more user-friendly functions for consultants and reviewers, such as a contract search function for filling out the Document Routing Slip and an automated date input system. The current practice of entering document due dates could be improved so that OES and consultants have a better means of tracking the document review process. Document deficiency types could be organized and added to the SharePoint website; deficiency types would provide a more detailed explanation than the current error types and could be utilized for normalizing document review comments and creating document review standards. Finally, the announcements available to the consulting community through SharePoint should be organized by content. This would enhance the ability to search through announcements, and search capabilities could be augmented with the creation of lists of content types.
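The following minimal sketch illustrates what linking re-issued PIDs and joining T-Pro schedule records with SharePoint review records might look like. All file, table, and column names are hypothetical stand-ins; the actual T-Pro and SharePoint schemas would need to be substituted.

```python
# Hypothetical sketch: normalize re-issued PIDs to one canonical key,
# then join T-Pro schedule data with SharePoint review data.
import pandas as pd

tpro = pd.read_csv("tpro_projects.csv")      # assumed columns: pid, dates, ...
sharepoint = pd.read_csv("sp_reviews.csv")   # assumed columns: pid, version, ...
pid_map = pd.read_csv("pid_map.csv")         # assumed columns: old_pid, new_pid

# Map each retired PID to its replacement; repeat so chains of re-issued
# PIDs (old -> new -> newer) collapse to one canonical key.
# Assumes the mapping contains no cycles.
canonical = dict(zip(pid_map["old_pid"], pid_map["new_pid"]))
for df in (tpro, sharepoint):
    while df["pid"].isin(set(canonical)).any():
        df["pid"] = df["pid"].replace(canonical)

# One combined record of OES activity per project and review round.
merged = sharepoint.merge(tpro, on="pid", how="left", suffixes=("_sp", "_tpro"))
print(merged.head())
```

Once both systems share a canonical project key, the merged record supports exactly the kind of complete activity history that Recommendation 5.2 calls for.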

OES has introduced several innovations over the last few years that have improved communications with its consulting community, including: a) the introduction of workshops to address multiple rounds of reviews, b) the use of SharePoint and FTP sites for facilitating the transfer of communications and documents, and c) the use of email blasts for announcements and updates. Most of these innovations have been well received by the consulting community.
In this research project, we have identified alternative strategies that OES might pursue to be more effective in communicating quality expectations. Collectively, these strategies fall into two broad categories. First, there are efficiency gains to be made if OES moves from a defensive position focused on document review to postures that involve earlier engagement with OES consultants, project managers, and procurement. OES does not control all of the signals, nor all of the means, for conveying quality expectations to the consulting community; for this, it requires greater collaboration and communication with other units within GDOT. Second, current patterns of management lead to multiple rounds of review. This pattern should be adjusted through a combination of template improvements and earlier communication with consultants prior to document submission. Reducing the average number of reviews per document to a number closer to two has the potential to free large amounts of available human capital for more productive purposes.

Descriptive Statistics

A.1. Performance Data Issues
The performance data maintained by GDOT provide a useful foundation for understanding the duration of projects and the duration of activities conducted throughout the project. However, there are issues that limit the scope of the analysis and the ability to draw statistical inferences. These issues fall into two broad classes of limitations: a) issues with data availability, and b) the reliability of the data recorded.
Data Availability. Data entry in T-Pro and P-6 requires inputs from a variety of GDOT offices as well as from consultants. Some fields in the system are required and have clear responsibilities set forth for data entry. For other fields, GDOT staff and consultants have some discretion over whether to enter information into the database. This discretion leads to the following types of problems in the availability of performance data:
Limitations in Information on Project Changes: In the current structure of T-Pro and P-6, the historic record of changes to the schedule is not accurately captured. Engineering design projects are reciprocal work flows, with inputs from a variety of technical specialties and offices flowing back and forth until the final design document is produced. However, T-Pro and P-6 are maintained as continuously updated systems. When changes to the schedule occur, the entire system is updated to reflect the agreed-upon performance timeline. The record of the old timeline is not maintained within the system, in an effort to eliminate confusion across project team members. While this approach has merit in terms of keeping teams on the same page with regard to performance objectives, it creates difficulties for accurately constructing the history of projects and the factors that led to delay. NEPA analysts have developed a Project Change Form to try to capture this history, but the information on these forms is not integrated back into the existing databases.
Limitations in Consultant Information: Currently, data entry for ecology consultant information in T-Pro is optional. Many projects in the database lack consultant information, such as the consulting firm name for ecology tasks and the number of consulting firms involved in those tasks. For example, out of 560 projects in our data set, 250 projects (44.6%) did not have any ecology consultant information. Since one project can have several consultants for multiple technical studies and tasks, the actual number of missing observations might be larger. It was also difficult to determine whether the distribution of projects without this information is systematically different from those with it. For this reason, we could not include certain consultant information, such as the number of ecology consultants, in the regression models. In the OLS regression analysis, we selectively included some consultant variables and conducted analyses of models with information about the consultant ecology reviewer, in-house NEPA, and in-house design. For the regression models, out of 560 observations, we lost 237, 243, 237, and 341 observations in Models 1, 2, 3, and 4, respectively.
Activity Delay and Time Overruns: We reviewed several environmental activity date entries in P-6 and noted that many projects are missing baseline date information for key environmental activities (see environmental activity duration in Table A-1). For example, of the 560 projects for which environmental summary duration was calculated, environmental summary overruns could be computed for only 319 projects (57%) that had baseline finish date information. There were also observations that demonstrate inconsistency in the data. For example, among the 319 projects for which environmental summary overruns could be calculated, some projects contain outliers, such as overruns of 3,977 days and underruns of 1,688 days. This led us to use activity duration, which is calculated from hard date information, rather than data on time overruns and/or activity delay. A sketch of this calculation follows.
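The duration measure described above can be computed directly from recorded dates. The sketch below illustrates the idea; the file and column names are hypothetical stand-ins for a P-6 export.

```python
# Hypothetical sketch: compute activity durations from hard dates and
# leave overruns missing when no baseline finish date was recorded.
import pandas as pd

acts = pd.read_csv("p6_activities.csv",
                   parse_dates=["actual_start", "actual_finish",
                                "baseline_finish"])

# Duration from hard dates: computable whenever both dates exist.
acts["duration_days"] = (acts["actual_finish"] - acts["actual_start"]).dt.days

# Overrun requires a baseline; rows missing it simply stay NaN rather than
# being imputed, mirroring the treatment described above.
acts["overrun_days"] = (acts["actual_finish"] - acts["baseline_finish"]).dt.days

print(acts[["duration_days", "overrun_days"]].describe())
```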
Data Reliability. A second set of data issues relates to the reliability of the measures recorded. These are issues that stem from the interpretation of the measures recorded in GDOT datasets. The following issues were identified as part of this set:
Change of activity codes in P-6: P-6 has experienced a major change of format. The sample selected for this study covers the past five years of activity, which means our dataset includes data from before and after the change. The previous format was not as detailed as the current format. In addition, many projects are missing activity information, and some of the available data show inconsistencies. For example, out of 505 PCE and CE projects, CE state review duration could not be calculated for 124 projects due to missing date information, and 67 projects had negative values for CE state review duration, which might be caused by data entry mistakes. In the data modification process, we treated these inconsistent cases as missing values. Even though we analyzed the projects that have consistent activity information, some activity information might still be inaccurate.
PID: There are some cases in which several PIDs were employed for one project. Out of the 811 projects collected as the original data set, we eliminated 12 projects that had the same T-Pro comment, county name, and project start date as another PID. In spite of this modification, there might be more duplicated projects in the dataset. The sketch below illustrates both cleaning rules.
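As an illustration of the two cleaning rules just described, the following sketch recodes negative review durations as missing and drops duplicate PIDs. The column names are hypothetical.

```python
# Hypothetical sketch of the data-cleaning rules described above.
import pandas as pd

df = pd.read_csv("projects.csv", parse_dates=["start_date"])

# Rule 1: negative CE state review durations are inconsistent and are
# treated as missing values rather than corrected.
df.loc[df["ce_state_review_days"] < 0, "ce_state_review_days"] = float("nan")

# Rule 2: when several PIDs share the same T-Pro comment, county name,
# and project start date, keep only one row for that project.
df = df.drop_duplicates(subset=["tpro_comment", "county", "start_date"])

print(len(df), "projects after cleaning")
```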
A.2. Environmental Activity Overview
Table A-1 shows the duration of environmental activities. On average, resource identification activities such as Ecology Resource Survey Reports took 209 days, and technical studies such as Ecology Assessment of Effects Reports took 483 days. Categorical Exclusion (CE) preparation, state review, and federal review took 193, 104, and 104 days, respectively. Draft Environmental Assessment (DEA) preparation, state review, and federal review took 707, 543, and 402 days, respectively. Final Environmental Assessment (FEA) preparation, state review, and federal review took 447, 37, and 121 days, respectively. Environmental certification took 37 days for ROW authorization and 18 days for construction authorization. On most environmental activities, local projects took more time to complete than state projects. In particular, local projects took 22, 405, and 47 days longer than state projects on state review of the CE, DEA, and FEA, respectively.

Table A-1 Environmental Activity Duration

Environmental Activity   |  Total: N, Mean (Days), S.D.  |  Local: N, Mean, S.D.   |  State: N, Mean, S.D.
Project Design           |  452, 1209.8, 1213.70         |  153, 1301.7, 1010.43   |  299, 1162.7, 1304.62
Environmental Summary    |  560, 769.3, 1034.17          |  197, 829.1, 1023.19    |  363, 736.9, 1040.04
Resource Identification  |  244, 208.7, 366.90           |  58, 192.7, 301.57      |  186, 213.6, 385.61
Technical Studies        |  543, 482.8, 727.55           |  191, 460.4, 572.79     |  352, 495.0, 799.59
CE Preparation           |  463, 193.3, 314.51           |  139, 192.0, 314.00     |  324, 193.8, 315.22
CE State Review          |  299, 103.7, 300.45           |  135, 115.9, 332.23     |  164, 93.7, 272.17
CE Federal Review        |  378, 103.5, 229.92           |  148, 105.4, 265.90     |  230, 102.3, 204.06
DEA Preparation          |  48, 706.8, 653.75            |  23, 806.0, 866.36      |  25, 615.4, 361.68
DEA State Review         |  34, 543.4, 725.74            |  19, 722.2, 878.06      |  15, 316.8, 389.63
DEA Federal Review       |  41, 402.3, 561.74            |  23, 491.0, 660.11      |  18, 289.0, 392.85
FEA Preparation          |  47, 446.6, 535.93            |  23, 138.1, 84.03       |  24, 742.3, 617.50
FEA State Review         |  31, 37.4, 50.75              |  18, 57.4, 52.21        |  13, 9.6, 33.78
FEA Federal Review       |  46, 121.0, 183.96            |  23, 129.5, 218.89      |  23, 112.6, 145.47
ROW Certification        |  195, 36.7, 163.01            |  50, 27.7, 103.33       |  145, 39.8, 179.25
LET Certification        |  378, 17.7, 42.48             |  138, 21.8, 44.61       |  240, 15.4, 41.12

Table A-2 shows the durations of environmental activities based on different environmental document types.


Table A-2 Environmental Activity Duration of the Document Types

Environmental Activity   |  PCE: N, Mean (Days), S.D.  |  CE: N, Mean, S.D.        |  EA: N, Mean, S.D.
Project Design           |  218, 546.73, 559.170       |  194, 1551.10, 1047.264   |  40, 3168.10, 1702.061
Environmental Summary    |  259, 264.75, 439.325       |  246, 893.02, 869.218     |  55, 2592.04, 1430.607
Resource Identification  |  76, 151.63, 176.119        |  125, 263.57, 452.249     |  43, 149.86, 318.124
Technical Studies        |  247, 175.24, 269.399       |  244, 634.34, 723.824     |  52, 1233.02, 1279.521
CE Development           |  243, 98.12, 185.818        |  220, 298.37, 386.499     |  -
CE State Review          |  101, 82.56, 373.641        |  198, 114.51, 255.620     |  -
CE Federal Review        |  140, 79.96, 240.483        |  238, 117.40, 222.826     |  -
DEA Development          |  -                          |  -                        |  48, 706.75, 653.747
DEA State Review         |  -                          |  -                        |  34, 543.35, 725.735
DEA Federal Review       |  -                          |  -                        |  41, 402.32, 561.738
FEA Development          |  -                          |  -                        |  47, 446.60, 535.934
FEA State Review         |  -                          |  -                        |  31, 37.35, 50.748
FEA Federal Review       |  -                          |  -                        |  46, 121.04, 183.964
ROW Certification        |  28, 10.25, 26.112          |  131, 42.28, 194.298      |  36, 36.97, 76.627
LET Certification        |  189, 13.64, 32.443         |  162, 24.06, 53.244       |  27, 8.44, 23.253

Table A-3 shows descriptive statistics for task outsourcing. In the data set, 68% of NEPA documents were contracted out. Consultants were hired for 50% of PCE, 77% of CE, and 94% of EA documents. Design work showed a similar pattern to NEPA work: 64% of design work was contracted out, and CE and EA documents showed higher rates of consultant work. Unlike NEPA and design work, consultants were hired for only 18% of ecology review work.

Table A-3 Task Outsourcing Based on Environmental Document Type

Task                       |  Total: N, %   |  Count: PCE, CE, EA  |  % within Document Type: PCE, CE, EA
Consultant NEPA work       |  309, 68.4     |  86, 173, 50         |  49.7, 76.5, 94.3
In-house NEPA work         |  143, 31.6     |  87, 53, 3           |  50.3, 23.5, 5.7
Total NEPA work            |  452, 100.0    |  173, 226, 53        |  100.0, 100.0, 100.0
Consultant ecology review  |  89, 18.2      |  45, 41, 3           |  19.8, 19.6, 5.6
In-house ecology review    |  401, 81.8     |  182, 168, 51        |  80.2, 80.4, 94.4
Total ecology review       |  490, 100.0    |  227, 209, 54        |  100.0, 100.0, 100.0
Consultant design          |  275, 64.3     |  80, 152, 43         |  56.7, 64.7, 82.7
GDOT design                |  153, 35.7     |  61, 83, 9           |  43.3, 35.3, 17.3
Total design               |  428, 100.0    |  141, 235, 52        |  100.0, 100.0, 100.0
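The "% within Document Type" columns in Table A-3 are column percentages, which can be reproduced with a standard crosstab. The sketch below is illustrative only; the file and column names are hypothetical.

```python
# Hypothetical sketch: rebuild the "% within Document Type" shares of
# Table A-3 from a project-level task table.
import pandas as pd

tasks = pd.read_csv("task_outsourcing.csv")  # assumed columns: doc_type, nepa_source

counts = pd.crosstab(tasks["nepa_source"], tasks["doc_type"])   # raw counts
shares = pd.crosstab(tasks["nepa_source"], tasks["doc_type"],
                     normalize="columns") * 100                 # column percentages

print(counts)
print(shares.round(1))
```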


Table A-4 provides the results of t-tests between groups based on outsourcing information. The differences in environmental summary duration and project design duration between the in-house NEPA group and the consultant NEPA group are statistically significant: projects in the in-house NEPA group take 336 and 486 fewer days in environmental summary and project design, respectively. The differences in environmental summary duration and project design duration between the in-house ecology review group and the consultant ecology review group are also statistically significant: projects in the in-house ecology review group take 546 and 442 more days in environmental summary and project design, respectively. The differences in environmental summary duration and project design duration between the GDOT design group and the consultant design group are not statistically significant.

Table A-4 Outsourcing and Durations of Environmental Summary and Project Design

Comparison                                               |  Yes: N, Mean, S.D.     |  No: N, Mean, S.D.      |  t (df)       |  p     |  S.E.D.  |  95% CI
In-house NEPA, Environmental Summary Duration            |  143, 639.6*, 893.12    |  309, 976.2*, 1148.00   |  -3.39 (348)  |  .001  |  99.21   |  [-531.77, -141.51]
In-house NEPA, Project Design Duration                   |  109, 1047.2*, 1122.48  |  240, 1533.3*, 1305.94  |  -3.36 (347)  |  .001  |  144.58  |  [-770.38, -201.66]
In-house ecology review, Environmental Summary Duration  |  401, 946.6*, 1159.12   |  89, 401.3*, 363.00     |  7.85 (441)   |  .000  |  69.51   |  [408.70, 681.91]
In-house ecology review, Project Design Duration         |  322, 1370.8*, 1359.37  |  74, 928.6*, 661.13     |  4.10 (234)   |  .000  |  107.91  |  [229.55, 654.77]
GDOT design, Environmental Summary Duration              |  153, 976.8, 1129.90    |  275, 948.4, 1044.72    |  0.26 (426)   |  .794  |  108.51  |  [-184.91, 241.66]
GDOT design, Project Design Duration                     |  128, 1424.9, 1385.13   |  212, 1535.3, 1169.46   |  -0.79 (338)  |  .433  |  140.46  |  [-386.64, 165.94]
* p < 0.05
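Comparisons of this kind can be reproduced with a standard two-sample t-test. The sketch below is illustrative only; the column names are hypothetical, and the varying degrees of freedom in Tables A-4 and A-5 suggest that unequal-variance (Welch) tests were used for at least some rows.

```python
# Hypothetical sketch: compare environmental summary durations between
# in-house and consultant NEPA groups with a two-sample t-test.
import pandas as pd
from scipy import stats

df = pd.read_csv("projects.csv")  # assumed columns: env_summary_days, inhouse_nepa

yes = df.loc[df["inhouse_nepa"] == 1, "env_summary_days"].dropna()
no = df.loc[df["inhouse_nepa"] == 0, "env_summary_days"].dropna()

# Welch's variant drops the assumption of equal variances across groups.
t, p = stats.ttest_ind(yes, no, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}, "
      f"mean difference = {yes.mean() - no.mean():.1f} days")
```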

Table A-5 provides the results of t-tests between groups based on whether the environmental summary was interrupted by regulatory changes. Most of the differences in environmental summary duration and project design duration between the regulatory intervention groups ("Yes") and the no-intervention groups ("No") are statistically significant. The difference in environmental summary duration between the bat "Yes" group (N=143, M=1148.1) and the bat "No" group (N=379, M=637.0), and the difference in project design duration between the bat "Yes" group (N=115, M=1414.6) and the bat "No" group (N=315, M=1171.5), are statistically significant. The difference in environmental summary duration between the sturgeon "Yes" group (N=145, M=1052.2) and the sturgeon "No" group (N=377, M=671.2) is also statistically significant; the corresponding difference in project design duration is not.

Table A-5 Regulatory Interventions and Durations of Environmental Summary and Project Design

Comparison ("Yes" = intervened by regulation change)   |  Yes: N, Mean, S.D.     |  No: N, Mean, S.D.      |  t (df)      |  p     |  S.E.D.  |  95% CI
Bat Regulation, Environmental Summary Duration         |  143, 1148.1*, 1166.16  |  379, 637.0*, 975.14    |  4.66 (221)  |  .000  |  109.63  |  [295.05, 727.16]
Bat Regulation, Project Design Duration                |  115, 1414.6*, 965.29   |  315, 1171.5*, 1311.22  |  2.09 (274)  |  .038  |  116.45  |  [13.83, 472.33]
Sturgeon Regulation, Environmental Summary Duration    |  145, 1052.2*, 1079.30  |  377, 671.2*, 1027.19   |  3.74 (520)  |  .000  |  101.81  |  [181.04, 581.07]
Sturgeon Regulation, Project Design Duration           |  116, 1310.9, 751.16    |  314, 1209.1, 1367.46   |  0.98 (367)  |  .328  |  104.02  |  [-102.8, 306.33]
* p < 0.05

A.3. Ecology Document Review
Table A-6 shows how many times ecology documents were returned, based on document type, level of complexity, and deficiency type. Overall, 36.1% of ecology documents were transmitted after the first or second round of review, 40.9% required a third round, and 23.0% were transmitted in their fourth or higher version. Higher levels of complexity and substantial errors led to more rounds of review (see the percentages of 4th or higher versions by level of complexity in Table A-6). Assessment reports such as the Ecology Resource Survey-Assessment of Effects Report (ERS-AOER), Biology Assessment (BA), and Addendum have comparatively more rounds of review (the percentages of documents transmitted in the 4th or higher version are 27.6%, 25.0%, and 27.3%, respectively). Permit-related documents such as the Individual Permit Application (IPA) and Pre-Construction Notification (PCN) also lead to more rounds of review (the percentages of documents transmitted in the 4th or higher version are 80.0% and 29.2%, respectively).

Table A-6 Returns of Ecology Documents

Classification               |  N, %        |  Transmitted Version (count): 1st or 2nd, 3rd, 4th or higher  |  % within Classification: 1st or 2nd, 3rd, 4th or higher
Total                        |  274, 100.0  |  99, 112, 63   |  36.1, 40.9, 23.0
Level of Complexity: 1       |  71, 25.9    |  39, 27, 5     |  54.9, 38.0, 7.0
Level of Complexity: 2       |  66, 24.1    |  21, 32, 13    |  31.8, 48.5, 19.7
Level of Complexity: 3       |  97, 35.4    |  27, 39, 31    |  27.8, 40.2, 32.0
Level of Complexity: 4       |  40, 14.6    |  12, 14, 14    |  30.0, 35.0, 35.0
Error Type: Non-Substantial  |  101, 36.9   |  55, 37, 9     |  54.5, 36.6, 8.9
Error Type: Substantial      |  157, 57.3   |  32, 73, 52    |  20.4, 46.5, 33.1
Document Type: ADDM          |  55, 20.1    |  15, 25, 15    |  27.3, 45.5, 27.3
Document Type: ASR           |  15, 5.5     |  9, 5, 1       |  60.0, 33.3, 6.7
Document Type: BA            |  4, 1.5      |  1, 2, 1       |  25.0, 50.0, 25.0
Document Type: BV Exemption  |  1, 0.4      |  0, 1, 0       |  -, 100.0, -
Document Type: BV Mod        |  1, 0.4      |  1, 0, 0       |  100.0, -, -
Document Type: BVA           |  23, 8.4     |  9, 9, 5       |  39.1, 39.1, 21.7
Document Type: EAOER         |  5, 1.8      |  2, 2, 1       |  40.0, 40.0, 20.0
Document Type: ERS-AOER      |  58, 21.2    |  15, 27, 16    |  25.9, 46.6, 27.6
Document Type: ERSR          |  17, 6.2     |  4, 9, 4       |  23.5, 52.9, 23.5
Document Type: IPA           |  5, 1.8      |  1, 0, 4       |  20.0, -, 80.0
Document Type: Memo          |  26, 9.5     |  14, 9, 3      |  53.8, 34.6, 11.5
Document Type: PAR           |  9, 3.3      |  2, 6, 1       |  22.2, 66.7, 11.1
Document Type: PCN           |  24, 8.8     |  9, 8, 7       |  37.5, 33.3, 29.2
Document Type: Permit Mod    |  2, 0.7      |  2, 0, 0       |  100.0, -, -
Document Type: PSSR          |  29, 10.6    |  15, 9, 5      |  51.7, 31.0, 17.2

Table A-7 shows the duration of ecology document review. On average, it takes 86 days from document assignment to document transmittal to federal agencies. The first and second rounds of review take 26 and 15 days, respectively, and it takes 22 days from the third version's submission to document transmittal. The first and second rounds of review thus meet the OES timelines, which allow 30 days for the first round and two weeks for the second round.


Table A-7 Ecology Document Review Round Durations

Review Stage                         |  N    |  Min.  |  Max.  |  Mean (Days)  |  S.D.
Total Review Duration                |  267  |  0     |  457   |  85.66        |  70.765
Total 1st Version Review Duration    |  255  |  0     |  167   |  26.52        |  24.307
1st Version: Reviewer to Manager     |  222  |  0     |  126   |  14.26        |  14.436
1st Version: Manager to Consultant   |  215  |  0     |  153   |  13.01        |  21.939
Consultant Work for Version 2        |  257  |  0     |  232   |  21.45        |  33.615
Total 2nd Version Review             |  180  |  0     |  172   |  14.61        |  18.428
2nd Version: Reviewer to Manager     |  210  |  0     |  40    |  7.46         |  8.374
2nd Version: Manager to Consultant   |  157  |  0     |  152   |  7.11         |  13.681
Consultant Work for Version 3        |  168  |  0     |  284   |  11.34        |  23.998
Total 3rd and Higher Version Review  |  172  |  0     |  230   |  22.26        |  32.379

Table A-8 shows the document review duration by ecology consulting firm. The number of documents per firm ranged from 1 to 71, and the mean document review duration ranged widely, from 38 to 327 days. This wide range suggests that differences in consulting firm capability might produce different outcomes in environmental services.


Table A-8 Document Review Duration of Ecology Consulting Firms

Firm Name  |  N    |  M (Days)  |  S.D.
Total      |  267  |  85.66     |  70.765
Firm A     |  1    |  50.00     |  -
Firm B     |  5    |  38.00     |  33.369
Firm C     |  25   |  102.64    |  77.871
Firm D     |  1    |  327.00    |  -
Firm E     |  1    |  127.00    |  -
Firm F     |  1    |  105.00    |  -
Firm G     |  2    |  49.50     |  54.447
Firm H     |  2    |  99.00     |  128.693
Firm I     |  41   |  103.34    |  78.383
Firm J     |  71   |  64.39     |  56.475
Firm K     |  1    |  158.00    |  -
Firm L     |  6    |  84.33     |  18.381
Firm M     |  16   |  113.75    |  51.613
Firm N     |  23   |  64.04     |  57.149
Firm O     |  6    |  88.17     |  35.617
Firm P     |  3    |  71.33     |  41.956
Firm Q     |  7    |  124.29    |  38.313
Firm R     |  10   |  69.20     |  47.490
Firm S     |  3    |  111.33    |  119.739
Firm T     |  8    |  78.50     |  62.290
Firm U     |  8    |  149.75    |  153.447
Firm V     |  6    |  50.50     |  37.930
Firm W     |  1    |  177.00    |  -
Firm X     |  1    |  42.00     |  -
Firm Y     |  11   |  88.09     |  43.055

Table A-9 shows how many times ecology documents were returned based on consulting firms.


Table A-9 Returns of Ecology Documents of Consulting Firms

Classification  |  N, %        |  Transmitted Version (count): 1st or 2nd, 3rd, 4th or higher  |  % within Classification: 1st or 2nd, 3rd, 4th or higher
Total           |  274, 100    |  99, 112, 63  |  36.1, 40.9, 23.0
Firm A          |  1, 0        |  0, 1, 0      |  -, 100.0, -
Firm B          |  5, 2        |  1, 2, 2      |  20.0, 40.0, 40.0
Firm C          |  25, 9       |  11, 12, 2    |  44.0, 48.0, 8.0
Firm D          |  1, 0        |  1, 0, 0      |  100.0, -, -
Firm E          |  1, 0        |  0, 1, 0      |  -, 100.0, -
Firm F          |  1, 0        |  1, 0, 0      |  100.0, -, -
Firm G          |  2, 1        |  1, 0, 1      |  50.0, -, 50.0
Firm H          |  3, 1        |  2, 0, 1      |  66.7, -, 33.3
Firm I          |  42, 15      |  12, 14, 16   |  28.6, 33.3, 38.1
Firm J          |  73, 27      |  37, 26, 10   |  50.7, 35.6, 13.7
Firm K          |  1, 0        |  0, 0, 1      |  -, -, 100.0
Firm L          |  6, 2        |  1, 1, 4      |  16.7, 16.7, 66.7
Firm M          |  16, 6       |  1, 10, 5     |  6.3, 62.5, 31.3
Firm N          |  23, 8       |  10, 10, 3    |  43.5, 43.5, 13.0
Firm O          |  6, 2        |  0, 4, 2      |  -, 66.7, 33.3
Firm P          |  3, 1        |  2, 1, 0      |  66.7, 33.3, -
Firm Q          |  7, 3        |  0, 2, 5      |  -, 28.6, 71.4
Firm R          |  10, 4       |  3, 6, 1      |  30.0, 60.0, 10.0
Firm S          |  4, 1        |  1, 2, 1      |  25.0, 50.0, 25.0
Firm T          |  8, 3        |  3, 2, 3      |  37.5, 25.0, 37.5
Firm U          |  8, 3        |  2, 5, 1      |  25.0, 62.5, 12.5
Firm V          |  6, 2        |  0, 5, 1      |  -, 83.3, 16.7
Firm W          |  1, 0        |  0, 1, 0      |  -, 100.0, -
Firm X          |  1, 0        |  0, 1, 0      |  -, 100.0, -
Firm Y          |  13, 5       |  7, 3, 3      |  53.8, 23.1, 23.1

Table A-10 shows the document review duration by improvement type of the engineering design projects. The results differ from the environmental summary duration and project design duration shown in Table 2-1. Unlike the wide range, from 350 to 4,107 days, in the environmental summary duration in Table 2-1, the document review duration ranged only from 65 to 125 days across improvement types (except traffic management projects), a small portion of the environmental summary duration and a comparatively narrow range.


Table A-10 Document Review Durations of Project Improvement Types

Improvement Type                              |  M (Days)  |  N     |  S.D.
Total                                         |  86.13     |  312*  |  69.263
Bridge Rehabilitation with No Added Capacity  |  92.33     |  3     |  41.102
Bridge Replacement with Added Capacity        |  117.00    |  1     |  -
Bridge Replacement with No Added Capacity     |  112.53    |  60    |  80.047
Construction of New Bridges                   |  125.00    |  4     |  45.453
Construction of New Roads                     |  92.64     |  11    |  63.175
Major Widening                                |  89.07     |  29    |  79.521
Minor Widening                                |  65.36     |  11    |  61.899
Other Enhancements                            |  70.52     |  29    |  44.359
Reconstruction with Added Capacity            |  103.20    |  5     |  72.220
Reconstruction with No Added Capacity         |  97.00     |  2     |  86.267
Relocation with No Added Capacity             |  86.00     |  1     |  -
Restoration, Rehabilitation, & Resurfacing    |  100.50    |  4     |  165.297
Safety Improvements                           |  78.94     |  36    |  51.380
Traffic Management/Traffic Engineering        |  48.58     |  19    |  45.264
* Documents that have multiple PIDs were duplicated to match with project information.

Table A-11 shows how many times ecology documents were returned based on project improvement types.


Table A-11 Returns of Ecology Documents of Improvement Types

Classification                       |  N, %       |  Transmitted Version (count): 1st or 2nd, 3rd, 4th or higher  |  % within Classification: 1st or 2nd, 3rd, 4th or higher
Total                                |  321, 100.0 |  107, 138, 76  |  33.3, 43.0, 23.7
Bridge Rehabilitation                |  3, 0.9     |  0, 0, 3       |  -, -, 100.0
Bridge Replacement (Added Capacity)  |  1, 0.3     |  0, 1, 0       |  -, 100.0, -
Bridge Replacement                   |  62, 19.3   |  23, 24, 15    |  37.1, 38.7, 24.2
Construction of New Bridges          |  4, 1.2     |  0, 3, 1       |  -, 75.0, 25.0
Construction of New Roads            |  11, 3.4    |  4, 6, 1       |  36.4, 54.5, 9.1
Major Widening                       |  35, 10.9   |  16, 13, 6     |  45.7, 37.1, 17.1
Minor Widening                       |  11, 3.4    |  3, 7, 1       |  27.3, 63.6, 9.1
Other Enhancements                   |  29, 9.0    |  6, 17, 6      |  20.7, 58.6, 20.7
Reconstruction (Added Capacity)      |  5, 1.6     |  1, 3, 1       |  20.0, 60.0, 20.0
Reconstruction                       |  2, 0.6     |  1, 0, 1       |  50.0, -, 50.0
Relocation                           |  1, 0.3     |  0, 1, 0       |  -, 100.0, -
Restoration & Resurfacing            |  4, 1.2     |  2, 2, 0       |  50.0, 50.0, -
Safety Improvements                  |  36, 11.2   |  10, 15, 11    |  27.8, 41.7, 30.6
Traffic Management & Engineering     |  19, 5.9    |  11, 8, 0      |  57.9, 42.1, -


Case Study Summary

B.1. High-quality Document Case 1
High-quality document case 1 was a road-widening project for a local government. The project design was contracted out to a consulting firm acting as the prime consultant. The prime consultant chose two separate sub-consulting firms to do the ecology and NEPA documentation.
Interviewee Characteristics
Interviews were conducted with the two firms performing the ecology and NEPA documentation. The ecology studies were conducted by a firm experienced with GDOT environmental contracts and handled by a consultant who had ten years of experience working as an ecology specialist, six of which came from working directly with GDOT. The NEPA documentation was conducted by a small firm that focuses primarily on NEPA and community engagement processes for transportation projects. The consultant assigned to this project had 27 years' experience in the field, 25 of which came from working directly with transportation projects. Further, interviews were conducted with the ecologist and NEPA analyst at GDOT. The GDOT ecologist had been working at GDOT for the past five years, while the NEPA analyst had been with GDOT for a little less than two years.
Project Characteristics
Interviewees reported that the project was not highly complex environmentally and that its scope remained the same throughout its duration. The project initially required a NEPA Environmental Assessment (EA), but was later reduced to a Categorical Exclusion (CE). The consultant discovered a rule that allowed the project to be completed under the less intensive CE status and pursued this course with GDOT and the project manager to shorten the project schedule. Thus, the project process was facilitated by the consultants' active stance in looking for ways to shorten the project schedule.
There were no impactful design, procedural, or regulatory changes during the life of this project. GDOT did suffer turnover during this time, but the consultants were accustomed to working around this problem, and it did not have any significant impact. However, the NEPA consultant noted that, generally, high turnover would represent a challenge for effective communication.
Project Communication
Overall, this case shows active communication (a) between consultants internally, (b) between reviewers internally, (c) between consultants and reviewers, and (d) between the GDOT project manager and the consultants. The consultants drove communication with all project team members and kept an open line of communication with OES reviewers, making it easier for both sides to address comments quickly.
First, consultants took part in project-related conference calls, which were organized by the prime consultant. These calls facilitated sharing information about the project and discussing what needed to be done next.
Second, interviews suggest that there was some interaction between reviewers. The GDOT ecologist reported a transition meeting with the previous reviewer, who explained the background and status of the project.
Third, interviews showed early correspondence between the NEPA analyst and the NEPA consultant before the submission of the document. The NEPA analyst was involved in the organization of a public meeting that required frequent communication with the NEPA consultant. There was, however, no communication between the ecology consultant and the GDOT ecologist before the document submission. After the documents were submitted, both consultants were proactive in talking to the reviewers about the comments they received, speaking two to three times a week, and were able to resolve the necessary revisions quickly. Reviewers noted that the quality of the first draft was very good and that they only provided minor comments. Similarly, the ecology consultant noted that the GDOT ecologist focused on getting the document approved as quickly as possible and provided a few comments on the technical accuracy and legal sufficiency of the document, rather than lengthening the process by focusing on unimportant minutiae. Interviewees reported that the documents only went through one or two rounds of review. Overall, the consultants reported no communication challenges with OES during the lifetime of this project.
Fourth, consultants stated that there was frequent communication with the GDOT project manager as well as with the project manager of the local government, who was responsible on the design end. The NEPA consultant remarked on the active role of the local government's project manager: "He stayed on top of the project and the designers at [prime consultant], just constantly making sure that things went through." Further, the NEPA consultant reported early interactions with the GDOT project manager at concept meetings and Preliminary Field Plan Review/Final Field Plan Review (PFPR/FFPR) meetings.
The majority of the overall communication occurred via email, but the consultants relied primarily on phone calls later in the project's lifespan when clarifying and resolving review comments. Though email served as a functional medium, telephone calls provided a direct form of communication, which was more open and suited to back-and-forth conversations about how to address reviewer comments. Face-to-face meetings with GDOT were viewed as rare because they were difficult to organize due to conflicting schedules and the security protocol at GDOT. Both the ecology and NEPA consultants were familiar with the Environmental Procedures Manual (EPM), but only used it occasionally because it is largely outdated. The ecology consultant also reported using SharePoint regularly, but the NEPA consultant did not, since he could not log in to the site and was having trouble gaining access. He had been attempting to get access for a long period of time, but had not had any success getting set up by the IT department.
B.2. High-quality Document Case 2
High-quality document case 2 was a bridge replacement project with no added capacity at the state level. GDOT managed the project design, but contracted the ecology work and NEPA documentation out to a large consulting firm accustomed to GDOT work.
Interviewee Characteristics
The primary consulting firm handled both the NEPA document and the special studies and assigned them to two of its experienced consultants, who had ten and three years of experience in the field as ecologists and were highly regarded by GDOT staff. The consultants reported that GDOT had been their major client for the past ten years. Further, interviews were conducted with both the GDOT ecologist and the NEPA analyst. The NEPA analyst had 20 years of experience doing NEPA work in both the private and the public sector and joined GDOT two years ago. The GDOT ecologist joined GDOT two years ago after the completion of her graduate studies.
Project Characteristics
This project required NEPA documentation at the CE level. The consultant reported that the project was environmentally very complicated: "Usually we have bridge placements, fairly straightforward, but we had all of the disciplines involved, and for ecology, we had a lot of different waters for less than a mile."
There were no impactful procedural or regulatory changes during the life of this project. The project dealt with reviewer and project management turnover, but the consultants did not report any significant impacts associated with the changes. The primary challenge in completing this project was a design change that occurred late in the process. After a large portion of the work on the NEPA documentation had already been done, the project designs were changed, requiring environmental reporting on a larger area than originally stated. This site expansion required additional surveys and reporting that were not previously necessary.
Project Communication
Overall, this case shows active communication (a) between reviewers internally, (b) between the design project managers and the consultants, and (c) between consultants and reviewers.
First, both the GDOT ecologist and the NEPA analyst inherited the project and actively engaged in knowledge transfer. The GDOT ecologist requested a project update from the GDOT project manager, who established a connection with the consultants and the NEPA analyst. The NEPA analyst reported that, shortly after his assignment to the project, he was part of a conference call between the consultants and the GDOT staff involved in the project, including an ecologist. During this conference call, the ecologist learned more about the project history and the project status. Later in the project, the NEPA analyst was re-assigned and conducted a transition meeting to hand off the project to the next NEPA analyst.
Second, the consultants engaged in active communication with the design team throughout the life of the project. The consultants organized monthly meetings (in-person meetings or conference calls) for the project team to go over their progress and mitigate any potential problems. One consultant highlighted that one of the major communication challenges was the turnover of the project manager: "Because one guy or woman leaves and all that information needs to get passed on to the next one, to the next one, to the next one. So you had to keep them [updated] -- continually updating them."
Third, although there was no specific NEPA or ecology meeting between consultants and reviewers, both the NEPA analyst and the GDOT ecologist occasionally participated in the project meetings with the consultants and design team members. There was also some correspondence between consultants and reviewers before the submission of the documents. The GDOT ecologist noted that the consultants "were proactive about inviting me on the surveys" and "took care of things [...] before I even knew they were an issue". After the documents were submitted, the ecology consultant was proactive in talking to the GDOT ecologist about the comments in order "to work things out instead of just replying to [the] comment." The ecology consultant also sent a memo to the ecologist summarizing what they had discussed in their phone call. The GDOT ecologist noted that the quality of the document was very good and only required minor revisions.

After the final document was submitted to FHWA, OES handled the communication with the federal agency and disseminated information about the review.
For communication channels, the consultants regularly used email and telephone to communicate with GDOT staff. Further, they initiated face-to-face meetings and conference calls with GDOT staff to discuss the project status and project changes. The consultants reported having used both the EPM and the OES SharePoint site. The SharePoint site was perceived as a useful tool for learning information from GDOT. However, the consultants found the EPM not particularly useful because it is out of date.
B.3. High-quality Document Case 3
High-quality document case 3 was a bridge replacement project with no added capacity at the state level. GDOT managed the project design internally. The ecology and NEPA documentation were contracted out to the same consulting firm as in high-quality document case 2.
Interviewee Characteristics
Since the ecology and NEPA documentation were carried out by the same consulting firm as in high-quality document case 2, interviews were conducted with the same individuals. Further, interviews were conducted with both the GDOT ecologist and the NEPA analyst. Both interviewees at GDOT had 10 years of experience working as reviewers at GDOT.
Project Characteristics
This project required NEPA documentation at the CE level, and interviewees indicated that it was not complex. While there were no impactful procedural or regulatory changes, the consultants did point out a number of challenges that they faced during the lifetime of this project, including scoping issues, a project expansion, a schedule change, and OES turnover.
First, the environmental work was complicated by a miscommunication with GDOT early in the process. The consultants indicated that the task order was written in such a way that it would be only one bridge replacement project. Instead, after they had done their project scoping, they found out that there were actually two bridges. This was disruptive for the consultants' work early on, but they were able to adjust and get the project back on track.
Second, the project area was expanded by a significant amount while the consultants were working on the assessment of effects report. The consultants were able to adapt to these changes and worked closely with the design team.
Third, in an effort to fit the project into the expiring fiscal year, the LET date was accelerated, and the consultants had a difficult time adapting to this unanticipated schedule change. They managed to meet the schedule as planned, but the change put significant stress on their process.
Finally, this project suffered from GDOT turnover, and the consultants reported significant impacts on project communication (see below).
Project Communication
Overall, this case shows active communication (a) between the design team and the consultants, and (b) between consultants and reviewers. Throughout the project, the consultants were driving the communication with OES and the design team.
First, the consultants engaged in a kick-off meeting with the project team for scoping purposes. Further, they were proactive in communicating with the design team throughout the project. The consultants conducted a site visit early in the project, which proved to be "helpful in terms of [...] communicating on everything that needed to be included in his [the designer's] plans." The expansion of the project area created another opportunity to work with design staff.
Second, there was some correspondence between the ecology consultant and the GDOT ecologist before the submission of the documents. The reviewer contacted the ecology consultant and inquired about the project status. The consultant reported that, compared to other projects, the amount of communication with OES was limited because the project was less complicated. The majority of communication between the GDOT ecologist and the ecology consultant occurred after the document submission. The ecology consultant experienced some challenges during the review due to the inexperience of the reviewer: "He wasn't as well versed with the templates and the EPM and everything, so [...] that made that review process a bit lengthier." The consultant received a large number of comments and requests for revisions, which he deemed unnecessary. The ecology consultant kept an open line of communication during the environmental summary process and called the reviewer "instead of just submitting" the document.
Interviewees suggested that there was no communication (such as transition meetings) between the reviewers after the project was re-assigned. The ecology consultant indicated that the ecologist "brought up an issue that I had already discussed with a previous ecologist but he didn't know that obviously."
The consultants used multiple communication channels to communicate with OES and the design team, including in-person meetings, email, and phone. The NEPA analyst, on the other hand, preferred email in order to document the communication with consultants. The EPM was not perceived as a useful tool by the consultants because it does not encompass all the changes that have occurred over the past few years. The consultants did, however, use the SharePoint site during the project.
B.4. Low-quality Document Case 1
Low-quality document case 1 was a bridge replacement project with no added capacity at the state level. GDOT managed the project design, but contracted the ecology and NEPA documentation out to a consulting firm, which acted as the prime consultant. The prime consultant subcontracted a second firm to complete the NEPA documentation.
Interviewee Characteristics
The ecology work was conducted by a firm experienced with both ecology and NEPA document preparation. The consultant working on this project had fifteen years of experience in his role as an ecologist, ten of which came from working directly with GDOT. The NEPA documentation was conducted by the small firm that also performed the NEPA work in high-quality document case 1. The consultant assigned to this project had 27 years' experience in the field, 25 of which came from working directly with transportation projects. Further, an interview with the GDOT ecologist was conducted. In this case, OES had contracted out the review to a consulting firm, and the reviewer had served as an in-house GDOT ecologist under the contract. The consultant reviewer had mostly worked for consulting firms specializing in ecology, wetlands, and associated permitting, but also had eighteen months of experience working as a NEPA analyst and ecologist for GDOT. An interview with the NEPA analyst could not be conducted.

Project Characteristics
This project required NEPA documentation at the CE level, and interviewees characterized it as straightforward, simple, and not very complicated.
Although the project suffered reviewer turnover, the consultants did not report any significant impacts associated with the changes. The ecology consultant did, however, complain about a lengthy review process caused by unnecessary comments by the GDOT ecologist (see below).
While there were no procedural or regulatory changes during the lifetime of this project, a change to the design of the bridge significantly affected the project schedule. The design team added right-of-way in a protected species area and informed the consultants of the design change at the FFPR meeting. Since this change was made after the submission of the assessment of effects report, the consultants had to prepare an addendum, disrupting the project schedule. In addition, the NEPA consultant had to wait for the ecology consultant to finish his work before the NEPA reevaluation could be completed. The NEPA consultant estimated that the reevaluation process extended the project by another eighteen months.
Project Communication
Overall, interviews suggest that there was little communication (a) between the consultants and the design team and (b) between the consultants and the GDOT ecologist.
First, the consultants and the GDOT design team communicated at the PFPR and FFPR meetings; however, there appears to have been no communication beyond that. The ecology consultant noted that there is a "big disconnect" between consultants and designers in general, characterized by the designers' insufficient understanding of ecology requirements. The ecology consultant commented: "I don't think they [the design team] quite understood when they made this simple change, that there's a whole ramification that environmental has to come behind and look at."
Second, neither the consultants nor the GDOT ecologist actively initiated communications prior to the review process. Rather, communication was limited to email exchanges during the review process. The ecology consultant responded to the reviewers' comments electronically and revised the document accordingly, but did not call to discuss and resolve comments. Interestingly, the interview with the GDOT ecologist indicates an expectation that consultants manage communications: "I was always available myself over the phone and made myself available if they wanted to discuss things." Both parties were unsatisfied with the overall review process. The ecology consultant complained about lengthy reviews and "picky" comments: "It would be things that would be almost personal preference." The GDOT ecologist, in turn, pointed out the poor quality of the report, which made a workshop necessary.
In terms of communication channels, all parties relied predominantly on email. Further, the ecology consultant referenced both the EPM and the OES SharePoint site.
B.5. Low-quality Document Case 2
Low-quality document case 2 was a safety improvement project at the state level involving a turn lane addition. GDOT contracted a large consulting firm to handle the project design, ecology studies, and the NEPA documentation.
Interviewee Characteristics
The prime consulting firm assigned the project to one of its experienced consultants, who had been working as an environmental consultant for thirteen years. Further, interviews were conducted with the GDOT ecologist and the NEPA analyst. The GDOT ecologist had joined GDOT two years earlier, after the completion of her graduate studies. The NEPA analyst had over fifteen years of experience in both ecology and NEPA work.
Project Characteristics
This project required NEPA documentation at the CE level and interviewees characterized this project as being simple, small, and easy. While there were no procedural or regulatory changes that impacted the project, interviewees did point out a number of challenges that they faced during the lifetime of this project.
First, the consultants were confronted with a design change after the PFPR plans had been sent out for review. The State Historic Preservation Office had disagreed with the findings of the historic properties study and the PFPR plans, resulting in the need to rework the design. According to the ecology consultant, this incident affected the timeline significantly.
Second, the consultants reported that their biggest challenge was the lengthy review time and that the reviewers did not "stick to preset OES timelines". The consultants assumed that this was caused by reviewer and project management turnover and the reviewers' high volumes of work.
Third, the project suffered disruptions in its final phase due to a miscommunication with the Natural Resources Conservation Service (NRCS). After the CE document was submitted to the Federal Highway Administration (FHWA), the FHWA reviewer requested farmland coordination with the NRCS. During that coordination, the NRCS reviewer confused this project with another GDOT project that was also being reviewed for farmland impacts and requested changes that did not make sense to the GDOT ecologist and the ecology consultant. It took three months to resolve the confusion and receive approval from FHWA.
Project Communication
Interviews suggest that there was little communication between the consultants and the GDOT ecologist overall. This project was also marked by passive rather than active communication on the part of the consultants.
First, there was no communication between the reviewers and the consultants before the document submission, only during the document review. The NEPA consultant noted that he organized a public involvement meeting and extended the invitation to the NEPA analyst who did not attend. However, we did see some communication between the consultants and the GDOT project manager before document submission. For example, the ecology consultant reported that he engaged with the GDOT project manager at coordination meetings.
Second, the communication during the review was very limited as well. For example, the ecology consultant experienced communication difficulties with the GDOT ecologist during that time and complained about the unresponsiveness of the reviewer: "I was sending out emails and leaving voice mails and not getting response for a week or two." Similarly, the GDOT ecologist reported not having much interaction with the consultants beyond exchanging comments and bringing the consultants in for a workshop. The GDOT ecologist called for the workshop out of dissatisfaction with the ecology consultant's document, saying that it lacked "a more robust defense of why they thought it was a habitat".
Third, there was somewhat more communication between the GDOT ecologist and the ecology consultant after the document was submitted to the federal agency. While communicating with NRCS and FHWA (see above), the GDOT ecologist also reached out multiple times to the ecology consultant by phone and email to get information that NRCS had requested.
Both consultants were responsive to GDOT reviewers during the review processes, employing both phone and email to discuss the review comments and requests with them. Further, the ecology consultant referenced the EPM and used information on the SharePoint site to complete his work, though he found the SharePoint site difficult to access at times. In addition, the ecology consultant perceived the templates provided by GDOT as challenging, since some were out of date yet were still being offered as forms to follow.
B.6. Low-quality Document Case 3
Low-quality document case 3 was a new bridge construction project for a local government. The project design was contracted out in the form of a menu of services to a private firm that acted as the prime consultant. The prime consultant subcontracted a second firm to conduct the NEPA documentation and ecology studies.
Interviewee Characteristics
Interviews were conducted with the two ecology consultants from the subcontracted consulting firm. The first interviewee had a degree in environmental engineering and had been working in the field of ecology for twelve years, mainly providing ecological services, including air and noise studies. The second interviewee had a degree in environmental science and limited professional experience as an ecology consultant, stating that this project was the first one she had worked on from beginning to end. The consultants reported that GDOT had been their major client for a number of years. Further, interviews were conducted with the GDOT ecologist and the NEPA analyst. The GDOT ecologist had a background in consulting before he joined GDOT. He reported that he did not have any experience with either ecology or NEPA documentation when he started at GDOT, but learned his trade on the job. The NEPA analyst had been working in this position for three years and picked up most of his skills on the job.
Project Characteristics
This project required an Environmental Assessment under NEPA. While there were no procedural or regulatory changes during the lifetime of this project, interviewees did point out a few challenges that they experienced.
First, consultants identified the review process as their major challenge in this project. After the submission of the first draft, the GDOT reviewer sent back the document without giving it a full review. Consultants reported that the OES reviewer was unsatisfied with the first draft and requested that the consultants provide a better second draft before he would review the document in depth (see Project Communication below).
Second, the ecology consultants had to conduct a large amount of fieldwork, which wound up taking multiple rounds of surveys over a four-year window to complete. After early surveys discovered unexpected environmental issues and the potential for species on site, a whole new set of studies was required. One of the ecology consultants noted, "Normally we wouldn't go out that many times."
Third, the NEPA analyst reported that a re-evaluation of the Environmental Assessment was required, but he did not recall what had triggered the re-evaluation.
Project Communication
Interviews suggest that there was little communication between consultants and reviewers overall. This project was also marked by passive rather than active communication on the part of the consultants.
First, consultants reported that there was some email correspondence with the NEPA analyst and the GDOT project manager before the document submission. The GDOT project manager forwarded information about the FFPR meetings.
Second, nearly all the communication that took place during the project occurred during the review period. One of the ecology consultants described the communication pattern as follows: "it just came down to submitting reports and getting comments." The first version of the document received 55 comments, and the OES reviewer sent an email saying that this report had an unusual number of comments and that he would, therefore, not finish the review. The consultants, however, noted that many of the comments were repetitive and/or non-substantive, concerning matters such as abbreviations and capitalization.
The majority of the communication occurred electronically via email and the FTP site. For example, the ecology consultants responded to the reviewer's comments only electronically and did not call him to address his comments. None of the interviewees reported having any in-person meetings during this project. The GDOT ecologist noted that team meetings with consultants may be necessary for more complicated projects, but not for this one. Both consultants were familiar with the EPM, but did not perceive it as a useful tool because it is largely outdated. Further, both consultants reported using SharePoint regularly and found the site the best way to share information.
Interview Protocol
C.1. Consultant Interview Protocol
Purpose: This study aims to develop effective strategies for communicating performance expectations between the Office of Environmental Services (OES) and its consulting community. This research will analyze current communication patterns during the environmental procedure, identify factors that facilitate or hinder current communications, and investigate the relationship between communications and project performance. Through this research, the OES will be able to improve its communication practices for on-time, high-quality performance in the environmental procedure.
1) [Personal Background] Tell me a little about yourself. What is your professional background? Were you trained in this specific area? How did you become involved in this area? How much experience do you have in this area? Are you a specialist in this area or a generalist? How long have you worked with this firm? Have you ever worked in the public sector?
2) [Project History] Could you tell me the GDOT project number that you participated in? (Have other project descriptions ready to be sure focus is on the correct project)
a. [Overall Information] What steps (actions) were taken to produce the report (ecological studies and NEPA document)? What were your experiences with these? What was the project length? Who was your GDOT contact? What method of communication did you engage in with GDOT? How did the ecological studies that you participated in inform the NEPA document?
b. [Initiation/Pre-award] What was the start of your firm's involvement? What type of involvement were you engaged in? How did you become aware of this opportunity? How did you learn the scope of this work? Who did you first contact at GDOT?
c. [During studies/Post-award] What were the biggest challenges associated with this project? What was the project's difficulty/complexity? Did you experience internal turnover within your firm during this project? Was there turnover in GDOT personnel you worked with? Did GDOT send any rule/regulation/procedure changes affecting the project? Did you refer to any of GDOT's materials (procedural manual, SharePoint site, GPTQ meetings, email blasts)? How often (and how) do you communicate with GDOT (which channels are easy/hard)?
d. [During review/Post-submission] What was the quality of your final product? Was this your best work? Once the report was filed, what was your experience of communicating back and forth with GDOT, including sending in drafts/corrections? What was the level of clarity of GDOT's comments? Was there any incentive to turn in rough work earlier (on time) rather than holding onto the project for longer and submitting a highly polished product? (Motivation to rush?) Did GDOT get any requests for revisions from the federal agency they (GDOT) sent it to (and if so, were you responsible for these changes)?
[After project/Post-transmission] Was this a positive experience? Did this project lead to more work?
3) [Comparisons to Other Projects] Are communication patterns with GDOT similar to patterns with other clients? Particularly for other public sector clients with environmental projects? Do other clients engage in pre-study communication, and does that help (revision number, feedback from federal agencies, duration, and quality)?
4) [Suggestions] How could GDOT improve its working relationships? How could GDOT improve communication practices? How could GDOT facilitate communicating quality expectations? What feedback have you provided to GDOT in the past, and did you see results?
5) [Firm Experience] Tell me more about your firm and its operations. Is GDOT a major client for your firm? What are your major business sources? What is your firm's core competency (specialty)? Has your firm undergone any major changes (particularly during this project), such as a change in ownership or being bought out? What region does your firm service? Does your firm have subcontractors (particularly for this project)?
6) [Personal remark for closing] Did you have a good working relationship with GDOT? What are things GDOT does that positively contribute to your work? Did this project lead you to rethink the way you approached future work? Did you learn anything from it?
C.2. OES Staff Interview Protocol
Purpose: This study aims to develop effective strategies for communicating performance expectations between the Office of Environmental Services (OES) and its consulting community. This research will analyze current communication patterns during the environmental procedure, identify factors that facilitate or hinder current communications, and investigate the relationship between communications and project performance. Through this research, the OES will be able to improve its communication practices for on-time, high-quality performance in the environmental procedure.
1) [Personal Background] Tell me a little about yourself. What is your professional background? Were you trained in this specific area? How did you become involved in this area? How much experience do you have in this area? How long have you worked with GDOT? Have you ever worked in the private sector?
2) [Project History] Could you tell me the GDOT project number that you participated in? (Have other project descriptions ready to be sure focus is on the correct project.)
a. [Overall Information] What steps (actions) were taken to get the report (ecological studies and NEPA document) approved? What were your experiences with these? What was the project length? Who was your consultant contact? What method of communication did you engage in with consultants? How did the ecological studies that you reviewed inform the NEPA documentation?
b. [Initiation/Pre-award] What was the start of your involvement? How did you become aware that this project had been assigned to you? How did you learn the scope of this project? Whom did you first contact at the consulting firm, and when? Did you work with anyone else within GDOT on this project?
c. [During studies/Post-award] What were the biggest challenges associated with this project? What was the project's difficulty/complexity? Did you experience internal turnover within OES during this project? Was there turnover in the consulting firm you worked with? Was there turnover amongst the GDOT personnel working on this project? Did you send any rule/regulation/procedure changes affecting the project to the consultants? Did you send or inform the consultants about any of GDOT's materials (procedural manual, SharePoint site, GPTQ meetings, email blasts)? How often (and how) do you communicate with the consultants (which channels are easy/hard)?
d. [During review/Post-submission] What was the quality of the final product? Was it the consultant's best work? Once the report was filed, what was your experience of communicating back and forth with the consultants, including sending in drafts/corrections? What was the level of the consultant's understanding of your comments? Do you think that there was any incentive for the consultants to turn in rough work earlier (on time) rather than holding onto the project for longer and submitting a highly polished product? (Motivation to rush?) Did you get any requests for revisions from the federal agency you sent it to (and if so, were you responsible for these changes)?
[After project/Post-transmission] Was this a positive experience?
3) [Comparisons to Other Projects] Have you worked in other public sector agencies? Are communication patterns by GDOT similar to patterns you experienced with other agencies? Particularly for other public sector agencies with environmental projects? Do you engage in pre-study communication? Do other agencies engage in pre-study communication, and does that help (revision number, feedback from federal agencies, duration, and quality)? [If respondent has worked in the private sector] What are communications with consultants like in your private sector experience?
4) [Suggestions] How could OES improve its working relationships with consultants? How could OES improve communication practices? How could OES facilitate communicating quality expectations? What feedback have you received from consultants in the past, and what have you done to see results? Are there opportunities for improving communications with other government agencies involved in environmental review? Are there opportunities for improving communications within GDOT between offices and departments?
5) [Firm Experience] Tell me more about the consulting firm that you worked with on this project. Is the consulting firm a major partner of OES? What do you think is the firm's core competency (specialty)? Would you like to see OES use this firm for more projects? Does this firm have the capacity to take on more projects? Have there been problems or challenges in working with this firm on past projects? Do the consultants in this firm have a good grasp of the quality expectations of OES and GDOT?
6) [Personal remark for closing] Did you have a good working relationship with the consultants on this project? What are things consultants do that positively contribute to your work? Did this project lead you to rethink the way you approach future work? Did you learn anything from it?
Case Study Checklist Questionnaire

Name:

Please indicate your opinion about each communication channel.

Rating scales:
Usefulness (How useful is _____?): 1 = Very useless, 2 = Useless, 3 = Somewhat useful, 4 = Useful, 5 = Very useful
Communication Clarity (How clear is communication via _____?): 1 = Very unclear, 2 = Unclear, 3 = Somewhat clear, 4 = Clear, 5 = Very clear
Accessibility (How accessible is _____?): 1 = Very inaccessible, 2 = Inaccessible, 3 = Somewhat accessible, 4 = Accessible, 5 = Very accessible

Communication Channel | Usefulness | Communication Clarity | Accessibility
Environmental Procedure Manual | | |
Email Blast | | |
SharePoint website | | |
Quarterly meeting | | |
Email | | |
Telephone/fax | | |
Workshop | | |
In-person meeting | | |
Other ( ) | | |

Please indicate your experience with each communication channel.

Rating scales:
Frequency (How often did you use _____?): 1 = Never used, 2 = Occasionally used, 3 = Frequently used, 4 = Very frequently used
Timing (When did you use _____?; multiple choice): 1 = Pre-award, 2 = Post-award/Pre-submission, 3 = Post-submission, 4 = Post-review
Contents (For which issue did you use _____?; multiple choice): 1 = Document request, 2 = Procedure issues, 3 = Regulatory changes, 4 = Document deficiency & revision, 5 = Other (please indicate)

Communication Channel | Frequency | Timing | Contents
Environmental Procedure Manual | | |
Email Blast | | |
SharePoint website | | |
Quarterly meeting | | |
Email | | |
Telephone/fax | | |
Workshop | | |
In-person meeting | | |
Other ( ) | | |

Focus Group Participating Firms

Table E-1 Focus Group Participating Firms

Firm Name | Participants
Firm A | 1
Firm B | 2
Firm C | 2
Firm D | 2
Firm E | 1
Firm F | 1
Firm G | 1
Firm H | 2
Firm I | 2
Firm J | 1
Firm K | 1
Firm L | 1
Firm M | 1
Firm N | 3
Firm O | 1
Total | 22

Of the 22 participants, 5 were ecology consultants, 7 were managers, and 10 were NEPA consultants.

Focus Group Protocol
Communications and Performance of Contractors at GDOT's Office of Environmental Services
Focus Group Protocol
Objective:
The focus groups provide an opportunity to observe an important group of informants discussing what transpires in a key area of practice. In our case, we wish to understand the actions of consultants as they communicate with GDOT at various levels, especially concerning the preparation and review of environmental documents. By observing their interactions, and not only their answers to our questions, we can learn more than is possible through interviews with individual consultants.
Procedure:
The focus group activity will be carried out in two phases. The first phase will consist of reactions to scenarios of environmental document preparation and review prepared by the research team. The purpose of this phase is to foster and observe a dialogue among the groups regarding their experiences in working with GDOT. We intend to observe whether, and in what ways, their experiences are similar to or different from the scenarios they are presented with.
The second phase will consist of a general exploration of the consultants' experiences in working on environmental projects for the public sector, using a number of probing questions intended to encourage comparisons and contrasts among consultants.
Key Theoretical Framing:
The focus groups aim to provide evidence on the hypothesized relationship between the communication patterns of consultants and GDOT-OES and the relatively low quality of performance by the consulting community (as evidenced by the number of returned documents).
PHASE 1
Moderation Items for Phase 1
The consultants will be presented with two scenarios that reflect a range of performance from the consulting community, including some contrasting possibilities. The two scenarios will be the same across all focus groups. If time permits, we can add a third case and vary this across the focus groups. The scenarios will be pre-loaded on the tablets along with a short set of questions that participants can react to prior to the beginning of the discussion.
a. Pre-meeting questions
Please respond to these questions on a 1-5 scale where "1" means "I completely disagree", "2" means "I somewhat disagree", "3" means "I neither agree nor disagree", "4" means "I somewhat agree" and "5" means "I completely agree".
1. The scenario is close to an experience I (or my team) have had working with GDOT.
2. The scenario presents consultant performance that is worse than actually observed in my experience.
3. The scenario presents GDOT reviewer performance that is worse than actually observed in my experience.
4. The scenario is missing important communication channels that I use frequently when working with GDOT.
5. The performance shown in the scenario, either of GDOT or the consultants, is not related to communication patterns.
b. Assessment of Scenarios as Realistic Representations of Consultant Experience
Similarity of scenarios with experience: Have you had an experience working with GDOT that is similar to these scenarios?
- Which scenario resonates with you as representative of your working relationship with GDOT-OES?
- Are there aspects of each scenario that reflect an experience that you have had working with GDOT-OES? [Focus on each scenario reviewed]
- Are there important aspects of your working relationship with GDOT-OES that are not represented in the scenarios?
Effective use of communication channels: In Scenario 1, we see that GDOT uses a variety of channels for communication, including the OES project manual, templates, email blasts, SharePoint, direct communication with project managers, and direct communication with OES staff. Provide your views on the following:
- Which of the communication channels were not used effectively in each scenario?
- How important is each of these communication channels in your own work with GDOT-OES?
- Are there any important communication channels missing from the scenarios?
Consultant's Reports and GDOT Responses: In Scenario 2, we see the following performance outcome with regard to the consultant's reports: an immediate call to a workshop of prime and sub consultants without a round of comments. Provide your views on the following:
- Is this situation a direct response of GDOT to the perceived level of performance of the consultants?
- What factors might have led to this situation?
- Has your firm run into any of these performance challenges in your projects with GDOT-OES?
- How might performance be improved in each scenario? Could better communications produce a better result?
PHASE 2
Moderation Items for Phase 2
The questions are not ordered by priority. They may be used in any order, depending on how the participants' dialogue evolves. Since this is not a Q&A session but rather an elicitation of narratives, the moderating team will use visuals to summarize the progress of the discussion.
Typical Communications: Please describe how communication with GDOT typically unfolds on an environmental project. [Use white board to monitor progress of participant contributions]
- Who is your typical first point of contact on a new project?
- How are you informed that you are responsible for producing a new set of documents?
- How many different offices are you likely to interact with on a project?
- When do you first start talking with someone in OES?
- What channels of communication do you use on an OES project?
- How frequently do you interact with someone in GDOT? In what form?
- What topics/issues prompt direct communication?
Normal Job and Communications: Do you consider it a normal part of your job to coordinate communication across the staff within GDOT?
- What leads you to undertake this task?
- How often do you have to do this? Is this billable?
- Which offices are you coordinating communication between?
Technological Uncertainty of Environmental Work: When conducting environmental work for GDOT:
- Are you faced with needing to use or access new methods, approaches, or technologies, or ones you haven't used before?
- Do most projects require known technologies and procedures to complete?
- Are the technologies and procedures required by GDOT up to industry standards?
- Have you ever had to adopt new technology or procedures? If so, does this influence your communication patterns with GDOT? Does it influence your ability to produce high quality work product?
Technical Complexity: What is the level of technical complexity of most of your projects with GDOT?
- Are most of your projects focused on producing environmental reports for OES?
- What types of factors increase the technical complexity of your reports?
- Is your work ever a component of a larger design project that your company is producing for GDOT? How about something more complex, like a design-bid-build?
- If your work is a component of a larger project, how does coordination within your company influence your communications with GDOT?
- Does the technical scope of your firm's work influence your ability to produce high quality reports for GDOT?
Knowledge Flow Between Firm and GDOT: Do you mostly provide knowledge to or receive knowledge from GDOT?
- Do you find it a regular part of your job to improve the knowledge base of GDOT staff? In what ways does this happen?
- Do you find GDOT to be a source of knowledge from which you can improve your operations?
Challenges for High Quality Environmental Documents: What are the typical challenges to producing a high quality environmental document? How frequently does your firm have documents returned for further development and editing?
- What types of challenges do you experience in generating a high quality report for GDOT-OES?
- What are the typical reasons a document is returned?
- Does your firm track how often and how many times a document is returned?
- Are revisions on an environmental document billable?
- How does communication with GDOT influence your ability to produce a high quality environmental document?
  o Does this vary between ecological documents and NEPA documents?
- Are there other factors that are more important than communications in influencing your ability to produce a high quality report?
GDOT Compared to Other Organizations: How does GDOT's process for producing environmental documents compare to your experience in working with other organizations? Other public agencies? Private sector firms?
- How are communication patterns different with other clients?
- What level of turnover have you experienced amongst GDOT staff on your projects?
  o How does this turnover rate compare with other public agencies? Private firms?
- What level of turnover have you experienced within your firm?
- How are performance patterns different with other clients in terms of quality?
Improvements in GDOT's Processes: GDOT has undertaken efforts at improving communications and its processes of consultant management in recent years.
- Are there areas where these activities have improved your ability to produce high quality environmental documents?
- Are there ways in which these activities have hindered your ability to produce high quality environmental documents?
- Does GDOT have a clear set of standards for what a high quality environmental document is? What are these standards?
  o Is there any variance across your working relationships with GDOT-OES staff?
Contractual Specifications and Obligations: What does the contract with GDOT specify about the document preparation process?
- Do the contractual obligations or specifics come up in interactions with GDOT?
- Are there any items of a contractual nature that are sensitive or a subject of concern with respect to performance in working with OES?
Pre-meeting Survey and Results
Survey Overview
A pre-meeting survey was conducted to help the research team prioritize topics and scenarios for discussion during the focus groups. The survey also helped the research team identify strategies and processes to improve communications between the environmental consulting community and the Office of Environmental Services (OES) at GDOT.
Along with the focus group invitation, the survey questionnaire was sent to all 40 of the consultants invited to participate in the focus groups. There were 24 survey responses (60%). This sample includes people who answered the survey but did not attend a focus group, as well as people who participated in both the pre-meeting survey and a focus group.
Survey Results

Q1 - Currently, OES reports a high incidence of returning environmental reports submitted for review. How critical are the following factors to timely submission of high-quality environmental documents?

Question | Very Uncritical | Uncritical | Somewhat Critical | Critical | Very Critical | Total
Overall Complexity of the Transportation Project | 0.00% (0) | 8.33% (2) | 33.33% (8) | 25.00% (6) | 33.33% (8) | 24
Complexity of the Environment of the Project Site | 0.00% (0) | 4.17% (1) | 29.17% (7) | 25.00% (6) | 41.67% (10) | 24
Project Sponsored by a Local Government | 12.50% (3) | 37.50% (9) | 20.83% (5) | 16.67% (4) | 12.50% (3) | 24
Consulting Firm's Experience Working with GDOT | 4.17% (1) | 8.33% (2) | 20.83% (5) | 33.33% (8) | 33.33% (8) | 24
Miscommunication with Other GDOT Offices | 8.33% (2) | 16.67% (4) | 20.83% (5) | 41.67% (10) | 12.50% (3) | 24
Design Changes | 0.00% (0) | 0.00% (0) | 16.67% (4) | 25.00% (6) | 58.33% (14) | 24
Regulatory/Procedural Changes | 0.00% (0) | 0.00% (0) | 20.83% (5) | 50.00% (12) | 29.17% (7) | 24
GDOT Staff Turnover | 0.00% (0) | 8.33% (2) | 20.83% (5) | 29.17% (7) | 41.67% (10) | 24
Consultant Turnover | 4.17% (1) | 12.50% (3) | 58.33% (14) | 12.50% (3) | 12.50% (3) | 24
Seniority of GDOT Reviewer | 0.00% (0) | 8.33% (2) | 29.17% (7) | 41.67% (10) | 20.83% (5) | 24
Terms of the Contract with GDOT | 8.33% (2) | 37.50% (9) | 16.67% (4) | 16.67% (4) | 20.83% (5) | 24
Consulting Firm as a Subcontractor | 20.83% (5) | 37.50% (9) | 29.17% (7) | 8.33% (2) | 4.17% (1) | 24

Field | Min. | Max. | Mean | S.D. | Var. | Count | Bottom 3 Box | Top 3 Box
Overall Complexity of the Transportation Project | 2.00 | 5.00 | 3.83 | 0.99 | 0.97 | 24 | 41.67% | 91.67%
Complexity of the Environment of the Project Site | 2.00 | 5.00 | 4.04 | 0.93 | 0.87 | 24 | 33.33% | 95.83%
Project Sponsored by a Local Government | 1.00 | 5.00 | 2.79 | 1.22 | 1.50 | 24 | 70.83% | 50.00%
Consulting Firm's Experience Working with GDOT | 1.00 | 5.00 | 3.83 | 1.11 | 1.22 | 24 | 33.33% | 87.50%
Miscommunication with Other GDOT Offices | 1.00 | 5.00 | 3.33 | 1.14 | 1.31 | 24 | 45.83% | 75.00%
Design Changes | 3.00 | 5.00 | 4.42 | 0.76 | 0.58 | 24 | 16.67% | 100.00%
Regulatory/Procedural Changes | 3.00 | 5.00 | 4.08 | 0.70 | 0.49 | 24 | 20.83% | 100.00%
GDOT Staff Turnover | 2.00 | 5.00 | 4.04 | 0.98 | 0.96 | 24 | 29.17% | 91.67%
Consultant Turnover | 1.00 | 5.00 | 3.17 | 0.94 | 0.89 | 24 | 75.00% | 83.33%
Seniority of GDOT Reviewer | 2.00 | 5.00 | 3.75 | 0.88 | 0.77 | 24 | 37.50% | 91.67%
Terms of the Contract with GDOT | 1.00 | 5.00 | 3.04 | 1.31 | 1.71 | 24 | 62.50% | 54.17%
Consulting Firm as a Subcontractor | 1.00 | 5.00 | 2.38 | 1.03 | 1.07 | 24 | 87.50% | 41.67%
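The summary statistics above (and in the corresponding table for Q3 below) follow directly from the response distributions. As an illustrative sketch only, and not part of the survey tooling used in this study, the following Python snippet shows how the mean, standard deviation, variance, and bottom/top 3 box shares can be recomputed from the response counts for any item; the counts for "Design Changes" serve as the example.

# Illustrative sketch (not from the original study): recompute the
# summary statistics for one Likert item from its response counts.
# Scale: 1 = Very Uncritical ... 5 = Very Critical.
def summarize(counts):
    """counts[i] is the number of respondents who chose rating i + 1."""
    n = sum(counts)
    mean = sum((i + 1) * c for i, c in enumerate(counts)) / n
    # Population variance, which reproduces the Var. column above.
    var = sum(c * ((i + 1) - mean) ** 2 for i, c in enumerate(counts)) / n
    bottom3 = sum(counts[:3]) / n  # share of responses rated 1-3
    top3 = sum(counts[2:]) / n     # share of responses rated 3-5
    return mean, var ** 0.5, var, bottom3, top3

# "Design Changes" received 0, 0, 4, 6, and 14 responses for ratings 1-5.
mean, sd, var, bottom3, top3 = summarize([0, 0, 4, 6, 14])
print(f"mean={mean:.2f} sd={sd:.2f} var={var:.2f}")  # mean=4.42 sd=0.76 var=0.58
print(f"bottom3={bottom3:.2%} top3={top3:.2%}")      # bottom3=16.67% top3=100.00%

Note that the Bottom 3 Box and Top 3 Box shares overlap in the middle rating category, which is why they can sum to more than 100 percent.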

Q2 - If there is any other critical factor, please indicate it. [Note: The following responses are quotes from the survey participants.]
- Consultants are continually asked to do more work for less money. This makes completing a thorough internal review of a document prior to submitting to OES more difficult.
- The situation is not as simple as this. Ultimately, the quality of the initial document depends solely on the consultant team, which prepares it. On that side, document quality is driven by the consultant's field experience, regulatory knowledge, experience and knowledge of GDOT requirements and expectations, writing ability, and QAQC process. There is no such thing as a perfect document. Even if there was a perfect document, GDOT would still return it with two rounds of comments because the majority of the comments (at least in my experience) are petty changes related to reviewer preference. On the GDOT side, reviewers need to adhere to their own templates and focus on comments that relate to technical accuracy and legal sufficiency.
- Inconsistency among different reviewers suggests inadequate reviewer training; excessive non-substantive comments solely to appease reviewer preferences; lack of timely communication between project teams that affects schedules; over-documentation leading to increasingly larger reports with redundant data presentation; re-reviewing of the entire revised reports instead of just back-checking the requested changes.
- Include SME's [Subject Matter Experts] prior to fieldwork.
- A critical factor is the lack of oversight by senior GDOT staff. Although senior GDOT staff review all comments, they continue to allow way too many comments to be made that reflect the personal writing style or preferences of the GDOT reviewer.
- Clarifications on above: Seniority of GDOT reviewer - by this I would like to clarify that more senior reviewers tend to capture bigger picture issues and not focus on minor insubstantial comments that may merit revision. Scope - there are ongoing agency scoping requests that cause expectations of deliverables to change even during review processes. This does not allow for streamlining and is a cause for deliverable approval delay. Evolving standards - it would be useful to have a running list of current items that could be under consideration for each discipline, so someone who hasn't produced a particular document in a while, e.g. LT [Logical Termini] Form, could see what the latest agency considerations could be and either talk to the GDOT SME about it and avoid a comment during document review.
- Not being informed of changes in the required reporting policies.
- Extreme inconsistency between GDOT reviewers. Using consultants to review consultant documents. GDOT internal documents are held to a much lower standard than consultant documents.
- Communication between GDOT PM, designer, and consultant ecologist on any project changes. Also development of bridge plans and the ability to determine temporary/permanent impacts with bridge design.
- Sometimes there is an issue regarding the lack of coordination between the various special study groups within OES, the GDOT PM, and other GDOT offices. In the past, the assigned OES NEPA person clearly coordinated this effort and was the point person. This was more effective with regard to clarifying report expectations and keeping the GDOT review focused, eliminating the need for multiple reviews and keeping the project on schedule.
- Lack of experience that OES has with scoping projects.
- Lack of consistency in the review process. GDOT prepared documents and consultant prepared documents should be held to the same standards.
- Review should focus more on the technical content rather than the reader-friendly and grammatical perspective. This is especially true when OES is trying to transmit documents on behalf of the consultants to other federal agencies for concurrence.
- Expectations by other state and federal agencies that must approve or concur on environmental documents once they have been submitted to these agencies by GDOT.
- Inconsistency between reviewers affects the overall quality of the documents and the efficiency with which they are completed.
Q3 - Our preliminary case studies suggest several strategies for improving current communication practices. How effective are the following approaches likely to be in improving communications and the timely submission of high-quality environmental documents?

Question | Very Ineffective | Ineffective | Somewhat Effective | Effective | Very Effective | Total
On-board training for firms new to GDOT projects | 0.00% (0) | 12.50% (3) | 50.00% (12) | 29.17% (7) | 8.33% (2) | 24
Expanded use of online tools (T-pro, SharePoint, FTP) | 4.17% (1) | 16.67% (4) | 45.83% (11) | 25.00% (6) | 8.33% (2) | 24
Easy access to T-pro and SharePoint | 8.33% (2) | 4.17% (1) | 37.50% (9) | 29.17% (7) | 20.83% (5) | 24
Expanded project information in T-pro comments | 8.33% (2) | 37.50% (9) | 33.33% (8) | 16.67% (4) | 4.17% (1) | 24
Avoidance of GDOT staff turnover | 4.17% (1) | 4.17% (1) | 25.00% (6) | 54.17% (13) | 12.50% (3) | 24
Hiring consultant reviewers | 4.17% (1) | 20.83% (5) | 29.17% (7) | 33.33% (8) | 12.50% (3) | 24
Making a pre-submission review step | 16.67% (4) | 33.33% (8) | 16.67% (4) | 20.83% (5) | 12.50% (3) | 24
Expanded use and modification of templates | 0.00% (0) | 0.00% (0) | 25.00% (6) | 37.50% (9) | 37.50% (9) | 24
Use of deliverable checklist | 0.00% (0) | 20.83% (5) | 20.83% (5) | 25.00% (6) | 33.33% (8) | 24
Environmental Procedural Manual update | 0.00% (0) | 12.50% (3) | 20.83% (5) | 20.83% (5) | 45.83% (11) | 24
Flexible review timeline at reviewer's discretion | 16.67% (4) | 41.67% (10) | 4.17% (1) | 29.17% (7) | 8.33% (2) | 24
Early workshop | 4.17% (1) | 16.67% (4) | 25.00% (6) | 41.67% (10) | 12.50% (3) | 24
Dedicated GDOT staff for information dissemination and T-pro and SharePoint update | 8.33% (2) | 16.67% (4) | 33.33% (8) | 25.00% (6) | 16.67% (4) | 24
Regular meetings with PM, designer, and reviewers after the kick-off meeting | 0.00% (0) | 4.17% (1) | 16.67% (4) | 45.83% (11) | 33.33% (8) | 24
Active coordination by NEPA analysts | 0.00% (0) | 8.33% (2) | 16.67% (4) | 33.33% (8) | 41.67% (10) | 24
Trainings for District/Design Offices | 4.17% (1) | 12.50% (3) | 45.83% (11) | 16.67% (4) | 20.83% (5) | 24
Consultant evaluation system by GDOT | 16.67% (4) | 20.83% (5) | 37.50% (9) | 16.67% (4) | 8.33% (2) | 24
Penalty for delayed submissions of incomplete documents | 41.67% (10) | 37.50% (9) | 16.67% (4) | 0.00% (0) | 4.17% (1) | 24
Incentives for timely submission of no-return documents | 29.17% (7) | 20.83% (5) | 33.33% (8) | 8.33% (2) | 8.33% (2) | 24

Field | Min. | Max. | Mean | S.D. | Var. | Count | Bottom 3 Box | Top 3 Box
On-board training for firms new to GDOT projects | 2.00 | 5.00 | 3.33 | 0.80 | 0.64 | 24 | 62.50% | 87.50%
Expanded use of online tools (T-pro, SharePoint, FTP) | 1.00 | 5.00 | 3.17 | 0.94 | 0.89 | 24 | 66.67% | 79.17%
Easy access to T-pro and SharePoint | 1.00 | 5.00 | 3.50 | 1.12 | 1.25 | 24 | 50.00% | 87.50%
Expanded project information in T-pro comments | 1.00 | 5.00 | 2.71 | 0.98 | 0.96 | 24 | 79.17% | 54.17%
Avoidance of GDOT staff turnover | 1.00 | 5.00 | 3.67 | 0.90 | 0.81 | 24 | 33.33% | 91.67%
Hiring consultant reviewers | 1.00 | 5.00 | 3.29 | 1.06 | 1.12 | 24 | 54.17% | 75.00%
Making a pre-submission review step | 1.00 | 5.00 | 2.79 | 1.29 | 1.66 | 24 | 66.67% | 50.00%
Expanded use and modification of templates | 3.00 | 5.00 | 4.13 | 0.78 | 0.61 | 24 | 25.00% | 100.00%
Use of deliverable checklist | 2.00 | 5.00 | 3.71 | 1.14 | 1.29 | 24 | 41.67% | 79.17%
Environmental Procedural Manual update | 2.00 | 5.00 | 4.00 | 1.08 | 1.17 | 24 | 33.33% | 87.50%
Flexible review timeline at reviewer's discretion | 1.00 | 5.00 | 2.71 | 1.27 | 1.62 | 24 | 62.50% | 41.67%
Early workshop | 1.00 | 5.00 | 3.42 | 1.04 | 1.08 | 24 | 45.83% | 79.17%
Dedicated GDOT staff for information dissemination and T-pro and SharePoint update | 1.00 | 5.00 | 3.25 | 1.16 | 1.35 | 24 | 58.33% | 75.00%
Regular meetings with PM, designer, and reviewers after the kick-off meeting | 2.00 | 5.00 | 4.08 | 0.81 | 0.66 | 24 | 20.83% | 95.83%
Active coordination by NEPA analysts | 2.00 | 5.00 | 4.08 | 0.95 | 0.91 | 24 | 25.00% | 91.67%
Trainings for District/Design Offices | 1.00 | 5.00 | 3.38 | 1.07 | 1.15 | 24 | 62.50% | 83.33%
Consultant evaluation system by GDOT | 1.00 | 5.00 | 2.79 | 1.15 | 1.33 | 24 | 75.00% | 62.50%
Penalty for delayed submissions of incomplete documents | 1.00 | 5.00 | 1.88 | 0.97 | 0.94 | 24 | 95.83% | 20.83%
Incentives for timely submission of no-return documents | 1.00 | 5.00 | 2.46 | 1.22 | 1.50 | 24 | 83.33% | 50.00%

Q4 - If there is any other strategy, please indicate it. [Note: The following responses are quotes from the survey participants.]
- PM trainings on schedule considerations in areas of special studies.
- Under the current system, there will never be a "no-return" document. That will not happen until the GDOT reviewers are trained to stick to their own templates and focus their comments on issues of technical accuracy and legal sufficiency, as opposed to petty issues driven by the personal preference of the reviewer. Ultimately, the power of embarrassment should be considered as a tool. Many consultants would be quite embarrassed if the poor quality of their product were exposed to current and potential clients. Similarly, many GDOT reviewers would be embarrassed if they were called to account for petty, inconsequential comments made on otherwise sufficient documents.
- More coordination by NEPA analysts would be welcome - almost serving as an "OES PM". More training of district/design offices would be great - particularly in preparing plans appropriate for ecology submission. Regular status meetings help keep everyone on board. Any one-size-fits-all solution, like a "deliverable checklist", just creates additional paperwork that will slow down submittals... just like the prime verification letter has.
- Comment on the last two items: I think the penalty issue is not effective. There are numerous variables in getting environmental documents approved, many of which are out of the control of the SME. Incentives are the same thing... maybe a positive reinforcement could be an incentive, but it may not be within the control of the SME.
- Get rid of the attitude. There needs to be a shift away from the us vs. them attitude.
- Streamline the report to be less redundant.
- Reducing the number of projects per OES staffer.
- Uniform communication of procedural or reporting-content [changes] with clear "grandfathering" guidance for projects/assessments already underway or in review.
- Elimination of discretionary data and narrative from otherwise legally-sufficient/compliant report documentation. Streamline special study reports and remove redundancies and extraneous information. Some of these reports would benefit from a more checklist-like approach.
- 1 - Consultants get paid to produce an approved NEPA document; there should be no need for penalties or incentives. 2 - Supplemental(s) should be allowed more often to the contract to address: scope creep, multiple reviews by various GDOT OES reviewers, design delays to force special studies to be redone/amended, etc. 3 - The TPRO database should be amended to include milestone completions throughout the NEPA process (i.e., air completed, noise completed, HRSR completed, PIOH completed, ecology pending because...) and the schedule of design (PFPR completed, ROW delayed because...), etc. This will allow the 'entire' team (consultant, GDOT Design PM, OES, GDOT Management) to view over the life of the project the stage at which NEPA is complete to date and provide everyone with a 'snap shot' of what is left to do. It will also assist new and junior staff (GDOT and consultants) involved in the NEPA process in understanding the overall process and appreciating the length of time to complete all the tasks involved. This also could be used as a future tool by GDOT in understanding historically why NEPA documents are on schedule or not (where are the delays historically).
- One way to improve the communication is to have consistent reviews and also have well defined guidelines for documents that are regularly updated and disseminated in a timely fashion for the consultant community.
- Internal GDOT offices' communication via e-mail resulting in changes to project schedule and decisions on key issues needs to be shared with the consultants.
- Analyzing where communication breaks down on both sides (consultant and GDOT).
- Revise reporting requirements.
Q5 - Are you participating in OES projects as an environmental consultant or ecology consultant?

Answer | % | Count
Environmental Consultant | 13.04% | 3
Ecology Consultant | 13.04% | 3
Both | 73.91% | 17
Total | 100% | 23

Q6 - How many years have you been working with GDOT?
Answer | % | Count
1-5 years | 4% | 1
6-10 years | 25% | 6
11-15 years | 21% | 5
16-20 years | 38% | 9
20+ years | 13% | 3
Total | 100% | 24
Average: 15 years

Q7 - In what capacities have you worked on GDOT environmental and/or design projects? (Check all that apply)

Answer | % | Count
Current firm only | 29.17% | 7
I have worked for GDOT at other consulting firms prior to my current position | 79.17% | 19
Former GDOT employee | 8.33% | 2
Total | 100% | 24

Q8 - Within OES, which units have you worked with on projects? (Check all that apply)

Answer | % | Count
NEPA section | 100.00% | 24
Ecology section | 100.00% | 24
Cultural resource section | 87.50% | 21
Air/Noise section | 79.17% | 19
Total | 100% | 24

Scenario Assessment Survey
Scenario 1 Narrative
[Footnote 18: Firms A and B in Scenario 1 are different from the Firms A and B listed in Tables A-8, A-9, and E-1.]
[The Scenario 1 narrative appeared here as a multi-page exhibit and is not reproduced in this text version.]

Scenario 2 Narrative
[Footnote 19: Firm C is different from the Firm C listed in Tables A-8, A-9, and E-1.]
[The Scenario 2 narrative appeared here as a multi-page exhibit and is not reproduced in this text version.]
