
GEORGIA DOT RESEARCH PROJECT 22-29 Final Report
ADAPTION OF DRONE MEASUREMENT TECHNIQUES TO ROUNDABOUTS AND
INNOVATIVE INTERSECTIONS
Office of Performance-based Management and Research
600 West Peachtree Street NW | Atlanta, GA 30308 | July 2023

1. Report No.: FHWA-GA-23-2229

2. Government Accession No.: N/A

3. Recipient's Catalog No.: N/A

4. Title and Subtitle: Adaption of Drone Measurement Techniques to Roundabouts and Innovative Intersections

5. Report Date: July 2023

6. Performing Organization Code: N/A

7. Author(s): Michael O. Rodgers, Ph.D. (https://orcid.org/0000-0001-6608-9333); Anqi Wei (https://orcid.org/0000-0003-1041-1754); Caleb Weed (https://orcid.org/0000-0002-7351-2416)

8. Performing Organization Report No.: 22-29

9. Performing Organization Name and Address: Georgia Tech Research Corporation, School of Civil and Environmental Engineering, 790 Atlantic Dr. NW, Atlanta, GA 30332; Phone: (404) 385-0569; Email: michael.rodgers@ce.gatech.edu

10. Work Unit No.: N/A

11. Contract or Grant No.: RP22-29

12. Sponsoring Agency Name and Address: Georgia Department of Transportation, Office of Performance-based Management and Research, 600 West Peachtree St. NW, Atlanta, GA 30308

13. Type of Report and Period Covered: Final Report (April 2022–August 2023)

14. Sponsoring Agency Code: N/A

15. Supplementary Notes: Prepared in cooperation with the U.S. Department of Transportation, Federal Highway Administration.

16. Abstract

With the advancement of drone data collection and computer-vision techniques, both inexpensive high-resolution video drones and commercial services offering automatic drone video analysis have become accessible to most transportation agencies. Despite the great potential of drone video data for collecting traffic operational and safety-related parameters, the absence of user-friendly tools, standard procedures, and training materials remains an important barrier to implementation among state DOTs and other transportation agencies. This project therefore sought to accelerate knowledge transfer by documenting the drone data collection and analysis methods adopted in previous research studies in the form of standard operating procedures and a series of training resources, supplemented by additional drone video data collected from various types of innovative intersections. In addition, a regional data needs survey was conducted among selected state DOTs to identify their most desired data products, and a real-world case study of truck turning template analysis was carried out to demonstrate one non-traditional drone application.

17. Keywords: Intersections; Roundabouts; Drones

18. Distribution Statement: No Restriction

19. Security Classification (of this report): Unclassified

20. Security Classification (of this page): Unclassified

21. No. of Pages: 168

22. Price: Free

GDOT Research Project 22-29 Final Report
ADAPTION OF DRONE MEASUREMENT TECHNIQUES TO ROUNDABOUTS AND INNOVATIVE INTERSECTIONS
By

Michael O. Rodgers, Ph.D.
Regents' Researcher and Adjunct Regents' Professor

Anqi Wei
Graduate Research Assistant

Caleb Weed
Graduate Research Assistant

School of Civil and Environmental Engineering
Georgia Institute of Technology
Atlanta, GA 30332
Contract with Georgia Department of Transportation
In cooperation with U.S. Department of Transportation, Federal Highway Administration
July 2023
The contents of this report reflect the views of the authors, who are responsible for the facts and accuracy of the data presented herein. The contents do not necessarily reflect the official views of the Georgia Department of Transportation. This report does not constitute a standard, specification, or regulation.

SI* (MODERN METRIC) CONVERSION FACTORS

APPROXIMATE CONVERSIONS TO SI UNITS

Symbol   When You Know               Multiply By               To Find                      Symbol
LENGTH
in       inches                      25.4                      millimeters                  mm
ft       feet                        0.305                     meters                       m
yd       yards                       0.914                     meters                       m
mi       miles                       1.61                      kilometers                   km
AREA
in2      square inches               645.2                     square millimeters           mm2
ft2      square feet                 0.093                     square meters                m2
yd2      square yards                0.836                     square meters                m2
ac       acres                       0.405                     hectares                     ha
mi2      square miles                2.59                      square kilometers            km2
VOLUME
fl oz    fluid ounces                29.57                     milliliters                  mL
gal      gallons                     3.785                     liters                       L
ft3      cubic feet                  0.028                     cubic meters                 m3
yd3      cubic yards                 0.765                     cubic meters                 m3
NOTE: volumes greater than 1000 L shall be shown in m3
MASS
oz       ounces                      28.35                     grams                        g
lb       pounds                      0.454                     kilograms                    kg
T        short tons (2000 lb)        0.907                     megagrams (or "metric ton")  Mg (or "t")
TEMPERATURE (exact degrees)
oF       Fahrenheit                  5 (F-32)/9 or (F-32)/1.8  Celsius                      oC
ILLUMINATION
fc       foot-candles                10.76                     lux                          lx
fl       foot-Lamberts               3.426                     candela/m2                   cd/m2
FORCE and PRESSURE or STRESS
lbf      poundforce                  4.45                      newtons                      N
lbf/in2  poundforce per square inch  6.89                      kilopascals                  kPa

APPROXIMATE CONVERSIONS FROM SI UNITS

Symbol       When You Know                Multiply By  To Find                     Symbol
LENGTH
mm           millimeters                  0.039        inches                      in
m            meters                       3.28         feet                        ft
m            meters                       1.09         yards                       yd
km           kilometers                   0.621        miles                       mi
AREA
mm2          square millimeters           0.0016       square inches               in2
m2           square meters                10.764       square feet                 ft2
m2           square meters                1.195        square yards                yd2
ha           hectares                     2.47         acres                       ac
km2          square kilometers            0.386        square miles                mi2
VOLUME
mL           milliliters                  0.034        fluid ounces                fl oz
L            liters                       0.264        gallons                     gal
m3           cubic meters                 35.314       cubic feet                  ft3
m3           cubic meters                 1.307        cubic yards                 yd3
MASS
g            grams                        0.035        ounces                      oz
kg           kilograms                    2.202        pounds                      lb
Mg (or "t")  megagrams (or "metric ton")  1.103        short tons (2000 lb)        T
TEMPERATURE (exact degrees)
oC           Celsius                      1.8C+32      Fahrenheit                  oF
ILLUMINATION
lx           lux                          0.0929       foot-candles                fc
cd/m2        candela/m2                   0.2919       foot-Lamberts               fl
FORCE and PRESSURE or STRESS
N            newtons                      0.225        poundforce                  lbf
kPa          kilopascals                  0.145        poundforce per square inch  lbf/in2

* SI is the symbol for the International System of Units. Appropriate rounding should be made to comply with Section 4 of ASTM E380. (Revised March 2003)

TABLE OF CONTENTS

CHAPTER 1. INTRODUCTION
    PROJECT BACKGROUND
CHAPTER 2. LITERATURE REVIEW
CHAPTER 3. PROJECT APPROACH
    DEVELOPMENT OF DRONE DATA ANALYSIS TOOLS AND PROCEDURES
    SUPPLEMENTAL DATA COLLECTION AND PROCESSING
    CASE STUDY OF TRUCK TURNING TEMPLATE ANALYSIS
    REGIONAL WORKSHOP
CHAPTER 4. REGIONAL SURVEY
    SURVEY METHODS
    ALABAMA
    KENTUCKY
    TENNESSEE
    REGIONAL TRENDS AND NEEDS
CHAPTER 5. CASE STUDY: TRUCK TURNING TEMPLATE ANALYSIS
    DATA COLLECTION
        Site Selection
        Field Data Collection
    DATA PROCESSING
    TURNING TEMPLATE ANALYSIS
    STUDY CONCLUSION
CHAPTER 6. CONCLUSIONS AND RECOMMENDATIONS
    PROJECT SUMMARY
    DRONE VIDEO DATABASE DEVELOPMENT
    ROUTINE USAGE OF DRONE DATA COLLECTION APPROACH
APPENDIX A. INNOVATIVE INTERSECTIONS USED FOR DATA COLLECTION
APPENDIX B. STANDARD OPERATING PROCEDURES FOR CHARGING DRONE BATTERIES AND REMOTE CONTROLLER
APPENDIX C. STANDARD OPERATING PROCEDURES FOR OPERATING THE DJI INSPIRE™ 2 DRONE
APPENDIX D. STANDARD OPERATING PROCEDURES FOR FIELD DATA COLLECTION
APPENDIX E. EXAMPLE OF TYPICAL ROADWAY ELEMENTS
APPENDIX F. STANDARD OPERATING PROCEDURES FOR DRONE CAMERA CALIBRATION
APPENDIX G. STANDARD OPERATING PROCEDURES FOR SAFETY
APPENDIX H. STANDARD OPERATING PROCEDURES FOR DRONE VIDEO DATA ANALYSIS
REFERENCES

LIST OF FIGURES

Figure 1. Photo. Example of drone-based video analysis technique (Gbologah et al. 2022)
Figure 2. Map. Responses to regional drone usage survey of southeastern states
Figure 3. Photo. The location of the selected truck stop
Figure 4. Photo. Flight restriction information of the two selected truck stops shown in the B4UFLY™ app
Figure 5. Photo. Overhead views of the four data collection locations at the selected truck stop
Figure 6. Photo. Labeled locations of the truck's outer front wheel and inner rear wheel during a right turn in an intersection next to the truck stop
Figure 7. Chart. The distribution of minimum turning radii and inside turning radii
Figure 8. Diagram. Connecting the Batteries with the Charging Hub
Figure 9. Diagram. Intelligent Flight Battery
Figure 10. Photo. …
Figure 11. Photo. …
Figure 12. Diagram. …
Figure 13. Diagram. …
Figure 14. …
Figure 15. …
Figure 16. …

LIST OF TABLES

Table 1. Survey respondents
Table 2. Descriptions of the Battery Level Indicators while Charging
Table 3. Descriptions of the Battery Level Indicators for Battery Protection

LIST OF ABBREVIATIONS

AASHTO    American Association of State Highway and Transportation Officials
AGL       Above Ground Level
CFI       Continuous-Flow Intersection
DDI       Diverging Diamond Interchange
DLT       Displaced Left-Turn Intersection
DOT       Department of Transportation
FAA       Federal Aviation Administration
FHWA      Federal Highway Administration
GDOT      Georgia Department of Transportation
MPH       Miles per Hour
NCHRP     National Cooperative Highway Research Program
O-D       Origin-Destination
R-CUT     Restricted Crossing U-Turn Intersection
Rd        Road
SOP       Standard Operating Procedure
SPUI      Single-Point Urban Interchange
St        Street
STRIDE    Southeastern Transportation Research, Innovation, Development and Education Center
TRB       Transportation Research Board
UAS       Unmanned Aircraft Systems
UAV       Unmanned Aerial Vehicle
UTC       University Transportation Centers

EXECUTIVE SUMMARY
Many state DOTs have identified the potential value added by drones to their agencies, and as a result drone technology has increasingly shifted from being employed primarily in research contexts to a broader collection of applications. However, most drone use by DOTs has until now been restricted to specific tasks like bridge or pavement condition surveys or to assisting the department's response to spontaneous events like crashes or natural disasters. The main impetus for the project resulting in this report was the largely untapped potential value drones offer DOTs as tools for routine transportation-related data collection.

Using drone technology for the collection of road transportation data offers many advantages over traditional methods, and such techniques have become feasible alternatives to them. Drone-based data collection is cheaper, safer, faster, and more accurate than manual data collection in most cases. It also allows for the collection of accurate data previously unavailable or difficult to obtain, such as for alternative intersections or complex geometries. Beyond the ease of acquisition, the form of data collected via drones (namely, videographic and photographic) enables analysts to take advantage of new suites of machine-vision software to quantitatively and systematically extract relevant traffic and geometric parameters. Many commercial machine-vision software services specializing in transportation are readily available on the market and can be used to transform drone-based traffic videos into useful data for evaluating road system performance and informing transportation systems management and planning decisions. Such services are affordable and offer quick and accurate data processing. Collecting data
with drones on a routine basis that can be so easily processed adds value for DOTs by creating a repository of historic transportation activity data that can be used to validate activity models, track system performance across time, and solve countless transportation-related problems.

A primary barrier to adoption of routine transportation data collection by drone identified by the team is the general lack of guidance on how to use drones for such purposes and how to analyze the collected data effectively. While drones are used increasingly often in a growing number of ways, the technology is still relatively new and has not yet reached full maturity. While there is no single roadmap for implementing drone systems for data collection by DOTs, making available resources that accelerate the maturation of drone-based data collection tools and techniques would speed adoption by improving knowledge transfer between researchers and agencies.

GDOT and the STRIDE UTC have positioned themselves as leaders in the utilization of drone technology for solving transportation problems through the conduct of several innovative research projects utilizing drone data collection and analysis techniques (e.g., Gbologah et al. 2022; Rodgers et al. 2023). Leveraging the experience the research team gained through those projects, this project sought to accelerate knowledge transfer by producing documentation of innovative drone data collection and analysis methods in the form of standard operating procedures and by demonstrating the real-world value of those methods in a case study. This project also produced a series of training resources that can be used to train personnel, accelerating the integration and growth of innovative drone-based data collection activity.

This report documents the theory and techniques used to quantitatively extract traffic parameters from videos collected by drone, our experiences employing and refining those techniques through the execution of prior research projects, and the creation of open-source standard operating procedures, training videos, and training datasets based on those experiences. Chapters 1, 2, and 3 provide background on the current state of drone-based data collection techniques and their advantages over traditional data collection approaches. Chapter 4 outlines the design and findings of a survey of state DOTs in the Southeast region conducted to contextualize transportation-related drone usage in the region. Chapter 5 details a real-world case study using videos collected by drones to validate truck turning templates, intended to demonstrate one example of how drone-based data collection and the associated innovative analytical tools and techniques can be applied to solve transportation problems. Lastly, the appendices contain standard operating procedures for using drones to collect and analyze transportation data. The standard operating procedures, training videos, and other supporting information can be accessed at https://sites.gatech.edu/drones/.

We hope that the contents of this report serve to accelerate the maturation of drone techniques and bring their use into routine service to improve the accuracy and availability of transportation-related data, revolutionize analytical processes and possibilities, and thereby bring substantial value to agencies in the Southeast and nationwide.

CHAPTER 1. INTRODUCTION
PROJECT BACKGROUND

Many state departments of transportation (DOTs) and practitioners have embraced the emergence of unmanned aerial vehicle (UAV, or drone) technology in recent years. A 2019 AASHTO survey found that 36 out of 50 state DOTs were funding centers or programs to operate drones (AASHTO 2019). Using drones for field measurements offers many advantages over traditional methods of data collection and, thanks to their enhanced maneuverability, allows for the creation of novel streams of photographic or video data, making drone technology valuable for a wide variety of applications in transportation systems management.

The unique operational attributes of drone technology offer advantages over traditional methods of transportation data collection, especially considering contemporary trends in transportation systems design. Roundabouts and other innovative intersections, such as Diverging Diamond Interchanges (DDI), Restricted Crossing U-Turn Intersections (R-CUT), Continuous-Flow Intersections (CFI), Quadrant Intersections, Continuous Green-T (Turbo-T) Intersections, and others, have become popular alternatives to traditional stop-controlled or signalized intersections due to their safety and operational advantages. However, operational analysis of these alternative intersections is typically more complex than for their traditional counterparts due to their more complex geometries. Manual collection of operational and safety-related parameters at alternative intersections poses many logistical challenges, resulting in increased costs, longer lead times, and greater safety risk. STRIDE and several state DOTs have undertaken various research projects to
improve tools and techniques for navigating these challenges, including using drone systems to collect high-resolution video data. Using drones to evaluate these intersections allows a field team to pilot the drone from a safe location far from active right-of-way, greatly improving worker safety. Collecting videos from drone-mounted cameras has been found to reduce the time and cost of data collection and to increase the accuracy of measurement of detailed operational and safety parameters of alternative intersections.

Despite their versatility, most drone usage occurs for specialized purposes like bridge or pavement inspections, crash investigations, and other site assessments, rather than on a more routine basis. And while state DOTs and researchers have acquired video drones and experimented with their use, most operational usages have focused on direct use of the video (e.g., visual examination of traffic flows and driver behavior) rather than the quantitative extraction of operational and safety parameters from the video stream using machine-vision or manual tools. Drone-based data collection, especially given the rapidly expanding low-cost availability of reliable and easy-to-use artificial intelligence and machine-vision tools, has immense potential as a routine method of traffic operations management and analysis. The technique is of sufficient maturity to enter routine service and make important contributions to transportation systems management and design, both by reducing the time and cost of collection and by increasing the accuracy of measurement of detailed operational and safety parameters of road network features.

As with many emerging technologies, routine drone-based data collection has been hampered by the absence of user-friendly analysis tools, standard procedures, and training resources. Many of the methods used by researchers for specialized applications must be documented and translated into a set of standard procedures that can
reliably produce quantitative measurements of traffic conditions (e.g., traffic volumes, individual vehicle speeds and trajectories, vehicle classification, distribution of following distances, frequency of turning movements and lane changes, etc.) for subsequent analysis. Making these resources, as well as a repository of open-source drone video data to be used as a training resource, available to the public will help accelerate knowledge transfer and reduce barriers to implementation of routine drone-based data collection techniques, so state DOTs can more easily reap the benefits that routine drone-based data collection can offer for transportation systems management.
Figure 1. Photo. Example of drone-based video analysis technique (Gbologah et al. 2022)
As leaders in the application of drone-based video techniques to the measurement of traffic-related parameters, GDOT and STRIDE are well-positioned to produce these resources by leveraging their experience and drawing from the approaches previously used by the research team for investigation of roundabouts. This report outlines the development process for the necessary tools, procedures, and training that will allow state
DOTs to effectively use high-resolution drone video collection and machine-vision data reduction as a part of normal data collection and analysis activities.

CHAPTER 2. LITERATURE REVIEW
Drones represent a highly versatile technology with many valuable applications in transportation systems engineering and management. While the most common use of drone technology is infrastructure condition monitoring and inspection, there are many examples of drones being used to extract quantitative traffic, safety, and geometric parameters.

Previous GDOT and STRIDE research projects have utilized drones to evaluate roundabouts. Gbologah et al. (2022) collected observational data using a camera-equipped drone at twelve roundabouts in the Atlanta metropolitan area and established an analytical framework using commercial machine-vision analysis to extract vehicle trajectories, spacings, and conflicts to help inform a predictive model showing how critical headways and driver behavior (e.g., gap acceptance) were affected by geometric, environmental, and operational factors (Gbologah et al. 2022). Rodgers et al. (2023) used aerial photography conducted by drone to measure illuminance at 77 roundabouts across Georgia to help establish the relationship between illumination levels and nighttime safety at rural and suburban roundabouts (Rodgers et al. 2023). Other studies have demonstrated effective use of drones for traffic data collection at roundabouts (Khan et al. 2018; Brahimi et al. 2020).

Drones have also been used for active transportation planning. Kim (2020) developed a method for using camera-equipped drones to record videos of active transportation infrastructure and for extracting parameters, such as pedestrian and bicycle counts, from the videos into a spatiotemporal dataset (Kim 2020). Another study used drone-based video data collection techniques to collect pedestrian flow characteristics and data on
pedestrian activity in a pedestrian shopping mall network in Thailand (Sutheerakul et al. 2017).

Drone videos have also been used to detect and evaluate traffic congestion. Utomo et al. (2020) used a deep-learning system to detect and classify vehicles and traffic density from live drone video streams (Utomo et al. 2020). Kumar et al. (2021) defined a novel Internet of Vehicles system for controlling a traffic surveillance drone swarm and managing large amounts of drone-based video data in a way that allows transportation system managers to quickly extract the information they need in real time to divert traffic and manage flow in an intelligent transportation system (Kumar et al. 2021).

On a larger scale, UAS have been used to create big datasets of multimodal urban networks. The New Era of Urban traffic Monitoring with Aerial footage (pNEUMA) experiment utilized a swarm of ten drones hovering in coordination over the central business district of Athens, Greece, to record streams of traffic across a 1.3 km2 area with a road network of more than 100 km-lanes and 100 intersections during morning peak hours for a five-day work week. The experiment recorded and analyzed over half a million trajectories, and the dataset was made freely available as part of an open science initiative to encourage subsequent analyses of the data and generate insights into congestion and travel behavior. This big-data approach to drone data collection yielded what is likely the most detailed and comprehensive urban multimodal field dataset for traffic microsimulation research, and it will be available to aid in the development and validation of future models (Barmpounakis 2020).

Other ways drones have been used include monitoring transportation-related air pollution by measuring particulate concentrations adjacent to roadways, measuring the effects of
transportation activity on urban land surface temperatures using thermal imagery, and serving as part of an Internet of Things cyber-physical framework for priority evacuation of highly panicked or distressed individuals during disaster evacuations (Villa et al. 2017; Naughton & McDonald 2019; Sahil et al. 2022). There are also emerging concepts of using surveillance drones as mobile infrastructure elements that wirelessly communicate with drivers and their vehicles in interconnected intelligent transportation systems, conveying critical information to travelers, such as crashes and congestion, and suggesting or enforcing detours as appropriate (Hadiwardoyo et al. 2018). This wide swath of examples shows that drones are quickly becoming important tools for solving many transportation-related problems.

CHAPTER 3. PROJECT APPROACH
Since this project aims to expand the range of applications of the high-resolution drone video collection method among transportation agencies, the research team first conducted a data needs survey among selected state DOTs within the southeast region to understand the current progress of their drone program development and to determine their most desired data products. Based on the collected survey results, the research team then standardized and generalized the drone data collection and analysis approaches used in previous projects for the extraction of operational and safety-related parameters, and developed a set of corresponding standard operating procedures that can be used by existing technical personnel with minimal additional training.

To test the applicability of the developed drone data analysis tools, additional nighttime images and traffic operation videos were collected by drones from various types of innovative intersections in consultation with GDOT personnel, including restricted crossing U-turn intersections (R-CUT), diverging diamond interchanges (DDI), etc. A case study, which attempts to validate current truck turning templates through analyzing video data collected by drones at a truck stop, was also carried out to demonstrate a non-traditional application of drone data collection. Finally, in-person training will be conducted at a research workshop hosted by the project team in cooperation with GDOT personnel.
DEVELOPMENT OF DRONE DATA ANALYSIS TOOLS AND PROCEDURES

Guided by the regional survey results, the research team leveraged the data analysis tools and procedures used for earlier STRIDE and/or GDOT research studies (e.g., GDOT RP
18-25 Evaluation of Factors Influencing Roundabout Performance, GDOT RP 19-11 Safety and Illumination of Rural and Suburban Roundabouts (Phase II), and similar studies) to develop more robust tools, methods, and procedures for drone data collection and for the analysis of traffic operational and safety parameters at innovative intersections. Six standard operating procedures (SOPs) were created to provide instructions on drone equipment charging, drone operation, drone camera calibration, field data collection, safety guidelines, and the drone data analysis process. When used in conjunction with commercially available computer-vision services (e.g., DataFromSky), these SOPs will facilitate the reduction and calibration of drone video data into standardized data products and formats that can be used for a variety of subsequent analyses. These tools also emphasize ease of use to encourage adoption by transportation and planning agencies, consultants, researchers, and students to meet a broad range of management, research, and educational needs.
SUPPLEMENTAL DATA COLLECTION AND PROCESSING

In consultation with GDOT, to develop and test the drone data collection and analysis tools, the research team collected video recordings of traffic operations using drones from 12 roundabouts and from other innovative intersections, including one restricted crossing U-turn intersection (R-CUT), one diverging diamond interchange (DDI), one single-point urban interchange (SPUI), and one displaced left-turn intersection (DLT) in Georgia. Among the selected roundabouts, two sites that were visited and studied in previous projects were used as control sites to determine whether current driver behavior had changed significantly since the earlier observations. In addition, the team also
collected nighttime images from five different roundabouts. The detailed information and visit dates for each site are provided in Appendix A. The field data collection was carried out by a two-member team, with both members holding a remote pilot certificate obtained from the Federal Aviation Administration (FAA) for drone operations. All collected video files were uploaded to the DataFromSky AERIAL™ platform for automatic object identification and trajectory extraction, and geo-registration was performed on the corresponding processed tracking log files for the convenience of potential users.

In addition, the data collection activities based on the developed SOPs were video recorded for demonstration purposes. The recorded videos include a narrative track describing details of each step involved in the drone operation and data collection process. Drone data processing and analysis tutorials were also produced to guide potential users in how to use the tools developed in this project.
CASE STUDY OF TRUCK TURNING TEMPLATE ANALYSIS

Since the survey results from GDOT suggested that many transportation agencies are interested in observing actual truck turning paths in the field from an overhead view to better accommodate large trucks within intersections, a case study was conducted to analyze truck turning movement data collected from a selected truck stop and its adjacent intersection area and to validate existing turning templates. All processes involved in the case study were documented, and the study results are discussed in Chapter 5. While most current usages of drone data among transportation agencies have focused on direct analyses such as bridge inspection, damage control, etc., this case
study also helped demonstrate a non-traditional application of drone video data collection.
REGIONAL WORKSHOP

The research team will conduct a two-day regional workshop on "Advances in Roundabouts and Innovative Intersections" in the Atlanta, GA, area as a part of GDOT's regular training program. The drone-focused training will focus on four major themes:
- A technical program describing the results and conclusions regarding recent advances in our knowledge of the design, operational characteristics, safety, and usage of Roundabouts and Innovative Intersections in the Southeast. These presentations will focus on research by STRIDE and state DOTs within the Southeast.
- A panel and peer-exchange program focusing on the experiences and challenges faced by state and local DOTs within the Southeast in the implementation and impacts of Roundabouts and Innovative Intersections within their jurisdictions.
- A demonstration program showing various tools and techniques developed by STRIDE and other researchers related to Roundabouts and Innovative Intersections.
- An optional one-day training program providing hands-on training for personnel in the use of the procedures and techniques developed in this study.
The major target audiences for participation in this workshop, besides GDOT personnel, will be technical staff from state and local DOTs, consultants, students, and researchers within the STRIDE region (Region IV). The technical program will also be designed to
be accessible to planners and decision makers, and will serve to familiarize participants with additional STRIDE research on Roundabouts and Innovative Intersections, including extensive opportunities for interactions between researchers, students, and a broad range of practitioners.

CHAPTER 4. REGIONAL SURVEY
SURVEY METHODS

To understand how drones are being used among state DOTs in the southeast region, phone interviews were conducted with state practitioners. Interviewees held different roles and ranks depending on the state, but all were involved in some way with their state's transportation-related drone systems. Invitations to participate in the survey were sent to a variety of practitioners from the southeastern states (Georgia, Florida, Alabama, North Carolina, South Carolina, Tennessee, and Kentucky). Interviews were successfully conducted by telephone with representatives from Alabama, Tennessee, and Kentucky. The responses collected helped contextualize regional drone activity and highlighted certain trends in experiences, implementation challenges, and future plans. The interviews were highly conversational, with the goal of discovering answers to the following questions:
- How are drones being used by your state's DOT today?
- What challenges has your department encountered in implementing your state's drone program, and how have you navigated those challenges?
- How else would you use drones if you had the tools and resources to do so?
- What do you anticipate your department will be using drones for in the future?

Figure 2. Map. Responses to regional drone usage survey of southeastern states

As expected, the most common usage of drones in the region is infrastructure and construction site surveys. All states indicated the use of drones for such purposes, with all but Tennessee using their own drones (Tennessee respondents indicated most drone needs are contracted out). The responses collected from the survey indicated that the southeastern states have drone programs in various stages of maturity. In the following sections, the findings of the survey for each state are summarized.

Table 1. Survey respondents

State  Respondent(s)                Role                                                          Date of survey
AL     JD D'Arville                 UAS Program Administrator, ALDOT                              6/2/2023
TN     Isaac Pratt; Sydney Fuhring  Planning Specialists, Long Range Planning Division, TNDOT     6/6/2023
KY     Justin Wilson                UAS Subject Matter Expert; District Traffic Engineer, KYDOT   6/13/2023

ALABAMA

Of the responding states, Alabama has the most developed and mature drone program. In 2014, the state government commissioned a UAS task force and instructed Alabama DOT to create a statewide plan for using drones to assist transportation engineering and construction, aerial photogrammetry, and public utility works. Since then, the department has been built from the ground up, and the state DOT has produced and refined concise, finely tuned standard operating procedures based on nearly a decade of experience.

Current drone activity in Alabama, besides surveys, includes site monitoring, alignment data collection, and other photogrammetry. Alabama also uses tethered drone systems to monitor traffic conditions at peak hours, and live video feeds from drones are used to inform signal switching to optimize traffic flow. Rather than collecting traffic data from which to quantitatively extract operational parameters, most drone-based traffic video in Alabama appears to be used for surveillance and for real-time operational decision making.

Despite the lack of drone usage for routine collection of transportation-related data, Alabama DOT has been an early adopter of machine-vision technology for processing aerial photographs taken during site inspections. Analysts are assisted by commercial artificial intelligence packages that georegister photographs and process them into useful formats.

Given the state's mature drone department, Alabama has successfully navigated many barriers during the implementation and growth of its drone program. A significant challenge cited by the respondent was developing standard operating
procedures into a concise set of guidelines of appropriate and effective size and detail. The respondent stressed the importance of quality standard operating procedures for drone-based data collection in making techniques and tools more accessible to practitioners with minimal additional training. Having standard procedures also helps Alabama stay abreast of evolving federal regulations by setting clear guidelines for drone operation. With time and experience, Alabama DOT has pared its standard procedures down to what it believes is a nearly optimal size and scope, and it hopes that the standardization of drone procedures will help accelerate the state into more routine use of drones across more applications. Other challenges mentioned by the respondent include navigating privacy concerns, which can be an ethical and political barrier to expanded drone activity moving forward.

In the future, Alabama expects growth in drone usage for traffic surveillance. Drone technology advancements resulting in longer flight times and greater ranges will increase possible coverage and feasible flight patterns, which will in turn enhance the value of investments in drones. Future developments in artificial intelligence capabilities will also enable faster and more detailed processing of drone videographic and photographic data. Finally, Alabama expects to acquire drone-based LiDAR equipment in the near future to enhance the capabilities of its aerial mapping operations.
KENTUCKY

Kentucky's drone program is newer but still conducts substantial drone activity in the state. The respondent indicated that most of the work they do with drones is surveying construction sites periodically to generate a stream of project-level data showing project
progress over time. The primary activities of Kentucky's drone program are photogrammetry and aerial mapping, in addition to bridge and pavement condition surveys. While they have conducted traffic counts using camera-equipped drones, this did not seem to be a common occurrence. Kentucky does not yet have standard procedures for its drone program, but its first set of standard procedures is currently being developed through a project at the University of Kentucky. Interestingly, the respondent indicated that the data collected with drones is almost entirely photographic as opposed to videographic, as most drone usage in the state is aerial mapping.

Kentucky is in the midst of implementing a digital delivery initiative, streamlining project deliverables and making everything cloud-based. As part of this, the department uses commercially available cloud-based drone mapping software in its analytical workflow to process aerial photographs. The software allows analysts to extract roadway characteristics, pavement condition, and alignment data from photographs taken by a camera-equipped drone.

The respondent listed many challenges encountered by the department during the development of its drone program and the use of drones for data collection. Kentucky DOT lacks tethered drone systems, and flight time and distance constraints have been significant challenges for its survey work. A unique challenge mentioned by the respondent was figuring out how to transform aerial data into a format that people are familiar and comfortable enough with for it to be informative and helpful for other analysts and engineers. Most staff have been familiar only with two-dimensional plans for their entire careers, so getting people to transition and learn how to take advantage of multidimensional data with frequent cross sections and other metadata made possible by
photogrammetry and LiDAR has been challenging for the state. The respondent suggested that staff needed to be retrained in many cases on how to use the new data streams and formats effectively. Additionally, Kentucky has anticipated and begun to encounter data management challenges: a large barrier to the implementation and expansion of drone data collection was the development of a system for storing the millions of high-resolution photographs taken by drones.

The most immediate evolution of drone technology expected in Kentucky is the acquisition of LiDAR equipment to enhance its aerial photography capabilities. However, this may be some time away, as the existing photogrammetry equipment has worked well and was an expensive investment by the state. The respondent suggested it may be a while before funding can be made available for new drone technology. As the technology becomes available, the department expects to use drones for more traffic operations management applications in the future, but this has not yet been a priority in the state.
TENNESSEE

The respondents we were able to interview from Tennessee were unaware of any in-house drone activity in the state and indicated that the state contracts out all its drone work. However, other resources indicate that the state does use drone systems for several applications. The Tennessee Critical Incident Response Team (CIRT) uses drones for crash site investigations (TNDHS n.d.). Tennessee DOT also utilizes drones for routine infrastructure surveys, as well as for structural surveys to assess damage to transportation infrastructure after natural disasters (Li et al. 2022). However, we were unable to find any documented examples of drone usage for transportation systems management or the
collection of traffic-related data. The respondents indicated that the collection of transportation data is conducted manually across the state by on-road vehicles equipped with camera systems, indicating a large opportunity for expanded drone-based data collection techniques.

Challenges encountered by the state in its development and implementation of drone systems can only be speculated upon. A recent bill introduced in the state legislature seeks to limit the manufacturers of drones that state agencies can purchase and operate, which has thrust the immediate future of drone usage in Tennessee into some degree of uncertainty (O'Brien 2023). If agencies were required to replace their drones with approved makes, it could be prohibitively expensive and have a negative impact on the growth of drone-based data collection techniques.
REGIONAL TRENDS AND NEEDS

The responses to the survey yield two general observations about the state of drone usage in the southeast region. First, southeastern states are at different stages of implementation and maturity in their transportation-related drone programs. The priorities of each state DOT will vary as a product of local conditions, and there is no standard roadmap for drone program development. However, access to a set of open-source standard procedures, serving as an example that can be mimicked or adapted to the needs of a specific state, would be a valuable resource that can help accelerate program implementation.

Second, as is supported by the literature, traffic-related applications of drone-based data collection techniques generally lag behind other drone usage, such as condition
surveys and site assessments. There seem to be fewer examples of routine drone-based data collection activity than expected. Incorporating drone-based data collection into routine traffic operations management represents an opportunity for state DOTs that we believe is being undervalued.

CHAPTER 5. CASE STUDY: TRUCK TURNING TEMPLATE ANALYSIS
While most roadway facilities are currently designed following guidelines provided by existing manuals developed around design vehicles, the recommended dimensions sometimes do not match actual field observations and often tend to overaccommodate large trucks. Therefore, this case study attempts to analyze truck turning movement data collected from a selected truck stop and its adjacent intersection area to validate existing turning templates. Additionally, because drones were used for the truck turning movement data collection, this case study also demonstrates one potential application of the drone-based video data collection method in the quantitative extraction of operational and safety parameters from video streams.
DATA COLLECTION
Site Selection

Since this study focuses on analyzing truck turning templates through field observation, potential sites should have relatively high truck volumes to increase data collection efficiency. Within the city of Atlanta, the research team was able to identify two truck stops as potential data collection sites. However, because one site was later found to be located within a flight restriction zone, the Pilot™ travel center located next to the intersection of Interstate 285 and Bouldercrest Road (Figure 3) was selected for the truck turning movement observations.

Figure 3. Photo. The location of the selected truck stop (Google Maps)
Field Data Collection

To capture accurate truck turning movements within the study area, a Zenmuse X5S camera attached to a DJI Inspire™ 2 quadcopter drone was employed for data collection due to its high image quality and its ability to provide a top-down view that largely eliminates image distortion.

Before departure, the research team first checked whether any flight restrictions were in place at the selected truck stop location on the scheduled data collection date using the B4UFLY™ app. Recommended by the FAA, this app uses interactive maps to inform recreational users about the places where they can and cannot fly their drones. An example of the flight restriction information for the selected truck stop is shown in Figure 4, compared with the flight restriction shown for another truck stop that lies within controlled airspace.

Figure 4. Photo. Flight restriction information of the two selected truck stops shown in the B4UFLY™ application
If the selected area happens to be in controlled airspace or a temporary no-fly zone, drone pilots can also try submitting requests to obtain airspace authorization using the Air Control app. For certain locations, the request process can take up to 2 weeks, so drone pilots should always check flight restriction information at least 2 weeks before the scheduled data collection date. In addition, the research team also checked the local weather forecast to ensure the data collection would be performed
under good weather conditions with no rain, snow, or fog, temperatures within the range of -4 to 104 °F, and wind speeds below 15 MPH. On the field visit day, the research team also made sure that all the drone's Intelligent Flight Batteries and the remote controller were fully charged and that there was enough storage space on the drone's SD card. Specific details regarding the charging process for the drone equipment can be found in Appendix B.

After arriving at the selected truck stop, the research team parked its vehicle in a nearby parking lot where the pilot could always keep a clear line of sight to both the drone and the surveyed area while minimizing any distraction to drivers within the study area. After making sure that the drone equipment was in good condition and the surrounding environment was clear of overhead power lines and tree branches, the drone was flown to four different locations within the truck stop to capture different truck turning templates. The overhead view of the four selected locations is shown in Figure 5.
24


Figure 5. Photo. Overhead views of the four data collection locations at the selected truck stop
During flight, the drone's position and camera settings should remain unchanged once the drone arrives at its destination, to increase the precision of automatic video analysis. Due to FAA regulations, the maximum height of the drone was limited to an altitude of 390-395 feet above ground level (AGL). This elevation allowed the turning movements of all truck types to be captured within the camera frame. For each location, two videos of approximately 10 minutes' duration were taken to record traffic operations within the truck stop and nearby intersections. Throughout the data collection process, all team members wore safety vests and avoided entering any of the traffic lanes.
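The flight altitude interacts directly with frame coverage. As a rough plausibility check (not taken from the report), the ground width visible to a nadir-pointing camera can be estimated from the flight altitude and the lens field of view; the field-of-view value below is a hypothetical placeholder, so actual flight planning should use the real camera specifications.

```python
# Back-of-the-envelope estimate of ground coverage for a nadir-pointing camera:
# footprint width = 2 * h * tan(FOV / 2). The FOV value is a placeholder,
# not the actual Zenmuse X5S lens specification.
import math

def ground_footprint_ft(altitude_ft: float, fov_deg: float) -> float:
    """Width of ground (feet) spanned by the camera frame at a given altitude."""
    return 2 * altitude_ft * math.tan(math.radians(fov_deg) / 2)

altitude = 390   # flight altitude used in the case study, feet AGL
fov = 72         # hypothetical horizontal field of view, degrees
print(f"frame covers ~{ground_footprint_ft(altitude, fov):.0f} ft of ground")
# ~567 ft across, wide enough to keep an entire turning movement in frame
```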
DATA PROCESSING

In total, 8 videos were recorded at the selected truck stop; they were first uploaded to the DataFromSky AERIAL™ platform for automatic object identification and trajectory extraction. The processed videos were returned in the form of tracking log files tied to the
original video data files; these tracking log files contain information about the traffic analysis scenes and the detected and/or annotated vehicle trajectory data. Since objects in the original tracking log files are measured in units relative to the spatial resolution of individual pixels, geo-registration was required for each file to convert the initial coordinate system into the WGS-84 coordinate system. This step was performed in the DFS Viewer software based on the known latitude and longitude of at least four reference points selected within the video frame.

By visually checking each truck's movements within the obtained video files, 34 trucks were identified as having made turns in the study area during the data collection period, including 17 WB-67 trucks, 14 WB-62 trucks, and 3 WB-40 trucks. Although the DFS Viewer software provides functions to automatically extract vehicle trajectory data at every frame (0.03 seconds), the trajectory is calculated from the center point of each vehicle's bounding box, which is identified when the vehicle appears in the video frame. As a vehicle gets closer to the edges of the video frame, the deviation between the identified and actual vehicle center points grows, producing errors in the truck turning template analysis. Therefore, since the truck turning path is bounded by the traces of both the outside front wheel and the inside rear wheel, the coordinates of each truck's outside front wheel and inside rear wheel were manually extracted every 20 frames (approximately 0.67 seconds) during each turn to increase calculation accuracy. The 'show position' function in the DFS Viewer software was used to obtain the geolocation of the truck wheels at different time stamps. Figure 6 shows one example of the traces formed by the labeled locations of a truck's outer front wheel and inner rear wheel at every 20 frames during a right-turn movement.

Figure 6. Photo. Labeled locations of the truck's outer front wheel and inner rear wheel during a right turn in an intersection next to the truck stop
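To illustrate the idea behind the geo-registration step described in this section, the sketch below fits a least-squares affine mapping from pixel coordinates to WGS-84 longitude/latitude using four reference points and applies it to a tracked point. This is a simplified stand-in for what DFS Viewer does internally (a near-nadir view is approximately affine), and all coordinates shown are hypothetical.

```python
# Simplified geo-registration sketch: fit an affine pixel-to-WGS-84 mapping
# from four reference points, then apply it to tracked pixel locations.
# All coordinates are hypothetical; DFS Viewer performs its own registration.
import numpy as np

# Pixel locations (u, v) of four reference points picked in the video frame
px = np.array([[120, 80], [1790, 95], [1810, 990], [140, 1005]], dtype=float)
# Known WGS-84 (longitude, latitude) coordinates of the same four points
geo = np.array([[-84.2471, 33.6905], [-84.2458, 33.6906],
                [-84.2457, 33.6898], [-84.2470, 33.6897]])

# Solve [u, v, 1] @ A ~ [lon, lat] in the least-squares sense (A is 3x2)
A, *_ = np.linalg.lstsq(np.hstack([px, np.ones((4, 1))]), geo, rcond=None)

def pixel_to_wgs84(u: float, v: float) -> np.ndarray:
    """Map a pixel coordinate to (longitude, latitude) via the fitted affine."""
    return np.array([u, v, 1.0]) @ A

print(pixel_to_wgs84(960, 540))  # e.g., a truck wheel near the frame center
```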
TURNING TEMPLATE ANALYSIS

For the turning template analysis, it is assumed that both the truck's outer front wheel and its inner rear wheel follow circular arcs during the turn. Based on the extracted coordinates of the truck wheels, a least-squares fitting algorithm was applied to find the best-fitting circle representing the path of the outer front wheel by minimizing the sum of squared distances from the fitted curve to the input data points. The 'scipy.optimize.leastsq' function in Python was used to execute the least-squares algorithm, and both the circle center and the radius were obtained from the output. Using the definition provided in the AASHTO "Green Book," the radius of the circle fitted to each truck's outer front wheel path was taken as the minimum turning radius of that truck. The corresponding fitted circle center was then used with the truck's
inner rear wheel coordinates to calculate the inside turning radius. The distributions of the calculated minimum turning radii and inside turning radii are presented in Figure 7. While the majority of the minimum turning radii are within the range of 42 to 72 feet, 6 trucks were observed to have turning radii smaller than the design vehicle values provided by the AASHTO Green Book (40 feet for WB-40 and 45 feet for WB-62/67).
Figure 7. Chart. The distribution of minimum turning radii and inside turning radii
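A minimal sketch of the fitting step described above follows, assuming the extracted wheel coordinates have already been projected into feet in a local planar frame; the sample points are illustrative, not measurements from the study.

```python
# Least-squares circle fit of the kind described in the text, using
# scipy.optimize.leastsq. Sample wheel positions are illustrative only.
import numpy as np
from scipy import optimize

def fit_circle(x, y):
    """Find the center (a, b) that minimizes the spread of point-to-center
    distances; the mean distance is then the fitted radius."""
    def residuals(c):
        d = np.hypot(x - c[0], y - c[1])  # distance of each point to the center
        return d - d.mean()               # deviation from the mean radius
    center, _ = optimize.leastsq(residuals, x0=(x.mean(), y.mean()))
    radius = np.hypot(x - center[0], y - center[1]).mean()
    return center, radius

# Outer front wheel positions sampled every 20 frames (illustrative, in feet)
x_out = np.array([0.0, 10.2, 19.8, 28.1, 34.6, 38.9])
y_out = np.array([55.0, 52.4, 46.3, 37.2, 25.8, 12.9])
center, r_min = fit_circle(x_out, y_out)  # r_min: minimum turning radius

# Inside turning radius: reuse the fitted center with inner rear wheel points
x_in = np.array([18.5, 24.0, 28.3, 31.2])
y_in = np.array([38.0, 33.5, 27.6, 20.8])
r_inside = np.hypot(x_in - center[0], y_in - center[1]).mean()

print(f"minimum turning radius: {r_min:.1f} ft, inside radius: {r_inside:.1f} ft")
```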
Among all the truck turning movements analyzed from the collected video files, 7 trucks were observed to have made a left turn, with an average minimum turning radius of 53.15 feet, while the other 27 trucks, which made right turns, were found to have an average minimum turning radius of 58.62 feet. A potential reason for this difference is that, because drivers' seats are located on the left side of trucks in the U.S., drivers tend to have a better line of sight during a left-turn maneuver and are therefore more willing to make tighter turns than during a right-turn one.

Besides turning direction, truck wheelbase can also influence the minimum turning radius. In general, trucks with a larger wheelbase tend to have less maneuverability and longer distances between the pivot point and the front wheels when making a turn, which
would require a larger turning radius. However, in this study, compared to the average minimum turning radii of WB-62 trucks (60.91 feet) and WB-40 trucks (59.11 feet), WB-67 trucks were found to have an average minimum turning radius of 52.61 feet, smaller than either of the other types. Since WB-67 trucks had the greatest number of observed left turns, other factors like steering angles, tire characteristics, and differing driving behaviors within the truck stop could also influence the minimum turning radii. In addition, as the width of WB-40 or WB-62/67 trucks is usually between 8 and 8.5 feet, the turning radius of the centerline of the truck's front axle can also be obtained by subtracting approximately 4 feet (half the vehicle width) from the calculated minimum turning radius.
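The summary statistics above reduce to simple group means plus the half-width adjustment; the sketch below mirrors that arithmetic with placeholder radii rather than the study's measured values.

```python
# Group means of fitted minimum turning radii by turn direction, plus the
# front-axle centerline adjustment. The per-truck radii are placeholders.
from statistics import mean

# (truck type, turn direction, fitted minimum turning radius in feet)
turns = [("WB-67", "left", 51.0), ("WB-67", "right", 54.2),
         ("WB-62", "right", 60.9), ("WB-40", "right", 59.1)]

for direction in ("left", "right"):
    radii = [r for _, d, r in turns if d == direction]
    print(f"{direction}-turn average minimum radius: {mean(radii):.2f} ft")

# Centerline radius: subtract half a typical 8-ft truck width from r_min
r_min = 52.61
print(f"front-axle centerline radius: {r_min - 4:.2f} ft")
```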
STUDY CONCLUSION

In this case study, the turning movements of different types of trucks were observed, and the corresponding trajectory data were extracted at every 20 video frames (0.67 seconds) to validate existing turning templates within a selected truck stop and its adjacent intersection area. Results indicated that, for trucks with left-hand steering, drivers tend to make tighter turns with smaller minimum turning radii during left-turn movements. While most trucks were found to have larger observed minimum turning radii than the design vehicle values provided in the AASHTO Green Book, three WB-62 and three WB-67 trucks were observed to have smaller minimum turning radii, indicating the greater maneuverability and stability available in modern trucks.

This case study was conducted in conjunction with Research Project 22-29 to demonstrate one non-traditional application of the drone-based video data collection method. While
currently many transportation agencies have acquired and experimented with drones, most of their usage has focused on the direct analysis of images and videos, such as bridge inspections or traffic counts, rather than the quantitative extraction of operational and safety parameters. This case study, along with the two other previous studies mentioned in this report, has demonstrated the benefits of high-quality and efficient data collection using drones, as well as the feasibility of incorporating drone usage into routine data collection processes for other potential applications.
CHAPTER 6. CONCLUSIONS AND RECOMMENDATIONS
PROJECT SUMMARY
With the advancement of drone data collection and computer-vision techniques, both inexpensive high-resolution video drones and commercial services that offer automatic drone video analysis have become accessible to most transportation agencies. Despite the great potential of using drone video data for the collection of traffic operational and safety-related parameters, the absence of user-friendly tools, standard procedures, and training materials remains an important barrier to implementation among state DOTs and other transportation agencies. This project therefore aimed to provide the necessary tools, procedures, and training to allow state DOTs to effectively use high-resolution drone video collection and machine-vision data reduction as part of normal data collection activities. The research team first conducted a data needs survey among selected state DOTs within the southeast region. A set of standard operating procedures was then developed to document recommended practices for drone data collection and analysis and to help interested users reliably produce quantitative measurements of traffic operations from drone data. To test the feasibility of the developed analysis tools, additional nighttime images and traffic operation videos were collected from various types of innovative intersections in consultation with GDOT personnel, and training videos illustrating the data collection procedures were created. Additionally, the research team conducted a case study at a selected truck stop, collecting and analyzing different types of trucks' turning movements to validate current truck turning templates. This non-traditional application demonstrated the benefits of high-quality and efficient data collection using drones, as well as the feasibility of other potential applications. Finally, in-person training will be conducted at a research workshop hosted by the project team in cooperation with GDOT personnel. Presentations from this workshop, along with the other project materials, will be posted to the project's online portal (https://sites.gatech.edu/drones/).
DRONE VIDEO DATABASE DEVELOPMENT
Through previous STRIDE and/or GDOT research studies (e.g., GDOT RP18-25, Evaluation of Factors Influencing Roundabout Performance), the project team collected video data of vehicle operations from 12 roundabouts and nighttime images from another 80 roundabouts in Georgia. In this project, additional video and image data were captured from 19 roundabouts and 4 other types of intersections to test the developed analysis tools and calibrate previous modeling results. These recently captured data were then geo-coded and combined with the previously obtained dataset to form a database of vehicle operational videos covering both roundabouts and selected innovative intersections. From this data resource, information concerning driver behavior, driving environment, intersection geometric characteristics, and related factors can be extracted through video data processing. With the extracted information, various types of analyses can be performed without additional cost or the need to conduct separate field data collection activities for each study purpose.
If the drone-based data collection technique enters routine service, and the obtained drone video/image data are georeferenced and carefully archived, these collected sample data could also be built into a repository of traffic conditions from different time periods for the benefit of future studies, such as before-and-after comparison studies.
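As an illustration, such a geo-coded clip repository could be as simple as one table keyed by location and time. The sketch below is a hypothetical schema, not the project's actual database; field names and the sample file path are assumptions:

import sqlite3

# Hypothetical catalog of geo-coded drone video clips (illustrative only).
conn = sqlite3.connect("drone_video_catalog.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS clips (
    clip_id      INTEGER PRIMARY KEY,
    site_name    TEXT,   -- e.g., 'Houze Rd @ Hembree Rd'
    intersection TEXT,   -- 'Roundabout', 'DDI', 'CFI', ...
    latitude     REAL,   -- decimal degrees
    longitude    REAL,
    recorded_at  TEXT,   -- ISO 8601 timestamp
    duration_s   REAL,
    altitude_ft  REAL,   -- flight altitude AGL
    file_path    TEXT
)""")
conn.execute(
    "INSERT INTO clips (site_name, intersection, latitude, longitude, "
    "recorded_at, duration_s, altitude_ft, file_path) VALUES (?,?,?,?,?,?,?,?)",
    ("Houze Rd @ Hembree Rd", "Roundabout", 34.061266, -84.346205,
     "2023-03-15T10:30:00", 1420.0, 392.0, "videos/site01_0315.mp4"))
conn.commit()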
ROUTINE USAGE OF DRONE DATA COLLECTION APPROACH
The regional survey of state DOTs showed that while many transportation agencies have made progress on their own unmanned aircraft system programs, most have focused on direct information extraction from captured videos, such as bridge inspection or damage assessment, rather than the quantitative extraction of operational and safety parameters. Since previous studies and practices have demonstrated numerous advantages of drone usage for high-quality data collection, as well as the potential of other non-traditional applications, incorporating drones into routine data collection activities can make important contributions to transportation system management and design by both reducing the time and cost of data collection and increasing the accuracy of measured operational and safety parameters at intersections.
APPENDIX A. INNOVATIVE INTERSECTIONS USED FOR DATA COLLECTION

INTERSECTION ID: #1
Road Names: Houze Rd @ Hembree Rd
Intersection Type: Roundabout
Geolocation: 34.061266, -84.346205
Opening Year: 2017
Number of Legs: 4
Diameter of Central Island (ft): 100
Inscribed Circle Diameter (ft): 150
Presence of Truck Apron: Yes

INTERSECTION ID: #2
Road Names: John Ward Rd SW @ Cheatham Hill Rd
Intersection Type: Roundabout
Geolocation: 33.93735, -84.6063
Opening Year: 2016
Number of Legs: 3
Diameter of Central Island (ft): 118
Inscribed Circle Diameter (ft): 148
Presence of Truck Apron: Yes

INTERSECTION ID: #3
Road Names: SR 155 @ Fairview Rd
Intersection Type: Roundabout
Geolocation: 33.610931, -84.164819
Opening Year: 2013
Number of Legs: 4
Diameter of Central Island (ft): 115
Inscribed Circle Diameter (ft): 150
Presence of Truck Apron: Yes

INTERSECTION ID: #4
Road Names: The Prado NE @ Montgomery Ferry Dr NE
Intersection Type: Roundabout
Geolocation: 33.794783, -84.379147
Opening Year: 2014-2017
Number of Legs: 3
Diameter of Central Island (ft): 95
Inscribed Circle Diameter (ft): 130
Presence of Truck Apron: No

INTERSECTION ID: #5
Road Names: Mayfield Rd @ Bates Rd
Intersection Type: Roundabout
Geolocation: 34.09215, -84.31308
Opening Year: 2015
Number of Legs: 4
Diameter of Central Island (ft): 50
Inscribed Circle Diameter (ft): 80
Presence of Truck Apron: Yes

INTERSECTION ID: #6
Road Names: McClure Bridge Rd @ Irvindale Rd NW
Intersection Type: Roundabout
Geolocation: 34.006621, -84.151016
Opening Year: 2012
Number of Legs: 4
Diameter of Central Island (ft): 120
Inscribed Circle Diameter (ft): 170
Presence of Truck Apron: Yes

INTERSECTION ID: #7
Road Names: Mayfield Rd @ Bethany Rd
Intersection Type: Roundabout
Geolocation: 34.09602, -84.326279
Opening Year: 2020
Number of Legs: 4
Diameter of Central Island (ft): 80
Inscribed Circle Diameter (ft): 115
Presence of Truck Apron: Yes

INTERSECTION ID: #8
Road Names: Hembree Rd NE @ Meadow Dr
Intersection Type: Roundabout
Geolocation: 34.022867, -84.450548
Opening Year: 2017
Number of Legs: 4
Diameter of Central Island (ft): 115
Inscribed Circle Diameter (ft): 150
Presence of Truck Apron: Yes

INTERSECTION ID: #9
Road Names: Post Oak Tritt Rd @ Hembree Rd NE
Intersection Type: Roundabout
Geolocation: 34.012416, -84.459491
Opening Year: 2020
Number of Legs: 3
Diameter of Central Island (ft): 80
Inscribed Circle Diameter (ft): 115
Presence of Truck Apron: Yes

INTERSECTION ID: #10
Road Names: Klondike Rd @ Rockland Rd
Intersection Type: Roundabout
Geolocation: 33.675972, -84.114861
Opening Year: 2009
Number of Legs: 4
Diameter of Central Island (ft): 100
Inscribed Circle Diameter (ft): 135
Presence of Truck Apron: Yes

INTERSECTION ID: #11
Road Names: Holly Springs @ Davis Rd
Intersection Type: Roundabout
Geolocation: 34.026693, -84.468304
Opening Year: 2013
Number of Legs: 4
Diameter of Central Island (ft): 85
Inscribed Circle Diameter (ft): 115
Presence of Truck Apron: Yes

INTERSECTION ID: #12
Road Names: Hopewell Rd @ AC Smith Rd
Intersection Type: Roundabout
Geolocation: 34.323761, -84.0732
Opening Year: 2012
Number of Legs: 4
Diameter of Central Island (ft): 55
Inscribed Circle Diameter (ft): 90
Presence of Truck Apron: Yes

INTERSECTION ID: #13
Road Names: SR 53 @ SR 400
Intersection Type: Continuous-flow intersection (CFI)
Geolocation: 34.363371, -84.036329
Opening Year: 2017

INTERSECTION ID: #14
Road Names: Pleasant Hill Rd @ Interstate 85
Intersection Type: Diverging Diamond Interchange (DDI)
Geolocation: 33.952258, -84.129957
Opening Year: 2013

INTERSECTION ID: #15
Road Names: Lenox Rd @ SR 400
Intersection Type: Single-point urban interchange (SPUI)
Geolocation: 33.851916, -84.369757
Opening Year: 2017

INTERSECTION ID: #16
Road Names: SR 20 @ Simpson Mill Rd
Intersection Type: Restricted Crossing U-Turn intersection (R-CUT)
Geolocation: 33.405933, -84.215544
Opening Year: NA

INTERSECTION ID: #17
Road Names: Main St @ Haney Rd
Intersection Type: Roundabout
Geolocation: 34.108472, -84.5175
Opening Year: 2010
Number of Legs: 3
Diameter of Central Island (ft): 70
Inscribed Circle Diameter (ft): 105
Presence of Truck Apron: Yes

INTERSECTION ID: #18
Road Names: Bell Rd @ Boles Rd
Intersection Type: Roundabout
Geolocation: 34.039392, -84.156036
Opening Year: 2016
Number of Legs: 3
Diameter of Central Island (ft): 85
Inscribed Circle Diameter (ft): 125
Presence of Truck Apron: Yes

INTERSECTION ID: #19
Road Names: Arnold Rd @ Hutchins Rd
Intersection Type: Roundabout
Geolocation: 33.902222, -84.050556
Opening Year: 2007
Number of Legs: 3
Diameter of Central Island (ft): 100
Inscribed Circle Diameter (ft): 140
Presence of Truck Apron: Yes

INTERSECTION ID: #20
Road Names: Indian Shoals Rd @ Masters Rd
Intersection Type: Roundabout
Geolocation: 33.914775, -83.843114
Opening Year: 2011-2013
Number of Legs: 4
Diameter of Central Island (ft): 105
Inscribed Circle Diameter (ft): 140
Presence of Truck Apron: Yes

APPENDIX B. STANDARD OPERATING PROCEDURES FOR CHARGING DRONE BATTERIES AND REMOTE CONTROLLER
I. General
This standard operating procedure (SOP) provides instructions on how to charge the intelligent flight batteries and remote controller of the DJI Inspire 2 drone. The drone has a dual-battery design that requires a set of two batteries during each flight. Each pair of batteries can provide a maximum of 25 minutes of flight time for the drone with the Zenmuse X5S camera. It is necessary to charge the batteries after each flight to ensure they have enough power before the next use. The batteries and remote controller can be charged using an Inspire 2/Ronin 2 180 W battery charger and a battery charging hub. A DJI battery station is also available for the same battery-charging purpose, but it is not discussed in this document.
II. Charging Equipment
A. DJI Inspire 2/Ronin 2 180 W Battery Charger
The DJI Inspire 2 180 W battery charger, as shown in Figure, is used to charge the Inspire 2 remote controller and intelligent flight batteries. It is manufactured by DJI, and the official sales price is $119 before tax. The input voltage requirement for the charger is 100-240 V ~ 50/60 Hz 2.9 A, and the output voltage is 26.1 V with a maximum current of 6.9 A.
Figure 8. Photo. DJI Inspire 2/Ronin 2 180 W Battery Charger
The charger requires a separate AC cable; the compatible model is the 180 W Power Adaptor AC Cable, as shown in Figure. The cable can be purchased from the DJI official store, and the price is $11 before tax. The charger can be used directly for charging the remote controller, but to charge the flight batteries, an additional charging hub is required.
Figure 9. Photo. 180 W Power Adaptor AC Cable
B. DJI Inspire 2 Intelligent Flight Battery Charging Hub
The DJI Inspire 2 Intelligent Flight Battery Charging Hub is designed for use with the Inspire 2 Battery Charger. It is available in the official store at a cost of $129 before tax. As demonstrated in Figure, four flight batteries can be placed in the charging hub at once, but a maximum of two batteries will be charged at a time. The charging hub charges batteries in descending order of battery power level, and if batteries are paired, the pair with more stored power is charged first. The Micro USB port is used for firmware updates.
Figure 10. Photo. DJI Inspire 2 Intelligent Flight Battery Charging Hub
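The pair-ordering rule described above can be sketched as follows (assumed logic for illustration only; the hub's actual firmware behavior may differ):

batteries = [
    {"slot": 1, "pair": "A", "charge_pct": 40},
    {"slot": 2, "pair": "A", "charge_pct": 42},
    {"slot": 3, "pair": "B", "charge_pct": 65},
    {"slot": 4, "pair": "B", "charge_pct": 60},
]
# Group batteries by pair and total their stored power.
pair_power = {}
for b in batteries:
    pair_power[b["pair"]] = pair_power.get(b["pair"], 0) + b["charge_pct"]
# The pair with more stored power is charged first, two batteries at a time.
charge_order = sorted(pair_power, key=pair_power.get, reverse=True)
print("Charging order by pair:", charge_order)  # ['B', 'A']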
III. Charging Procedures
A. Charging the Intelligent Flight Battery
The compatible battery for the DJI Inspire 2 is the TB50 Intelligent Flight Battery. The battery type is LiPo 6S, with a net weight of 515 g. The front and back of one battery are shown in Figure. Each battery has a capacity of 4280 mAh and a voltage of 22.8 V. The batteries can only be charged with an appropriate DJI-approved charger, and they will stop charging if high amperage (more than 10 A) is detected. The maximum charging power is 180 W, and the charging temperature should be within the range of 41 to 113 °F (5 to 45 °C). When fully charged, each pair of batteries can provide a maximum of 25 minutes of flight time under ideal flight conditions. The operating temperature should be within the range of 14 to 104 °F (-10 to 40 °C).
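From these specifications, a rough lower bound on charge time follows directly. The back-of-envelope sketch below assumes constant-power charging and ignores charge taper and conversion losses, so real charge times run longer:

capacity_ah = 4.280       # 4280 mAh per battery
voltage_v = 22.8          # nominal pack voltage
energy_wh = capacity_ah * voltage_v        # ~97.6 Wh per battery
pair_energy_wh = 2 * energy_wh             # ~195 Wh per battery pair
min_hours = pair_energy_wh / 180.0         # ~1.1 h at the 180 W maximum
print(f"{energy_wh:.1f} Wh per battery; >= {min_hours:.2f} h to charge a pair")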
Figure 11. Photo. DJI TB50 Intelligent Flight Battery
Step 1: Remove the batteries from the drone
To remove the batteries from the drone, first press the battery removal button located at the top of the drone (marked in red in Figure) and the batteries will be released. Then slide the batteries off the drone, as demonstrated in Figure.
Figure 12. Photo. Removing the batteries from the drone
Step 2: Connect to a power source
Connect the DJI Inspire 2 180 W battery charger to a power outlet (100-240 V, 50/60 Hz) using the AC cable, then uncover the rubber cover on the power port ([1] in Figure) located at the top of the charging hub, and connect the charging hub to the battery charger, as demonstrated in Figure.
Figure 13. Diagram. Connecting to a power outlet
Step 3: Connect batteries with the charging hub
Press the cover release button ([5] in Figure) and open the corresponding charging port cover ([3] in Figure). Align the grooves on the Intelligent Flight Battery with the battery slot tracks, and insert the batteries into the charging port to begin charging. This process is illustrated in Figure. It is recommended that each battery pair be charged and discharged together to prolong its service life and provide a better flight experience.
Figure 14. Diagram. Charging Hub
Figure 15. Diagram. Connecting the Batteries with the Charging Hub
The charging progress can be monitored through the status LEDs located at the top of the charging hub ([6] in Figure) and the battery level indicators located on one side of each battery (LED 1-4 in Figure). For the charging hub status LEDs, if the charging operation is correct, users should see a blinking green light for the batteries that are being charged and a solid yellow light for the batteries that are ready to be charged. Table describes the detailed meaning of each possible LED pattern.
Table 2. Descriptions of the Charging Hub Status LEDs
In addition to the charging hub status LEDs, the LEDs located on one side of each battery can also display the current battery level when being charged. The battery levels corresponding to each LED pattern are listed in Table .
Figure 16. Diagram. Intelligent Flight Battery Diagram
Table 3. Descriptions of the Battery Level Indicators while Charging
Flight batteries are designed to have smart charge-discharge functionality to prevent battery damage. When a charging error is detected, battery level indicators can also inform users of the exact issue through different blinking patterns.
Table shows battery protection mechanisms and corresponding LED patterns.

Table 4. Descriptions of the Battery Level Indicators for Battery Protection
Step 4: Finish charging and remove batteries from the charging hub
It takes approximately 1.5 hours to fully charge an Inspire 2 Intelligent Flight Battery. If the batteries and the remote controller are being charged at the same time, the battery charging time may be longer. When the batteries are fully charged, the charging hub status LED as well as all the battery level indicators will display a solid green light, as shown in Figure, and charging will stop automatically.
Figure 17. Photo. Charging Hub Status LEDs Display a Solid Green Light
Additionally, the buzzer located at the bottom of the charging hub will begin beeping when charging is complete. It will beep quickly when a battery pair is fully charged, and if all four batteries are fully charged, the buzzer will switch to an alternating pattern of two short beeps and one long beep. This beeping pattern will last for about 1 hour to remind users to remove the batteries. Once charging is finished, simply press the battery release button ([5] in Figure) again, remove the batteries, and close the corresponding charging port cover. Then disconnect the charger from the charging hub and replace the rubber cover on the power port ([1] in Figure).
B. Inspire 2 Remote Controller Charging
The Inspire 2 remote controller is powered by a 2S rechargeable battery with a capacity of 6000 mAh. The battery level is indicated by the battery level LEDs on the front panel of the remote controller, as shown in Figure. The Inspire 2 remote controller can be charged by connecting it to an Inspire 2 Intelligent Flight Battery charger.
Figure 18. Photo. Inspire 2 Remote Controller
Step 1: Check the battery level of the remote controller
When the remote controller is powered off, press the power button located on the front panel of the remote controller once, and the battery level LEDs will display the current battery level.

Figure 19. Diagram. Status LEDs on the Front Panel of Inspire 2 Remote Controller
Step 2: Connect the remote controller with the charger
Similar to battery charging, first connect the DJI Inspire 2 180 W battery charger to a power outlet (100-240 V, 50/60 Hz) using the AC cable, then uncover the rubber cover on the charging port located on one side of the remote controller. Connect the remote controller with the battery charger using cable B, as demonstrated in Figure. Users should see the battery level indicators blinking to display the charging progress.
Figure 20. Diagram. Connecting the Remote Controller with the Battery Charger
Step 3: Finish charging and disconnect the remote controller
The remote controller takes approximately 3 hours to fully charge. If the charger is charging both the batteries and the remote controller together, the charging time may be longer. When charging is complete, all the battery level LEDs will display a solid white light. Disconnect the remote controller from the charger and replace the rubber charging port cover on the remote controller. To power off the remote controller, simply press and hold the Power Button until all the LEDs turn off.
Table 5. Procedure Checklist of Charging Drone Batteries and Remote Controller

General Information: Project Number / Date / Time / Operator / Comments

Required Equipment (Check):
Charging Equipment: DJI Inspire 2/Ronin 2 180 W Battery Charger; 180 W Power Adaptor AC Cable; DJI Inspire 2 Intelligent Flight Battery Charging Hub

Charging Procedures for TB50 Intelligent Flight Battery (Check):
Charging Environment: Ensure the charging temperature is within the range of 41 to 113 °F (5 to 45 °C); ensure the charging power does not exceed 180 W
Battery Removal: Remove the batteries from the drone by pressing the battery removal button
Connect the charging hub to a power source: Connect the DJI Inspire 2 180 W battery charger to a power outlet (100-240 V, 50/60 Hz) using the AC cable; uncover the rubber cover on the charging hub power port and connect the charging hub to the battery charger
Connect batteries with the charging hub: Press the cover release button and open the corresponding charging port cover; align the grooves on the Intelligent Flight Battery with the battery slot tracks; insert the batteries into the charging port to begin charging
Monitor the charging process: Monitor the charging process through the charging hub's status LEDs and battery level indicators (the detailed meaning of different patterns can be found in the SOP document)
Finish charging: Ensure the charging hub status LED and all the battery level indicators display a solid green light; press the battery release button to remove the batteries and close the corresponding charging port cover; disconnect the charger from the charging hub and replace the rubber cover on the power port

Charging Procedures for Inspire 2 Remote Controller (Check):
Check the battery level: Press the power button located on the front panel of the remote controller once, and the battery level LEDs will display the current battery level
Connect the remote controller with the charger: Connect the DJI Inspire 2 180 W battery charger to a power outlet (100-240 V, 50/60 Hz) using the AC cable; uncover the rubber cover on the charging port located on one side of the remote controller; connect the remote controller with the battery charger using a different cable
Finish charging: Ensure all the battery level LEDs display a solid white light; disconnect the remote controller from the charger and replace the rubber charging port cover on the remote controller; press and hold the Power Button until all the LEDs turn off to power off the remote controller

Operator Signature: ________

APPENDIX C. STANDARD OPERATING PROCEDURES FOR OPERATING THE DJI INSPIRETM 2 DRONE
I. General
This standard operating procedure is based on the user manual provided by DJI for operating the DJI INSPIRE™ 2 drone. The INSPIRE™ 2 is a filmmaking drone that integrates an HD video transmission system and a 360° rotating gimbal. The drone weighs 7.58 lbs (3,440 g, including two batteries, without gimbal and camera), and the maximum takeoff weight is 9.37 lbs (4,250 g). The camera unit is independent from the image processor, and a Zenmuse X5S™ camera is used to capture RAW videos for the data collection task. The drone has a dual-battery system that prolongs the flight time to a maximum of 25 minutes, and its operating temperature should be within the range of -4 to 104 °F (-20 to 40 °C). Figure 28 shows the front and overhead views of a DJI INSPIRE™ 2 drone.
Figure 28. Photo. DJI INSPIRETM 2 Drone
Figure 29. Diagram. DJI INSPIRE™ 2 Drone
II. Operating Procedures
The drone operating procedures can be separated into three stages: pre-flight, in-flight, and post-flight, and during each stage the drone has a corresponding flight mode. In the pre-flight stage, the drone is switched from travel mode to landing mode; after takeoff, the drone transforms into flight mode; and once the drone has landed, it is switched back to the initial travel mode.
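As a rough illustration of this mode cycle (an assumed simplification for clarity, not DJI's firmware logic):

from enum import Enum

class Mode(Enum):
    TRAVEL = "travel"    # default when powered off; gear raised for transport
    LANDING = "landing"  # landing gear lowered; ready for takeoff or landing
    FLIGHT = "flight"    # airborne

# The cycle described above: travel -> landing (pre-flight), landing ->
# flight (takeoff), flight -> landing (touchdown), landing -> travel (pack up).
NEXT = {
    (Mode.TRAVEL, "unlock"): Mode.LANDING,
    (Mode.LANDING, "takeoff"): Mode.FLIGHT,
    (Mode.FLIGHT, "land"): Mode.LANDING,
    (Mode.LANDING, "pack"): Mode.TRAVEL,
}

mode = Mode.TRAVEL
for event in ("unlock", "takeoff", "land", "pack"):
    mode = NEXT[(mode, event)]
print(mode)  # Mode.TRAVEL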
Figure 210. Diagram. Different Forms of Drone during Different Flight Modes
This section provides instructions and diagrams for the procedures involved in each stage. Users are also encouraged to watch the tutorial videos available at www.dji.com or in the DJI GO™ 4 app, and to refer to the user manual provided by the manufacturer for further details and for special situations not discussed in this document. The user manual can be downloaded at https://www.dji.com/downloads/products/inspire-2.
A. Pre-flight Preparation
1. Download the DJI GO™ 4 App
Operating the Inspire 2 requires the DJI GO™ 4 app or another app compatible with DJI aircraft during flight. If the drone is not connected to the app, the flight is restricted to a height of 98 ft (30 m) and a distance of 164 ft (50 m) for safety reasons. The DJI GO™ 4 app can be found in the App Store or Google Play Store and downloaded to a mobile device. Figure 211 shows the icon and user interface of the app.
Figure 211. Diagram. User Interface of the DJI GO™ 4 App
2. Install Batteries and Check the Battery Level
Step 1: Press the power button located on one side of the battery once, then press again and hold for 2 seconds to power it on. The power LED will turn red, and the battery level indicators will display the current battery level. If the battery level is low, it is recommended to charge the battery to full capacity before flight. The battery charging procedures can be found in the standard operating procedure 1 document.
Step 2: If the battery pair is fully charged, insert the batteries into the battery slots located at the end of the drone. Paired batteries are encouraged for better performance. Note that the battery slot on the right-hand side should be used when only one battery is supplying power.
Figure 212. Diagram. Insert Batteries into the Drone Battery Slots

3. Attach the 1550T Quick Release Propellers
Step 1: Pair the propellers and motors with arrows of the same color (red or white).
Step 2: To attach a propeller to its motor, press down the spring pad, then hold it and rotate the propeller lock until the arrows are aligned and a click is heard.
Step 3: Check the propellers and ensure they are in good condition and installed correctly.
Figure 213. Diagram. Steps to Attach the 1550T Quick Release Propellers
4. Insert Micro SD Card if Necessary
Since the drone is used for video/image data collection, it is necessary to equip the drone with a Micro SD card to store the captured photos and videos. The Inspire 2™ comes with a 16 GB Micro SD card and supports up to a 128 GB Micro SD card. A UHS-3 type Micro SD card is recommended, because the fast read and write capability of these cards enables users to store high-resolution video data. The Micro SD card can be inserted into the camera Micro SD card slot ([8] in Figure 29), shown in Figure 214, before powering on the Inspire 2. The inserted card enables still capture and video recording.
Figure 214. Diagram. Insert Micro SD Card into the Slot
5. Unlock Travel Mode
The drone defaults to travel mode when it is powered off. For takeoff, the drone must first be switched to landing mode, as follows:
Step 1: Place the drone in an open area on flat ground.
Step 2: Press the power button ([13] in Figure 29) a minimum of five times. The battery level indicators ([14] in Figure 29) will display a green light to show the battery level.
Figure 215. Diagram. Steps to Power on the Drone
Step 3: The drone will lower its landing gear, switch to Landing Mode, and power on automatically.
Figure 216. Diagram. Drone Changing from Travel Mode to Landing Mode
6. Mount the Zenmuse X5S™ Camera to the Gimbal
Step 1: Take the Zenmuse X5S™ camera out of the box and remove the lens cap ([7] in Figure). Find the gimbal cap and rotate it to remove it from the camera.
Figure 30. Diagram. Zenmuse X5S™ Camera
Step 2: Press and hold the gimbal and camera release button ([4] in Figure 29) on the Inspire 2. Rotate to remove the gimbal cap from the drone.
Step 3: Align the white dot on the gimbal of the camera with the red dot on the drone and insert the gimbal.
Step 4: Rotate the gimbal lock to the locked position by aligning the red dots on the gimbal and drone.
Figure 17. Diagram. Steps to Mount the Zenmuse X5S™ Camera to the Drone
The camera can only be mounted to the drone when the drone is in landing mode. When the camera is mounted correctly, the drone will perform an automatic calibration sequence by spinning the gimbal on its own.
7. Prepare the Remote Controller
Step 1: Adjust the mobile device holder to the desired position and adjust the antenna located at the back of the remote controller.
Step 2: Press the button on the side of the mobile device holder to release the clamp, adjust it to fit the size of the mobile device, and then attach the mobile device.
Step 3: Connect the mobile device to the remote controller with a USB cable. Plug one end of the cable into the mobile device and the other end into the USB port on the back of the remote controller.


Figure 18. Diagram. Steps to Prepare the Remote Controller

Step 4: Press and hold the Power Button to power on the remote controller, and a beep sound will be heard when it powers on. The LEDs on the front panel indicate the battery level of the remote controller.
Figure 19. Diagram. Power on the Remote Controller
Step 5: Launch the DJI GO™ 4 app on the mobile device. Enter camera view and then tap the "Linking Remote Controller" button, as shown below. When the remote controller is ready to link, the status indicator located on the front panel will blink blue with a beep sound.
Figure 20. Diagram. Link Remote Controller Window in the DJI GO™ 4 App

Step 6: Locate the Linking button on the drone, as shown in Figure 21. Press the Linking button to start linking and the remote controller status LEDs will blink green rapidly. When linking is completed, the status LEDs will display a solid green light.
Figure 21. Diagram. Location of the Linking Button on the Inspire 2™
8. Preflight Checklist
Once all the previous steps are completed, the drone will be ready to take off. Users are encouraged to use the following checklist to ensure every requirement is satisfied for a safe flight:
- The remote controller, Intelligent Flight Battery, and mobile device are fully charged.
- The propellers are mounted correctly and firmly.
- The Micro SD card has been inserted, if necessary.
- The gimbal is functioning normally.
- The motors can start and are functioning normally.
- The DJI GO™ 4 app is successfully connected to the drone.
- The sensors for the Obstacle Sensing System are clean.
B. In-flight Operation

Drone operation should avoid severe weather conditions (wind speeds exceeding 22 mph (10 m/s), snow, rain, and fog) and be conducted in open areas. To fly the drone, launch the DJI GO 4 app and tap the `GO FLY' button, as shown in Figure 211. Before taking off, ensure that the aircraft status bar in the upper left corner of the DJI GO 4 app indicates `Ready to Go (GPS)' (or `Ready to Go (Vision)' if flying indoors) without any warning messages.
Figure 22. Diagram. `READY TO GO' Status Bar in the DJI GO 4 App
1. Compass Calibration
The status bar in the app may indicate that the drone needs calibration before takeoff. In this case, simply follow the instructions provided in the app, which are also listed below. The calibration procedures should be carried out in an open area.
Step 1: Tap the aircraft status bar in the app and select the "Calibrate" option, then follow the on-screen instructions.
Step 2: Hold the drone horizontally and rotate it 360 degrees. The aircraft status indicators will display a solid green light.
Figure 23. Diagram. Rotate the Drone Horizontally
Step 3: Hold the drone vertically, with the battery side pointing upward, and rotate it 360 degrees around the center axis. Recalibrate the compass if the aircraft status indicator blinks red.
Figure 24. Diagram. Rotate the Drone Vertically
Step 4: If the aircraft status indicator blinks red and yellow after the calibration procedure, move the drone to a different location and try again until the app indicates the drone is ready to fly.
2. Takeoff
Step 1: Place the aircraft in an open, flat area with the battery level indicators facing towards the drone pilot.
Step 2: Launch the DJI GO 4 app and enter the Camera page.
Step 3: Wait until the aircraft indicators blink green, which means the Home Point is recorded and it is safe to fly. If the indicators flash yellow, the Home Point has not been recorded.
Step 4: For manual takeoff, push both the left and right sticks towards each other or away from each other to start the motors. Then slowly push the left stick forward to take off.
Figure 39. Diagram. Steps to Take off Manually
For automatic takeoff, first confirm that the surrounding environment is safe for flight, then use the Auto Takeoff function by tapping the `Auto Takeoff' button in the app and sliding the icon to confirm takeoff. The icon of the `Auto Takeoff' function in the app is shown in Figure 25. The drone will then take off and hover at 4 ft (1.2 m) above the ground.
Figure 25. Icon. `Auto Takeoff' Function Icon in the DJI GO 4 App
3. Flight Control
Once the drone takes off, users can fly it with the remote controller. The left stick controls the drone's elevation and heading, while the right stick controls the drone's forward, backward, and lateral movements. The detailed stick operations and functions are discussed below.
Change Drone's Elevation: Push the left stick forward to make the drone ascend and backward to make it descend. The further the stick is pushed from the center position, the faster the drone changes elevation.
Figure 26. Diagram. Left Stick Forward/Backward Movement to Change Drone Elevation
Rotate Drone: Push the stick to the left to rotate the aircraft counterclockwise, and push it to the right to rotate the aircraft clockwise. If the stick is centered, the drone keeps its current heading. The further the stick is pushed from the center position, the faster the drone rotates.
Figure 27. Diagram. Left Stick Left/Right Movement to Rotate Drone
Move Drone Forward/Backward: Push the stick forward to fly the drone forward and backward to fly the drone backward. The drone will hover in place if the stick is centered. Pushing the stick further from the center position gives a larger pitch angle and faster flight.
Figure 28. Diagram. Right Stick Forward/Backward Movement to Move Drone Forward/Backward
Move Drone to the Left/Right: Push the stick to the left to fly the drone left and to the right to fly the drone right. The drone will hover in place if the stick is centered. Pushing the stick further from the center position gives a larger pitch angle and faster flight.
Figure 29. Diagram. Right Stick Left/Right Movement to Move Drone to the Left/Right
Change Camera's Facing Direction: Turn the dial to the right to point the camera upwards and to the left to point it downwards. The camera will remain in its current position when the dial is static.
Figure 30. Diagram. Gimbal Dial to Change Camera's Facing Direction
Pause the Flight: Press the Intelligent Flight Pause button to pause the current task.
Figure 31. Diagram. Press the Pause Button to Pause the Flight

4. Image/Video Shooting
To collect image/video data during flight, users can either use the shutter and record buttons on the remote controller to shoot images and videos, or use the touch interface in the DJI GO 4 app to capture photos, record videos, and play them back. The available functions and operations for each method are listed below.
Remote Controller
Gimbal Dial ([1] in Figure 32): Turn the dial to control the tilt of the gimbal.
Video Recording Button ([2] in Figure 32): Press the button once to start recording video, then press it again to stop recording.
Shutter Button ([3] in Figure 32): Press the button to take a photo. If burst mode is activated, multiple photos will be taken with a continuous press. This function can also be used during video recording.
Camera Settings Dial ([5] in Figure 32): Turn the dial to adjust camera settings such as ISO, shutter speed, and aperture without letting go of the remote controller. Press down on the dial to toggle between these settings.
Figure 32. Diagram. Remote Controller Diagram
DJI GO 4 App
Taking Photos: Tap the shutter/record switch ([6] in Figure 33) to select the shutter. Tap the shutter/record button ([8] in Figure 33) to take photos. There are five shooting modes available: Single Shooting, Multiple Mode, AEB (Auto Exposure Bracketing), Timed Shot, and RAW Burst Mode. The default mode is Single Shooting, and the shooting mode can be changed via the DJI GO 4 app.
Recording: Tap the shutter/record switch ([6] in Figure 33) to activate video recording mode, then tap the shutter/record button ([8] in Figure 33) once to start recording, and tap it again to stop recording. The recording time will be displayed below the shutter/record button.
Change Camera Settings: Tap the Photography Configurations and Parameter Settings button ([10] in Figure 33) to set the exposure modes, ISO, shutter, photo styles, and auto exposure values of the camera.
Playback: Tap the playback button ([11] in Figure 33) to review captured photos and videos. Press the same button again to return to the camera view.
Figure 33. Diagram. Touch Interface of DJI GO 4 App
5. Landing
Auto Landing
The auto landing function can only be used when the Aircraft Status Indicator is blinking green. The landing process can be paused by tapping the cross button on the screen. The steps involved in the auto landing process are as follows:
Step 1: Let the drone hover over a level surface and ensure the landing conditions are ideal.
Step 2: Tap the Auto Landing button (shown in Figure 34) in the DJI GO 4 app and slide to confirm.
Figure 34. Icon. Auto Landing Button
Step 3: Landing Protection will be activated during auto landing, and it determines whether the ground is suitable for landing. If so, the drone will land gently; if not, the drone will hover and wait for pilot confirmation. If Landing Protection is inactive, the DJI GO 4 app will display a landing prompt when the drone descends below 0.7 meters. Tap to confirm, or pull down the control stick for 2 seconds, to land when the environment is appropriate for landing.
Step 4: The drone will land and turn off automatically.
Manual Landing
Step 1: Lower the drone landing gear by toggling the Return to Home (RTH) switch on the front panel of the remote controller down before landing. The drone will not be able to land if the landing gear is not lowered.
Figure 35. Icon. Auto Landing Button
Step 2: Push the left stick on the remote controller backward slowly until the drone lands on the ground. Keep holding the stick for a few seconds to stop the motors.
C. Post-flight Procedures
1. Dismount the Gimbal and Camera
The gimbal and camera should be removed before transforming the drone from Landing Mode to Travel Mode.
Step 1: Press down the gimbal detach button and rotate the gimbal lock at the same time to remove the gimbal and camera. The gimbal lock should be fully rotated when removing the gimbal to allow the next installation.
Step 2: Put the cover back onto the gimbal connector of the drone for protection.
Step 3: Put the lens cap and gimbal cap back onto the camera.
2. Power off the Drone
Press the power button ([13] in Figure 29) five times to transform the drone to Travel Mode. The battery level indicator lights should turn off.
3. Remove the Batteries
Press the Battery Removal button ([15] in Figure 29) to remove the batteries from the drone. If necessary, charge the used batteries in preparation for the next flight.
4. Detach the Propellers
For each propeller, press down the spring pad and rotate the propeller lock to remove it. Put all the propellers back in the drone box.
5. Power off the Remote Controller
Step 1: To power off the remote controller, simply press and hold the Power Button and wait until all the Battery Level LEDs have turned off. The drone batteries should be turned off before turning off the remote controller.
Step 2: Disconnect the mobile device from the remote controller by unplugging the USB cable.
Step 3: Press the button on the side of the mobile device holder to release the clamp and remove the mobile device. Return the mobile device holder to its original position and put the remote controller back into the drone box.
Table 6. Procedure Checklist of Operating the DJI INSPIRE™ 2 Drone

General Information: Project Number / Date / Time / Operator / Comments

Required Equipment (Check):
Drone Equipment: DJI INSPIRE™ 2 Drone; TB50 Intelligent Flight Batteries (1 set for each flight); DJI INSPIRE™ 2 Remote Controller; 1550T Quick Release Propellers (2 red and 2 white); Zenmuse X5S™ camera; a 16 GB Micro SD card; a mobile device with the DJI GO™ 4 app installed

Preflight Preparation (Check):
Download the DJI GO™ 4 app: Download and install the DJI GO™ 4 app from the App Store or Google Play Store on a mobile device
Insert Intelligent Flight Battery: Press the battery power button once, then press again and hold for 2 seconds; check the battery level indicators to ensure the batteries are fully charged; insert a set of paired batteries into the battery slots located at the end of the drone
Install 1550T Quick Release Propellers: Pair the propellers and motors with arrows of the same color (red or white); press down the spring pad, then hold it and rotate the propeller lock until the arrows are aligned and a click is heard; check the propellers and ensure they are in good condition and installed correctly
Insert Micro SD Card (if needed): Insert the Micro SD card into the camera Micro SD card slot located on one side of the drone before powering on the drone
Unlock Drone Travel Mode and Power on: Place the drone in an open area on flat ground; press the drone power button a minimum of five times to power on the drone and switch it from travel to landing mode
Mount the Zenmuse X5S™ Camera to the Gimbal: Remove the lens cap and gimbal cap from the camera; press and hold the gimbal and camera release button located on the bottom of the drone, then rotate to remove the gimbal cap from the drone; align the white dot on the gimbal of the camera with the red dot on the drone and insert the gimbal; rotate the gimbal lock to the locked position by aligning the red dots on the gimbal and drone
Link the Remote Controller to the drone: Put the mobile device in the mobile device holder of the remote controller; connect the mobile device to the remote controller with a USB cable; press and hold the Power Button to power on the remote controller (a beep will sound when it powers on); ensure the remote controller has enough power by checking the battery level indicator on its front panel; launch the DJI GO™ 4 app on the mobile device, enter camera view, and tap the "Linking Remote Controller" button; locate the Linking button on the drone and press it to start linking (the remote controller status LEDs will display a solid green light when linking is completed)

In-flight Operation (Check):
Flight Environment: Avoid controlled airspace; avoid severe weather conditions (wind speeds exceeding 22 mph (10 m/s), snow, rain, and fog); avoid large/special events
Take off: Place the aircraft in an open, flat area with the battery level indicators facing towards the drone pilot; launch the DJI GO 4 app and tap the `GO FLY' button, then enter the Camera page; ensure the aircraft indicators blink green, which means the Home Point is recorded and it is safe to fly; ensure that the aircraft status bar in the upper left corner of the DJI GO 4 app indicates `Ready to Go (GPS)' (or `Ready to Go (Vision)' if flying indoors) without any warning messages; tap the `Auto Takeoff' button in the app and slide the icon to confirm takeoff
Flight Control: Ensure the left stick on the remote controller can change the drone's elevation and heading; ensure the right stick on the remote controller can control the drone's forward, backward, and lateral movements; turn the dial located on the side of the remote controller to ensure it can change the camera's facing direction; use the shutter and record buttons on the remote controller or the touch interface in the DJI GO 4 app to capture photos or record videos
Landing: Let the drone hover over a level surface and ensure the landing conditions are ideal; tap the Auto Landing button in the DJI GO 4 app and slide to confirm; if a landing prompt appears due to activated Landing Protection, tap to confirm landing when the environment is appropriate

Post-flight Operation (Check):
Dismount the Gimbal and Camera: Press down the gimbal detach button and rotate the gimbal lock at the same time to remove the gimbal and camera; put the cover back onto the gimbal connector of the drone for protection; put the lens cap and gimbal cap back onto the camera
Power off the Drone: Press the drone power button five times to switch the drone back to Travel Mode
Remove Batteries: Press the Battery Removal button to remove the batteries from the drone
Detach Propellers: For each propeller, press down the spring pad and rotate the propeller lock to remove it, then put all the propellers back in the drone box
Power off the Remote Controller: Press and hold the Power Button and wait until all the Battery Level LEDs have turned off; disconnect the mobile device from the remote controller by unplugging the USB cable; remove the mobile device and put the remote controller back into the drone box

Operator Signature: ________

APPENDIX D. STANDARD OPERATING PROCEDURES FOR FIELD DATA COLLECTION
I. General
This standard operating procedure provides instructions and equipment lists for the field collection of operational, geometric, and illumination characteristics at innovative intersections. In addition to a regular field survey, commercially available drones are used to capture high-resolution videos to measure and calibrate important operational and safety-related parameters for selected intersections. This procedure can be used directly, or modified as needed, by transportation agencies and other practitioners to meet their specific data collection needs.
II. Field Survey Procedures
For the safety analysis discussed in this document, the safety influence area for each intersection is set to be 400 ft (122 m) upstream from the entry/exit point of each approach, and data should be collected within this area. The stop lines can be used to delineate exit and entry points, as shown in Figure 536. Where the safety influence areas of two adjacent survey intersections overlap, the edge of the influence area on the approach between the two intersections should be set at the halfway point. Intersection geometric characteristics such as the number of lanes, lane width, and angles between two adjacent approaches should be collected by field survey, and instructions for collecting each data type are provided below.
Figure 536. Drawing. Location of Entry and Exit points at Roundabouts and Conventional Intersections
A. Required Equipment
- Compass
- GPS device
- Traffic safety vest for every crew member
- Survey-crew-ahead safety signs
- Two traffic cones
- Metered wheel
- 25-foot tape measure
- Laser distance meter (Bosch GLM 50)
- Laser target card
- Laser enhancement glasses
B. Field Data Collection
1. Intersection Location
It is necessary to record the latitude and longitude of selected reference points within each surveyed intersection so that they can be geocoded in the analysis process. At least three reference points are required to locate one intersection, and these points should be within the intersection's safety influence area. The latitude and longitude values can be obtained from Google Maps and should be recorded as decimal degrees.
2. Lane Width Measurement
To avoid entering active travel lanes, a Bosch GLM 50 laser distance meter and a laser target card are used for lane width measurement. For data collection during daytime, a pair of laser enhancement glasses is also used to enhance the visibility of the red laser beam in sunlight. The lane width of each intersection approach should be measured within the safety influence area defined earlier.
Undivided Roadway Lane Width Measurement
Use the laser meter and the laser target card to measure the entire road width from one edge of the pavement to the other.
Step 1: One crew member holds the laser meter on one side of the road while another crew member holds the laser target card on the other side.
Step 2: The crew member with the laser meter beams the laser across the travel lanes towards the target card. Since the laser might damage human eyes, it is important that the crew member holding the target card does not look at the laser meter.
Step 3: Read and record the width of the road displayed on the meter screen. Then divide the measured distance by the number of lanes on that approach to obtain the width of each lane.
Divided Roadway Lane Width Measurement
For roadways with a wide median, use the laser meter and the laser target card to measure only the width of the incoming approach lanes.
Step 1: One crew member holds the laser meter at the edge of the pavement next to the shoulder while another crew member stands at a safe location inside the roadway median and holds the laser target card at the other edge of the pavement next to the median.
Step 2: The crew member with the laser meter beams the laser across the travel lanes towards the target card.
Step 3: Read and record the width of the road displayed on the meter screen. Then divide the measured distance by the number of incoming travel lanes on that approach to obtain the width of each lane.
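The per-lane arithmetic in Step 3 is simple enough to script when processing many survey forms. A minimal sketch follows; the readings below are hypothetical:

def lane_width_ft(measured_width_ft, n_lanes):
    # One laser-meter reading across the roadway divided by the lane count.
    return measured_width_ft / n_lanes

# Hypothetical readings: 36 ft across 3 undivided lanes; 24 ft across
# 2 incoming lanes on a divided approach.
print(lane_width_ft(36.0, 3))  # 12.0 ft per lane
print(lane_width_ft(24.0, 2))  # 12.0 ft per lane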
3. Intersection Geometric Layout
Step 1: Choose the basic intersection layout from Figure 537 based on the intersection type. The basic layouts provided are for 4-leg intersections; users can cross out or add legs depending on the actual number of legs the selected intersection has.
Figure 537. Drawing. Basic Layout of Conventional Intersection (Left) and Roundabout (Right)
Step 2: Indicate the true North direction with a North Arrow on the intersection layout.
Step 3: Assign directions to each intersection leg based on the direction of vehicles entering the intersection on that approach, and label the approach directions on the layout. For example, the Northbound (NB) approach is the one on which vehicles traveling towards the intersection are heading Northbound.
Step 4: Number the intersection legs clockwise starting from SB and record the name of each leg based on the corresponding street names.
Step 5: Observe and record the approach geometric characteristics based on the intersection type. For roundabout approaches, record the number of travel lanes, lane width, roadway functional class, the angle to the upstream approach, and the presence of a splitter island, roundabout-ahead warning sign, yield sign, speed limit sign, and horizontal curve. For conventional intersection approaches, record the number of travel lanes for each turning movement within the safety influence area, lane width, roadway functional class, posted speed limit, shoulder width, and the presence of a horizontal curve and intersection-ahead warning sign.
Step 6: If the selected intersection is a roundabout, also measure and record the inscribed circle diameter and the presence of a raised central island.
Step 7: Observe and record the presence of any abutting properties within the safety influence area, as well as the presence of any potential lighting source other than the purposely built streetlights (e.g., gas stations, stores, house decorations) within the intersection.
4. Intersection Safety Countermeasures
In addition to the geometric layouts, all the safety countermeasures employed within the intersection safety influence area should be recorded. These countermeasures include roadway medians, rumble strips, median barriers, transverse markings, roadside safety barriers, etc., which can be applied to all kinds of intersections. Additionally, for roundabouts, central island safety treatments such as truck aprons, reflective chevron signs, and plantings should be considered as well. For each countermeasure, the specific type, dimensions, and location should be recorded. Users can refer to Appendix 1, located at the end of this SOP, for typical examples of each safety countermeasure.
5. Intersection Pedestrian Facilities
To analyze pedestrians' impacts on intersection safety and traffic operations, the collected dataset should include the presence of pedestrian facilities within the safety influence area. Common pedestrian facilities include marked/raised/paved crosswalks, refuge islands, sidewalks, and pedestrian crossing signs. As with safety countermeasures, the detailed type, dimensions, and location of each pedestrian facility should be recorded.
6. Data Recording
Step 1: To record the observed intersection geometric characteristics, users can use the data reporting form provided in Figure. Figure 538 shows an example of the form filled out during one of the previous field surveys. Users are encouraged to add or delete the items listed in the form based on specific data collection needs.
Step 2: Users can scan the completed data forms along with the intersection geometric layout sketches to store the collected dataset, or extract the information from the forms and enter it into a spreadsheet within 24 hours of the field survey.
Figure 53. Form. Data Reporting

Figure 538. Form. Example of Filled Data Reporting
C. Safety Guidelines
It is important for survey crew members to wear a traffic safety vest at all times. The vest must be put on before surveyors set off from their base to the selected intersection site(s) and must be worn on top of all other clothing. The survey must be carried out by at least two surveyors; one can serve as a lookout to warn of potential hazards while the other does the main survey work. Crew members should not enter an active travel lane at any time, as no measurement requires crew members to be in an active travel lane.
III. Drone Data Collection Procedures

A. Required Equipment
- DJI Inspire™ 2 drone
- Intelligent flight batteries
- Gimbal and Zenmuse™ X5S camera
- Flashlight
- Metered wheel
- Traffic safety vest for every crew member
- Survey-crew-ahead safety signs
- Two traffic cones
B. Before Departure to the Field
- Ensure that there is no event near the selected intersection site that will generate large crowds, because drones must not be flown over a large crowd.
- Check for Notices to Airmen (NOTAMs) for the vicinity of the site and the local weather forecast. Field trips should proceed only under good weather conditions: no rain, snow, or fog; temperatures within the range of -4 to 104 °F; and wind speeds not exceeding 15 mph (for best results, wind speeds should not exceed 8 mph; these limits are encoded in the sketch following this list).
- Check the FAA's B4UFLY app to ensure that the survey location is not within restricted airspace and has not been designated as a temporary no-fly zone.
- Ensure that the remote controller and all the Intelligent Flight Batteries are fully charged.
- Ensure the DJI GO 4 app or DJI GS PRO app and the aircraft's firmware have been upgraded to the latest version.
- Ensure that the gimbal is detached from the drone during travel to and from the site.
- Ensure that no member of the team is under the influence of alcohol or drugs, or is fatigued or affected by emotional or psychological stress.
- Ensure that all crew members wear their traffic safety vests before going to the field.
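The weather limits above can be expressed as a simple go/no-go check. This is an illustrative sketch with assumed threshold logic; it does not replace pilot judgment or regulatory requirements:

def weather_go_no_go(temp_f, wind_mph, precipitation_or_fog):
    # Encodes the field-trip weather limits listed above: no rain/snow/fog,
    # -4 to 104 °F, wind at or below 15 mph (8 mph for best results).
    if precipitation_or_fog or not (-4 <= temp_f <= 104) or wind_mph > 15:
        return "NO-GO"
    return "GO (best results)" if wind_mph <= 8 else "GO (acceptable)"

print(weather_go_no_go(72, 6, False))   # GO (best results)
print(weather_go_no_go(72, 12, False))  # GO (acceptable)
print(weather_go_no_go(30, 18, False))  # NO-GO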
C. Preflight
Drive to the survey site and find a parking place located at a sufficient distance from the intersection site, but in a location such that the drone pilot always has a clear line of sight to both the drone and the surveyed intersection. This ensures that the field data collection process will not distract drivers or influence driver behavior. Field setup locations used in previous studies include adjacent parking lots, right-of-way reservations along the roads, and driveways of residential buildings (with the owners' permission). Figure 39 shows a setup location selected in a parking lot near the surveyed roundabout site.
Figure 39. Photo. Example of Setup Location Near a Roundabout
Ensure that all propellers are in good condition. Do not use aged, chipped, or broken propellers.

Check to ensure that the gimbal can rotate freely before powering it on.

Ensure that the lens cover of the camera is off, and the lens is clean and free of stains.

Ensure that the Micro SD card has enough available data space.

Ensure that the camera settings match the standard specifications for flight. The standard specifications are listed in Table .

Table 7. Zenmuse X5S Camera Standard Specifications for Flight

Standard Zenmuse X5S Specifications for Roundabout Video Recording
Camera Mode: Auto (400 ft)
Resolution: 20.8 megapixels
Exposure Value: Manual (take images at -3, -2, and +2)
ISO Setting: 3200
White Balance: Default
Auto Focus: Enabled
Shooting Mode: Single Shot (enabled)
Shutter Speed: Auto
Aperture: Manual (take images at 4 and 5)
Color Mode: Black and White

- Assemble the DJI InspireTM 2 drone by inserting the battery pair, attaching the propellers, mounting the gimbal and camera, and inserting the Micro SD card. Then power the drone on. The assembly process is discussed in detail in the `Procedures for Operating the DJI INSPIRETM 2 Drone' SOP.

- Prepare the remote controller by connecting it to the mobile device, launch the DJI GO 4 app on the mobile device, enter the camera view, and then tap the "Linking Remote Controller" button to connect the controller to the drone.
- Check the flight mode switch on the remote controller and ensure the drone is set to P-mode (Positioning).
- Rotate each propeller to ensure that it moves freely without touching any part of the drone. Be aware of the sharp edges of the propellers.
- Check that the propellers and motors are installed correctly and firmly before every flight. All crew members must stay clear of propellers and motors when they are spinning. The aircraft must only be touched by hand while it is powered off.
- Observe the surroundings and develop an emergency landing plan in case the drone cannot return to the takeoff point.
- Ensure that the drone is placed in a flat open area and that its takeoff and landing positions are clear of overhead power lines and/or tree branches.
- Ensure that Wi-Fi on any mobile device is turned off to avoid causing interference to the remote controller.
- Use a high-beam flashlight to inspect and ensure that the planned takeoff and landing paths do not cross any overhead power cables, which may otherwise not be very visible at night.
- Ensure the Aircraft Status Bar in the DJI GO 4 app indicates `Ready to Go (GPS)', or `Ready to Go (Vision)' if flying indoors. If so, the drone is ready to take off.
D. During Flight
The DJI InspireTM 2 drone equipped with the Zenmuse X5STM camera can be used to capture both image and video data to measure and calibrate geometric and operations-related parameters for different types of intersections. Because the required operation and flight time differ between the two types of data collection, the operating procedures for both image and video data collection are discussed below. Depending on the specific data collection needs, users can follow the corresponding procedures.

1. Video Data Collection Procedures
Intersection geometric characteristics (e.g., lane configuration, the presence of safety countermeasures, etc.) and operational data (e.g., vehicle trajectories, traveling speed, etc.) can be extracted from the high-resolution video data recorded by the drone.
Step 1: Scroll the Left Dial ([1] in Figure 32) of the remote controller to tilt the camera down by 90 degrees to obtain a top-down view of the intersection. The camera angle in degrees will be displayed on the Gimbal Slider ([9] in Figure 33) in the DJI GO 4 app.

Figure 56. Diagram. Remote Controller Diagram

Step 2: Fly the drone to the center of the intersection at an altitude of 390-395 ft above ground level (AGL), just shy of the maximum allowable height of 400 ft AGL, to ensure that the video can capture both the intersection center and a significant portion of each intersection approach.
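As a rough sanity check on coverage at this hover altitude, the ground footprint can be estimated from the camera's field of view. A minimal Python sketch, assuming a nominal 72-degree field of view for illustration (the actual value depends on the lens fitted to the X5S and is not specified in this report):

    import math

    def ground_footprint_ft(altitude_ft, fov_deg=72.0):
        # Approximate ground coverage (ft) across the field of view for a
        # camera pointed straight down (nadir) at the given altitude.
        # fov_deg = 72.0 is an assumed nominal value, not a report figure.
        return 2.0 * altitude_ft * math.tan(math.radians(fov_deg / 2.0))

    # At the recommended 390-395 ft AGL hover altitude:
    for alt in (390, 395):
        print(f"{alt} ft AGL -> ~{ground_footprint_ft(alt):.0f} ft of ground coverage")

Under this assumption, the recommended altitude covers roughly 570 ft of ground across the frame, which is why both the intersection center and a portion of each approach fit in view.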
Step 3: Set the camera to a minimum resolution of 1080P at 30 frames per second to ensure video quality in which traffic appears as a smooth rather than "glitchy" movement of vehicles. This setting is also required for the subsequent drone video data analysis. A typical high-resolution image taken by the drone camera is shown in Figure 57.

Figure 57. Photo. Example of High-resolution Image of One Roundabout Taken by the Drone Camera
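The Step 3 requirement can be verified on a downloaded clip before leaving the site. A minimal sketch using OpenCV, which is an assumed tool choice rather than one prescribed by this SOP; the file name is hypothetical:

    import cv2  # OpenCV, assumed available (pip install opencv-python)

    def meets_minimum_spec(video_path, min_w=1920, min_h=1080, min_fps=30.0):
        # Check that a recorded clip meets the 1080P / 30 fps minimum
        # required for the subsequent drone video data analysis.
        cap = cv2.VideoCapture(video_path)
        if not cap.isOpened():
            raise IOError(f"Cannot open {video_path}")
        w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
        h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
        fps = cap.get(cv2.CAP_PROP_FPS)
        cap.release()
        print(f"{video_path}: {w}x{h} @ {fps:.1f} fps")
        return w >= min_w and h >= min_h and fps >= min_fps

    # Example (hypothetical file name):
    # meets_minimum_spec("DJI_0001.MP4")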
Step 4: Record videos of vehicle operation within the intersections. Users can either press the Video Recording Button ([2] in Figure 32) on the remote controller once to start recording and press it again to stop, or tap the Shutter/Record Switch ([6] in Figure 33) to activate video recording mode and then tap the Shutter/Record button ([8] in Figure 33) once to start recording and again to stop. For each intersection, two video recordings of approximately 10 to 15 minutes each should be made. The video length is influenced by the drone's battery
usage rate and the battery power required for a safe landing. It is recommended to start landing procedures when the battery level drops to 25-30%.
Figure 58. Diagram. Touch Interface of DJI GO 4 App

Step 5: Tap the playback button ([11] in Figure 33) to review recorded videos. Press the same button again to return to the camera view. The recorded videos will be saved on the inserted Micro SD card.

2. Collect Image Data (Geometric and Illumination Characteristics Data Collection)
Intersection geometric characteristics (e.g., lane configuration, the presence of safety countermeasures, etc.) and illumination characteristics can be extracted from the high-resolution images captured by the drone.
Step 1: Scroll the Left Dial ([1] in Figure 32) of the remote controller to tilt the camera down by 90 degrees to obtain a top-down view of the intersection.
Step 2: Fly the drone to a desired altitude, based on the data collection needs, such that all the information of interest is included in the camera frame. Do not fly above the maximum allowable altitude of 400 ft AGL. The app will issue warnings when the drone altitude exceeds 395 ft AGL.
Step 3: Change the camera settings based on the required image quality by tapping the Photography Configurations and Parameter Settings button ([10] in Figure 33) to set the exposure mode, ISO, shutter, photo style, and auto exposure values of the camera.
Step 4: Take images of the selected intersection. Users can either press the Shutter Button ([3] in Figure 32) to take a photo (multiple photos will be taken with a continuous press if burst mode is activated), or tap the Shutter/Record Switch ([6] in Figure 33) to select shutter and then tap the Shutter/Record button ([8] in Figure 33) to take photos. There are five shooting modes available: Single Shooting, Multiple Mode, AEB (Auto Exposure Bracketing), Timed Shot, and RAW Burst Mode. The default mode is Single Shooting, and the shooting mode can be changed via the DJI GO 4 app.
Step 5: Tap the playback button ([11] in Figure 33) to review captured images. Press the same button again to return to the camera view. The captured images will be saved on the Micro SD card.

3. Safety Guidelines
- The altitude of the drone should never exceed 400 ft AGL.
- In case of an emergency landing or loss of power that causes a free-fall crash of the drone, crew members should not attempt to catch the drone, because the rotating propellers are dangerous and can cause significant harm.
- The pilot and observer(s) must maintain visual line of sight to the drone at all times.
- The pilot must not answer any incoming phone calls or use their mobile device while operating the drone.
- In the instance of a low battery warning or dangerous wind speed warning, land the drone immediately at a safe location.
- Do not remove the Micro SD card while the drone is powered on.

E. Post Flight
- The gimbal and camera should be removed before transforming the drone from Landing Mode to Travel Mode. Detach the gimbal from the drone and put both in secure travel mode before departing to base or to another measurement location.
- Power off the drone, and pick up the aircraft only after it is powered off.
- Power off the remote controller and disconnect the mobile device, then put all the equipment back into the drone box.
- Download the captured images or videos from the Micro SD card onto an external storage device or laptop.
- Do not connect the aircraft system to any USB interface that is older than version 2.0.
Table 8. Procedure Checklist of Field Data Collection

General Information
Project Number: ____  Date: ____  Time: ____  Recorder: ____
(Mark the Check column for each completed item; use the Comments column for notes.)

Required Equipment

Field Survey Equipment:
- A compass
- A GPS device
- Traffic safety vest for every survey crew member
- Survey-crew-ahead safety signs
- Two traffic cones
- A measuring wheel
- A 25-foot tape measure
- A laser distance meter (Bosch GLM 50) and laser target card

Drone Data Collection Equipment:
- DJI INSPIRETM 2 drone
- TB50 Intelligent Flight Batteries (1 set for each flight)
- DJI INSPIRETM 2 remote controller
- 1550T Quick Release Propellers (2 red and 2 white)
- Zenmuse X5STM camera
- A 16 GB Micro SD card
- A mobile device with DJI GOTM 4 app installed
- Traffic safety vest for every survey crew member
- Survey-crew-ahead safety signs
- Two traffic cones
- Flashlight (for nighttime data collection)

Field Survey

Record Intersection Location:
- Select at least three reference points within each intersection surveyed and record the latitude and longitude of each selected reference point

Measure Lane Width:
- Use a Bosch GLM 50 laser distance meter and a laser target card for lane width measurement

Record Intersection Geometric Layout:
- Choose the basic intersection layout from the provided list based on the intersection type
- Indicate the true north direction with a north arrow on the intersection layout
- Label the approach directions on the layout
- Number the intersection legs clockwise starting from SB and record the name of each leg based on the corresponding street names
- Observe and record the approach geometric characteristics in the data reporting form based on the intersection type

Record Intersection Safety Countermeasures:
- Observe and record the presence of any abutting properties within the safety influence area
- Record all the safety countermeasures and pedestrian facilities employed within the intersection safety influence area

Safety Guidelines:
- Wear a traffic safety vest on top of all other clothing at all times
- The survey crew should consist of at least two people
- Do not enter any active travel lanes at any time

Drone Data Collection

Before Departure:
- Ensure that there is no event near the selected intersection site that will generate large crowds
- Check Notices to Airmen (NOTAMs) for the vicinity of the site and the local weather forecast
- Check the FAA's B4UFLY app to ensure that the survey location is not within restricted airspace and has not been designated a temporary no-fly zone
- Ensure that the remote controller and all the Intelligent Flight Batteries are fully charged
- Ensure the DJI GO 4 app and the aircraft's firmware have been upgraded to the latest version
- Ensure that the gimbal is detached from the drone during travel to and from the site
- Ensure that no member of the team is under the influence of alcohol or drugs, or is fatigued or impaired by emotional or psychological stress
- Ensure that all crew members wear the traffic safety vests before going to the field

Drone Operation:
- Refer to APPENDIX C. STANDARD OPERATING PROCEDURES FOR OPERATING THE DJI INSPIRETM 2 DRONE

Image/Video Data Collection:
- Scroll the Left Dial of the remote controller to tilt the camera down by 90 degrees for a top-down view of the intersection
- Fly the drone to a desired altitude, based on the data collection needs, such that all the information of interest is included in the camera frame
- For video recordings, set the camera to a minimum resolution of 1080P at 30 frames per second to ensure video quality
- Capture images/videos of vehicle operation within the intersections by either pressing the Shutter Button / Video Recording Button on the remote controller, or tapping the Shutter/Record Switch in the app to switch between image/video mode and then tapping the Shutter/Record button to take a picture or start recording
- Tap the playback button to review captured images or videos
- Download the captured images or videos from the Micro SD card onto an external storage device or laptop

Safety Guidelines:
- The altitude of the drone should never exceed 400 ft AGL
- In case of an emergency landing or loss of power that causes a free-fall crash of the drone, crew members should not attempt to catch the drone because the rotating propellers can cause significant harm
- The pilot and observer(s) must maintain visual line of sight to the drone at all times
- The pilot must not answer any incoming phone calls or use their mobile device while operating the drone
- In the instance of a low battery warning or dangerous wind speed warning, land the drone immediately at a safe location
- Do not remove the Micro SD card while the drone is powered on

Operator Signature: ____________________

APPENDIX E. EXAMPLE OF TYPICAL ROADWAY ELEMENTS

All images in this appendix were captured from Google Street View.

Figure 59. Photo. Roadway Raised Median

Figure 60. Photo. Roadway Median Barrier (Left to right: Cable Barriers, Metal-beam Guardrails, Concrete Barriers)

Figure 61. Photo. Approach Splitter Island (Left to right: Raised Splitter Island without a Crosswalk, Raised Splitter Island with a Depressed Crosswalk)

Figure 62. Photo. Central Island (Left to right: Raised Central Island with Plants, Raised Central Island with Chevron Signs)

Figure 63. Photo. Pedestrian Crosswalk (Left to right: Marked Crosswalk, Textured Crosswalk)

Figure 64. Photo. Pedestrian Refuge Island

Figure 65. Photo. Transverse Lane Marking

Figure 66. Photo. Rumble Strips (Left to right: Centerline, Shoulder, Transverse Rumble Strips)

Figure 67. Photo. Intersection Ahead Signs (Left to right: 3-leg Intersection, 4-leg Intersection, Roundabout Ahead Sign)

APPENDIX F. STANDARD OPERATING PROCEDURES FOR DRONE CAMERA CALIBRATION
I. General
Commercially available drones equipped with cameras can be used to replace tripod-mounted cameras as a significant improvement to the photographic measurement protocol developed for the street lighting audit. It is worthwhile to establish standard procedures so that this technique can be implemented on a regular basis. Because of inherent differences in camera performance and settings, the camera must first be calibrated before field measurement to ensure accuracy. This standard operating procedure provides instructions on how to calibrate the Zenmuse X5STM camera mounted on the DJI InspireTM 2 drone for field measurement of illumination levels at intersections. The same procedure was previously used to calibrate a Canon EOS Rebel T3 camera, for which the measurement error was shown to be less than 4%.

II. Methodology Overview
To calibrate the Zenmuse X5STM camera, a set of monochromatic images will be taken at different levels of scene illumination, and pixel information will be extracted through image analysis and converted into scene luminance. A luminance meter and an illuminance meter will then be used to measure the scene illumination level to serve as a standard, and the measured results of the Zenmuse X5STM camera will be calibrated against this standard.

III. Required Equipment
The following equipment, at a minimum, is required to conduct this calibration procedure:
- The DJI InspireTM 2 drone with the Zenmuse X5STM camera attached
- Fully charged Intelligent Flight Batteries for the drone
- A 4 GB or greater SD card for storing captured scene images
- An Extech-HD450 or equivalent illuminance meter
- A Gossen Starlight2 or equivalent luminance meter
- An extra 9V battery for the illuminance meter
- A dimmable lamp

IV. Drone Camera Calibration Procedures

A. Identification of Scenes with Different Illumination Levels
Step 1: Find a room without any natural lighting and define a rectangular area on a flat surface using tape. This ensures that the pixel information will always be analyzed within exactly the same area, so that in the subsequent image analysis the illumination level of the provided lighting source is the only variable influencing the measured results.
Step 2: Once the scene area to be measured is determined, place a dimmable lamp in front of the area at a distance. This lamp will serve as the only lighting source for the camera calibration process.
Step 3: Adjust the output level of the dimmable lamp from brightest to darkest, until completely turned off, to obtain different scene illumination levels. The number of illumination levels to be tested is up to the user. Figure 68 shows an example of one selected scene under different illumination levels.
Figure 68. Photo. Scenes with Different Illumination Levels

Step 4: For each illumination level, use the luminance meter to measure the luminance of the defined rectangular scene. Record the measured values for the subsequent calibration. Additionally, an illuminance meter should be used to monitor the incident light output from the dimmable lamp. This helps confirm that the data collection is done under constant illumination, so that any variation in the measured pixel intensities of different target images is influenced only by the camera's exposure settings or the source luminance, rather than by voltage fluctuations.

B. Zenmuse X5STM Camera Settings for Image Shooting
DJI provides the DJI GOTM 4 app to control the drone and the attached camera through a mobile device connected to the drone's remote controller. After the mobile device has been successfully connected to the remote controller, the touch screen of the mobile device can be used to record videos, capture photos, and set professional photography configurations. Figure 69 shows the user interface of the DJI GOTM 4 app. To activate the still image capturing and video recording functions, users need to first insert a supported Micro SD card into the drone (DJI InspireTM 2). Once the image capturing function is activated, the drone camera settings should be set based on the recommended values provided below.
Figure 69. Diagram. User Interface of the DJI GOTM 4 App

- Shutter/Record Switch: Tap the shutter/record switch ([6] in Figure 69) and select shutter. To take a picture, tap the shutter/record button ([8] in Figure 69).
- Shooting Mode: The default mode of the drone camera is Single Shooting Mode, and this mode will also be used for camera calibration. If the shooting mode has been changed, it can be reset by tapping the photo configurations menu ([10] in Figure 69) and then choosing the submenus in the following order: Settings -> Video/Photo Setting -> Photo.
- Exposure Mode: The exposure mode must be set to Manual. This can be done by tapping the photo configurations menu ([10] in Figure 69) and then selecting the corresponding submenus. This will allow aperture, shutter speed, and ISO to be set manually based on field conditions.
- Shutter Speed: This will vary based on the user-selected ISO sensitivity. Each ISO sensitivity has a different calibration curve and a fixed exposure time that must be used. The possible shutter speed values for images range from 8 s to 1/8000 s.
- Aperture: This can vary between the recommended f-stops for the selected ISO. The maximum aperture is F1.7 and the minimum is F16.
- ISO: This can be set at 3200, 6400, 12800, or 25,600 depending on users' data collection needs. The possible ISO values for images range from 100 to 25,600.
- Photo Style: This should be set to Standard.
- Photo Color: This should be set to Black and White.
- White Balance: This should always be set to automatic white balance (AWB).
- Image Format: The images should be taken in (.DNG) format. DNG is a generic and highly compatible format developed by Adobe. It offers the advantages of smaller file sizes without loss of data and a universal format that is independent of manufacturer- or camera-specific software.
- Any other settings meant to enhance the images taken by the camera should be turned off, including `Exposure compensation / AEB settings', `Lighting optimizer', etc.
- The camera must be set to Spot Metering instead of Spot Focusing.
C. Image Shooting Process
Step 1: Power on the DJI InspireTM 2 drone and unlock Travel Mode to switch the drone into Landing Mode by pressing the drone power button a minimum of five times. Mount the Zenmuse X5STM camera to the drone gimbal, as shown in Figure 70.

Figure 70. Diagram. Zenmuse X5STM Mounted to the Drone

Step 2: Place the drone on a table in front of the defined rectangular scene area, then tilt the gimbal by scrolling the left dial of the remote controller to adjust the camera's field of view so that the defined scene area is centered in the camera view.
Step 3: For each scene illumination level, set the ISO value to 3200, 6400, 12800, or 25,600, depending on users' data collection needs. For each ISO sensitivity, select different aperture levels ranging from F1.7 to F16.
Step 4: For each combination of ISO, aperture, and shutter speed, tap the shutter/record button ([8] in Figure 69) to take a photo. For each captured image, record the camera's current settings, including the shutter speed, aperture, and ISO values.

D. Pixel Information Extraction
To extract pixel intensity information from the captured images, the image analysis software ImageJ is used, and a script was developed to automatically extract the minimum, average, and maximum pixel intensity from the defined scene area in each image.
Step 1: Convert all the captured images into the lossless TIFF format and put them, along with the developed script, in the same folder.
Step 2: Open the ImageJ software; the initial interface is shown in Figure 71.

Figure 71. Screenshot. User Interface of ImageJ

Click `Plugins -> Macros -> Edit' to open the script in a separate window. Then click `Plugins -> Macros -> Record' to open the Recorder window. Keep these two windows open during the entire image analysis process.
Step 3: Open the first image in ImageJ by clicking `File -> Open' and selecting the image. In the toolbar, select the `Polygon selection' function, as shown in Figure 72.

Figure 72. Screenshot. Select `Polygon' Function in the Toolbar

On the selected image, draw a polygon with the same shape as the defined scene area, similar to the one highlighted in yellow in Figure 73.
Figure 73. Photo. Draw a Polygon in the Selected Image

Step 4: After the polygon is drawn, switch to the Recorder window and find the corresponding coordinates of each drawn polygon corner in the brackets of the `makePolygon' function, as underscored in Figure 74.

Figure 74. Screenshot. Coordinates of Each Polygon's Corner in the Recorder Window

Copy all the coordinates and paste them into the developed script to replace the values of the `makePolygon' function (underscored in Figure 75). Additionally, change the number in
the `for loop' condition (marked in Figure 75) based on the number of images that users intend to analyze.

Figure 75. Screenshot. Replace the Underscored Numbers with Copied Coordinates in the Script

Step 5: With the script edit window as the current window, run the script by clicking `Macros -> Run Macro'. The analysis results will be displayed in a separate window named Results. Users can export the extracted pixel results into CSV files and save the files to a local folder. It should be noted that because the position of the drone camera is not changed during the image shooting process, the defined scene area will always be in the same location within the camera's field of view; thus, the same polygon can be applied to every captured image for pixel extraction. However, if the position of the camera has changed, users will need to draw new polygons manually for images with a different field of view and repeat the above process.

E. Dark Current Estimation
Step 1: After the image shooting process, keep all the equipment in the same place and turn off the lamp completely.
Step 2: For each ISO sensitivity selected in the image shooting process, take another set of images with different aperture settings.
Step 3: Conduct the pixel information extraction process on the captured images, and record the mean pixel intensity of each image. Obtain the dark current for each ISO value by calculating the median of the images' mean pixel intensities.
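The extraction script referenced above is an ImageJ macro, but the same polygon statistics and dark-current estimate can be reproduced outside ImageJ by users who prefer a scripted batch workflow. A minimal Python sketch of that idea, assuming numpy, Pillow, and matplotlib are installed; the polygon corners and file name pattern are hypothetical placeholders:

    import glob
    import numpy as np
    from PIL import Image
    from matplotlib.path import Path

    # Hypothetical polygon corners (x, y), as copied from ImageJ's makePolygon call.
    POLYGON = [(512, 300), (900, 310), (895, 620), (505, 610)]

    def polygon_stats(image_path, polygon):
        # Return (min, mean, max) pixel intensity inside the polygon.
        img = np.asarray(Image.open(image_path).convert("L"), dtype=float)
        h, w = img.shape
        ys, xs = np.mgrid[0:h, 0:w]
        mask = Path(polygon).contains_points(
            np.column_stack([xs.ravel(), ys.ravel()])).reshape(h, w)
        pixels = img[mask]
        return pixels.min(), pixels.mean(), pixels.max()

    # The same polygon applies to every image because the camera never moved.
    means = [polygon_stats(p, POLYGON)[1]
             for p in sorted(glob.glob("dark_iso3200_*.tif"))]
    dark_current = float(np.median(means))  # per Step 3 of Section E
    print(f"Estimated dark current (ISO 3200): {dark_current:.2f}")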

F. Camera Calibration Process
To calibrate the camera, a relationship between the measured pixel intensity and the actual scene luminance must be established. Previous studies have found that the relationship between pixel intensity and scene luminance can be simplified into a linear equation, as shown in Equation 1. The term RHS is defined as the interaction of the scene luminance and the camera's exposure parameters, which equals the product of the shutter speed (t), the ISO sensitivity (S), and the scene luminance (Ls), divided by the square of the aperture number (fs²). K is a constant that will be calibrated for the drone camera.

    Pixel Intensity = K × RHS, where RHS = (t × S × Ls) / fs²

Equation 1. Relationship between Pixel Intensity and Scene Luminance

Step 1: For each analyzed image, subtract the estimated dark current from the measured mean pixel intensity to obtain the adjusted pixel intensity. Then use the adjusted pixel intensity with Equation 1 in the calculation of RHS.
Step 2: Plot the extracted mean pixel intensity against the calculated RHS based on the collected data for each ISO sensitivity. Figure 76 shows an example of the plot made during a previous study. The plot is based on data collected at scene luminances of 54.4, 2.94, 1.28, 1.2, 0.26, and 0.24 cd/m² with the Starlight2.

Figure 76. Scatterplot. Example of Mean Pixel Intensity vs. RHS Plot

Step 3: Because the initial fitting of the data does not suggest a linear relationship between the mean pixel intensity and RHS in the selected linear response range, the original RHS values are transformed by taking the natural log.
Step 4: Plot the extracted mean pixel intensity against the natural-log-transformed RHS, as shown in Figure 77. Conduct linear regression on the processed data to obtain the value of the constant K in Equation 1; the mathematical relationship between mean pixel intensity and scene luminance is thereby established.
Figure 77. Scatterplot. Example of Mean Pixel Intensity vs. Log-Transformed RHS Plot

For subsequent scene illumination measurements, simply use the calibrated camera to capture images of the scene of interest, analyze the captured images to extract the mean pixel intensity, and then use the calibrated equation to convert the pixel information into the measured scene luminance.
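Numerically, Steps 1 through 4 amount to a linear fit of adjusted mean pixel intensity against ln(RHS), which can then be inverted to recover luminance from a new image. A minimal sketch with numpy; the measurement records and dark current below are illustrative placeholders rather than data from this study, and the intercept term is an assumption of this sketch:

    import numpy as np

    # Illustrative per-image records: (shutter time t [s], ISO S,
    # aperture f-stop, metered scene luminance Ls [cd/m^2],
    # measured mean pixel intensity).
    records = [
        (1/60, 3200, 4.0, 54.4, 210.0),
        (1/60, 3200, 5.0, 2.94, 140.0),
        (1/30, 3200, 4.0, 1.28, 120.0),
        (1/30, 3200, 5.0, 0.26,  60.0),
    ]
    dark_current = 4.0  # placeholder from the dark-current step

    rhs = np.array([t * S * Ls / f**2 for t, S, f, Ls, _ in records])
    pi_adj = np.array([pi for *_, pi in records]) - dark_current

    # Fit PI = K * ln(RHS) + b by least squares (log transform per Step 3).
    K, b = np.polyfit(np.log(rhs), pi_adj, 1)
    print(f"K = {K:.3f}, intercept = {b:.3f}")

    def luminance_from_pixels(pi_mean, t, S, f):
        # Invert the calibration: recover Ls from a new image's mean intensity.
        rhs_est = np.exp((pi_mean - dark_current - b) / K)
        return rhs_est * f**2 / (t * S)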
Table 9. Procedure Checklist of Drone Camera Calibration

General Information
Project Number: ____  Date: ____  Time: ____  Recorder: ____
(Mark the Check column for each completed item; use the Comments column for notes.)

Required Equipment

Drone Camera Calibration Equipment:
- DJI INSPIRETM 2 drone with Zenmuse X5STM camera attached
- A set of fully charged Intelligent Flight Batteries
- A 4 GB SD card for storing captured scene images
- An Extech-HD450 illuminance meter
- A Gossen Starlight2 luminance meter
- An extra 9V battery for the illuminance meter
- A dimmable lamp
- A mobile device with DJI GOTM 4 app installed

Drone Camera Calibration

Identification of Scenes with Different Illumination Levels:
- Find a room without any natural lighting and define a rectangular area on a flat surface using tape
- Place a dimmable lamp in front of the defined area at a distance
- Adjust the output levels of the dimmable lamp from brightest to darkest, until completely turned off, to obtain different scene illumination levels based on project needs
- Use the luminance meter to measure the luminance of the defined rectangular scene for each illumination level and record the measured values

Adjust Zenmuse X5STM Camera Settings for Image Shooting:
- Connect the mobile device with the DJI GO 4 app installed to the drone remote controller
- Tap the shutter/record switch button in the DJI GO 4 app and select shutter, then tap the shutter/record button to take a picture
- Use the default `Single Shooting Mode' of the drone camera
- Set the exposure mode to Manual
- Set the photo style to Standard
- Set the photo color to Black and White
- Set the white balance to automatic white balance (AWB)
- Take the images in (.DNG) format
- Turn off any other settings meant to enhance the images taken by the camera
- Set the camera to Spot Metering

Image Shooting:
- Power on the DJI InspireTM 2 drone and unlock Travel Mode to switch the drone into Landing Mode by pressing the drone power button a minimum of five times
- Mount the Zenmuse X5STM camera to the drone gimbal
- Place the drone on a table in front of the defined rectangular scene area, and tilt the gimbal by scrolling the left dial of the remote controller to adjust the camera's field of view so that the defined scene area is centered in the camera view
- For each scene illumination level, change the ISO value to 3200, 6400, 12800, or 25,600 depending on users' data collection needs; for each ISO sensitivity, select different aperture levels ranging from F1.7 to F16
- For each combination of ISO, aperture, and shutter speed, tap the shutter/record button to take a photo
- For each captured image, record the camera's current settings, including the shutter speed, aperture, and ISO values

Extract Pixel Information:
- Convert all the captured images into the lossless TIFF format and put them, along with the developed script, in the same folder
- Open the ImageJ software and click `Plugins -> Macros -> Edit' to open the script in a separate window, then click `Plugins -> Macros -> Record' to open the Recorder window; keep these two windows open during the entire image analysis process
- Open the first image in ImageJ by clicking `File -> Open' and selecting the image
- In the toolbar, select the `Polygon selection' function, and draw a polygon with the same shape as the defined scene area on the selected image
- Switch to the Recorder window and find the corresponding coordinates of each drawn polygon corner in the brackets of the `makePolygon' function
- Copy all the coordinates and paste them into the developed script to replace the values of the `makePolygon' function, and change the number in the `for loop' condition based on the number of images that users intend to analyze
- With the script edit window as the current window, run the script by clicking `Macros -> Run Macro'
- Export the extracted pixel results from the Results window into CSV files and save the files to a local folder

Estimate Dark Current:
- After the image shooting process, keep all the equipment in the same place and turn off the lamp completely
- For each ISO sensitivity selected in the image shooting process, take another set of images with different aperture settings
- Conduct the pixel information extraction process on the captured images, and record the mean pixel intensity of each image
- Obtain the dark current for each ISO value by calculating the median of the images' mean pixel intensities

Camera Calibration Process:
- For each analyzed image, subtract the estimated dark current from the measured mean pixel intensity to obtain the adjusted pixel intensity
- Use the adjusted pixel intensity to calculate RHS and transform the RHS values by taking the natural log
- Plot the extracted mean pixel intensity against the natural-log-transformed RHS based on the collected data for each ISO sensitivity
- Conduct linear regression on the processed data to identify the mathematical relationship between mean pixel intensity and scene luminance

Operator Signature: ____________________
APPENDIX G. STANDARD OPERATING PROCEDURES FOR SAFETY
I. General
The field data collection process for roundabouts and other innovative intersections usually involves measuring and recording roadway geometric and operational features, along with capturing images/videos of intersection operations with drone-mounted and tripod-mounted cameras. Due to the delicate nature of the aircraft and cameras, mishandling the equipment may lead to component malfunction, serious injury, and property damage. This standard operating procedure provides safety guidelines regarding aircraft operation and field data collection to minimize these risks. The aircraft discussed in this document is the INSPIRETM 2 model manufactured by DJI, and the compatible camera is the Zenmuse X5STM.

II. Drone Operation Safety Guidelines
Many aircraft components are sensitive to the surrounding environment (e.g., temperature, air density), and their performance can change drastically under different conditions. Users should therefore first become familiar with the aircraft and its limitations, and then operate the aircraft in a safe and responsible manner to avoid serious injury or property damage. The following guidelines discuss basic principles that users should always follow during each stage of aircraft operation. For more detailed operating rules, users can refer to the safety guidelines prepared by DJI.

A. Flight Condition Requirements
- Operate the aircraft only in good to moderate weather conditions with temperatures between -4 and 104 F (-20 to 40 C). Never fly during severe weather conditions,
including wind speeds exceeding 22 mph (10 m/s), snow, rain, smog, hail, lightning, tornadoes, or hurricanes.
- Keep the aircraft at least 30 feet (10 meters) away from obstacles, people, animals, buildings, public infrastructure, trees, and bodies of water when in flight. As the altitude increases, the safe distance between the aircraft and such objects should also increase.
- Operate the aircraft in open areas only. Tall buildings or steel structures may affect the accuracy of the onboard compass and block the GPS signal.
- Do not operate the aircraft near areas with magnetic or radio interference, including high-voltage lines, large-scale power transmission stations, mobile base stations, and broadcasting towers.

B. Pre-flight Checklist
- Ensure the remote controller, Intelligent Flight Battery, and mobile device are fully charged.
- Ensure that the camera lens is clean and free of stains, the Micro SD card has been inserted into the camera, and the gimbal can rotate freely before powering it on.
- Ensure the propellers and mounting plates are securely mounted onto the motors, and the motors can start and function normally.
- Follow the on-screen instructions to calibrate the compass.
- Ensure the DJI GO 4 app and the aircraft's firmware have been upgraded to the latest version.
- Ensure that the flight area is outside the No-Fly Zones and that flight conditions are suitable for flying the aircraft.
- Ensure that the drone pilots are not flying under the influence of alcohol, drugs, or any substance that may impair the cognitive abilities needed to operate the aircraft safely.
- Be familiar with the selected flight mode and understand all safety functions and warnings.
- Be sure to observe all local, state, and federal regulations. Obtain appropriate authorizations and understand the risks.
- Always maintain line of sight to the aircraft, and do not rely solely on the first-person-view camera to control the aircraft.
- Ensure the DJI GO 4 app is properly launched to assist the aircraft operation and record the flight data.
C. Operation
- Avoid interference between the remote controller and other wireless equipment. Make sure to turn off the Wi-Fi on your mobile device. Do not answer incoming calls, and avoid using mobile devices during flight.
- Read all prompted safety tips, warning messages, and disclaimers carefully in the DJI GO 4 app. Land the aircraft immediately at a safe location if an alert is shown in the app.
- Do not apply external force to the gimbal after it is powered on. Handle with care and do not touch the gimbal connector, as any damage can lead to malfunction.
- Do not switch from P-mode to either A-mode or S-mode unless you are sufficiently familiar with the aircraft's behavior in each flight mode, since disabling GPS may result in being unable to land the aircraft safely.
- Do not pull the left stick to the bottom inside corner and press the Return-to-Home (RTH) button at the same time while the aircraft is airborne, except in an emergency. This combination command feature can be turned off via the DJI GO 4 app.
- Ensure the antennas of the remote controller are unfolded and adjusted to the proper position to achieve optimal transmission quality. Make sure to always fly the aircraft within the transmission range of the remote controller.
- Always be alert when in control of the aircraft, as the vision system may be disabled in certain situations (e.g., bad lighting or unclear obstacle surface patterns). Do not fly closely above reflective surfaces such as water or snow, as they can affect the performance of the vision system.
- The aircraft cannot automatically brake and stop at a safe distance from an obstacle if the aircraft speed exceeds 31.3 mph (14 m/s).
- Land immediately when severe drifting occurs in flight, i.e., when the aircraft does not fly in straight lines.
- When battery warnings are triggered, promptly bring the aircraft back to the Home Point or land at a safe location to avoid losing power during flight and causing damage to the aircraft, property, animals, or people.
- Ensure the landing gear is lowered before landing. Do not attempt to catch or hold the aircraft, because the landing gear will be lowered when the Vision Positioning System detects an object and may cause injury.
- Only power off the aircraft and the remote controller after the motors stop rotating. After landing, first stop the motors, power off the aircraft, and turn off the Intelligent Flight Battery; then turn off the remote controller.
- While safety and flight assistance features such as obstacle avoidance, aircraft stabilization, and Return-to-Home are designed to assist aircraft operation, the pilot's discretion is still needed, especially in emergencies.
D. Local Laws and Regulations Observation
- Avoid operating the aircraft in the vicinity of manned aircraft, regardless of altitude.
- Avoid operating the aircraft in densely populated areas, including cities, sporting events, exhibitions, performances, etc.
- Do not fly the aircraft above the authorized altitude and never higher than 400 ft (120 m) above ground level. Stay clear of and do not interfere with manned aircraft operations.
- Do not fly the aircraft near or inside No-Fly Zones specified by local or federal laws and regulations. The No-Fly Zone list includes airports, borders between two sovereign countries or regions, etc. A complete list of No-Fly Zones can be found at http://www.dji.com/flysafe/no-fly.
- Always keep the aircraft within visual line of sight, and use an additional observer to assist if necessary.
- Respect the privacy of others when using the camera; comply with local privacy laws, regulations, and moral standards.

E. Product Care
- Store the Intelligent Flight Battery and remote controller in a cool, dry place away from direct sunlight. The recommended storage temperature ranges from 71 to 82 F (22 to 28 C) for storage periods of more than three months. Do not store them in environments outside the temperature range of -4 to 113 F (-20 to 45 C).
- Keep the camera away from water or other liquids. If it gets wet, wipe off any water droplets with a soft, absorbent, dry cloth, and do not use substances containing alcohol, benzene, thinners, or other flammable substances to clean the camera. Do not store the camera in humid or dusty areas.
- Detach the gimbal from the aircraft if the aircraft will be stored for a long period of time or transported over long distances.
- Do not connect the aircraft to any USB interface that is older than version 2.0.
- Check every part of the aircraft after any crash or violent impact. Contact a DJI-authorized dealer directly if any problems or issues are detected.
- Fully charge and discharge the battery at least once every three months to maintain battery health, and keep the battery level between 40% and 65% for long-term storage.
- The aircraft should be returned to the manufacturer for service after every 50 hours of flight time to ensure proper performance.

III. Field Data Collection Safety Guidelines
In addition to capturing images/videos with a drone-mounted camera, the field data collection process often includes in-situ measurement and recording of intersection operational and geometric characteristics data. Below are the guidelines that the survey crew should follow to ensure a safe and efficient data collection process.
- Ensure that every survey crew member wears a traffic safety vest at all times. The vest must be put on before surveyors set off from their base to the selected intersection site(s) and must be worn on top of all other clothing.
- The survey must be carried out by at least two surveyors; one can serve as a lookout to warn of potential hazards while the other does the main survey work.
- Ensure that every survey crew member is in good physical condition, including sight and hearing.
- Do not enter the active travel lanes at any time; no measurement requires crew members to be in the active travel lane.
- Avoid working on wet pavement in an active traffic area; allow sufficient time for pavements to dry fully after rain before performing any surveys.
- The survey vehicle must be parked off the road at an available parking spot close to the intersection, such as a gas station or storefront. Turn off all lights, including headlights and emergency lights, if the parking spot is within 60 meters of the intersection.
- Place safety signs or traffic cones in front of the parked vehicles facing oncoming traffic to help enhance the visibility of the survey crew.
- All state-specific safety guidelines should be followed, including those outlined in the GDOT Automated Survey Manual.
Table 10. Safety Procedure Checklist

General Information
Project Number: ____  Date: ____  Time: ____  Recorder: ____
(Mark the Check column for each completed item; use the Comments column for notes.)

Drone Operation

Flight Condition Requirements:
- Operate the aircraft only in good to moderate weather conditions with temperatures between -4 and 104 F (-20 to 40 C); never fly during severe weather conditions, including wind speeds exceeding 22 mph (10 m/s), snow, rain, smog, hail, lightning, tornadoes, or hurricanes
- Keep the aircraft at least 30 feet (10 meters) away from obstacles, people, animals, buildings, public infrastructure, trees, and bodies of water when in flight; as the altitude increases, the safe distance between the aircraft and such objects should also increase
- Operate the aircraft in open areas only; tall buildings or steel structures may affect the accuracy of the onboard compass and block the GPS signal
- Do not operate the aircraft near areas with magnetic or radio interference, including high-voltage lines, large-scale power transmission stations, mobile base stations, and broadcasting towers

Pre-flight Checklist:
- Ensure the remote controller, Intelligent Flight Battery, and mobile device are fully charged
- Ensure that the camera lens is clean and free of stains, the Micro SD card has been inserted into the camera, and the gimbal can rotate freely before powering it on
- Ensure the propellers and mounting plates are securely mounted onto the motors, and the motors can start and function normally
- Follow the on-screen instructions to calibrate the compass
- Ensure the DJI GO 4 app and the aircraft's firmware have been upgraded to the latest version
- Ensure that the flight area is outside the No-Fly Zones and flight conditions are suitable for flying the aircraft
- Ensure that the drone pilots are not flying under the influence of alcohol, drugs, or any substance that may impair the cognitive abilities needed to operate the aircraft safely
- Be familiar with the selected flight mode and understand all safety functions and warnings
- Be sure to observe all local, state, and federal regulations; obtain appropriate authorizations and understand the risks
- Always maintain line of sight to the aircraft and do not rely solely on the first-person-view camera to control the aircraft
- Ensure the DJI GO 4 app is properly launched to assist the aircraft operation and record the flight data

In-flight Operation:
- Avoid interference between the remote controller and other wireless equipment; make sure to turn off the Wi-Fi on your mobile device
- Do not answer incoming calls and avoid using mobile devices during flight
- Read all prompted safety tips, warning messages, and disclaimers carefully in the DJI GO 4 app; land the aircraft immediately at a safe location if an alert is shown in the app
- Do not apply external force to the gimbal after it is powered on; handle with care and do not touch the gimbal connector, as any damage can lead to malfunction
- Do not switch from P-mode to either A-mode or S-mode unless you are sufficiently familiar with the aircraft's behavior in each flight mode, since disabling GPS may result in being unable to land the aircraft safely
- Do not pull the left stick to the bottom inside corner and press the Return-to-Home (RTH) button at the same time while the aircraft is airborne, except in an emergency; this combination command feature can be turned off via the DJI GO 4 app
- Ensure the antennas of the remote controller are unfolded and adjusted to the proper position to achieve optimal transmission quality; make sure to always fly the aircraft within the transmission range of the remote controller
- Always be alert when in control of the aircraft, as the vision system may be disabled in certain situations (e.g., bad lighting or unclear obstacle surface patterns); do not fly closely above reflective surfaces such as water or snow, as they can affect the performance of the vision system
- The aircraft cannot automatically brake and stop at a safe distance from an obstacle if the aircraft speed exceeds 31.3 mph (14 m/s)
- Land immediately when severe drifting occurs in flight, i.e., when the aircraft does not fly in straight lines
- When battery warnings are triggered, promptly bring the aircraft back to the Home Point or land at a safe location to avoid losing power during flight and causing damage to the aircraft, property, animals, or people
- Ensure the landing gear is lowered before landing; do not attempt to catch or hold the aircraft, because the landing gear will be lowered when the Vision Positioning System detects an object and may cause injury
- Only power off the aircraft and the remote controller after the motors stop rotating; after landing, first stop the motors, power off the aircraft, and turn off the Intelligent Flight Battery, then turn off the remote controller
- While safety and flight assistance features such as obstacle avoidance, aircraft stabilization, and Return-to-Home are designed to assist aircraft operation, the pilot's discretion is still needed, especially in emergencies

Local Laws and Regulations Observation:
- Avoid operating the aircraft in the vicinity of manned aircraft, regardless of altitude
- Avoid operating the aircraft in densely populated areas, including cities, sporting events, exhibitions, performances, etc.
- Do not fly the aircraft above the authorized altitude and never higher than 400 ft (120 m) above ground level; stay clear of and do not interfere with manned aircraft operations
- Do not fly the aircraft near or inside No-Fly Zones specified by local or federal laws and regulations; the No-Fly Zone list includes airports, borders between two sovereign countries or regions, etc.
- Always keep the aircraft within visual line of sight and use an additional observer to assist if necessary
- Respect the privacy of others when using the camera; comply with local privacy laws, regulations, and moral standards

Drone Product Care:
- Store the Intelligent Flight Battery and remote controller in a cool, dry place away from direct sunlight; the recommended storage temperature ranges from 71 to 82 F (22 to 28 C) for storage periods of more than three months, and they should not be stored in environments outside the temperature range of -4 to 113 F (-20 to 45 C)
- Keep the camera away from water or other liquids; if it gets wet, wipe off any water droplets with a soft, absorbent, dry cloth, and do not use substances containing alcohol, benzene, thinners, or other flammable substances to clean the camera; do not store the camera in humid or dusty areas
- Detach the gimbal from the aircraft if the aircraft will be stored for a long period of time or transported over long distances
- Do not connect the aircraft to any USB interface that is older than version 2.0
- Check every part of the aircraft after any crash or violent impact; contact a DJI-authorized dealer directly if any problems or issues are detected
- Fully charge and discharge the battery at least once every three months to maintain battery health, and keep the battery level between 40% and 65% for long-term storage
- The aircraft should be returned to the manufacturer for service after every 50 hours of flight time to ensure proper performance

Field Data Collection

Field Data Collection Safety Guidelines:
- Ensure that every survey crew member wears a traffic safety vest at all times; the vest must be put on before surveyors set off from their base to the selected intersection site(s) and must be worn on top of all other clothing
- The survey must be carried out by at least two surveyors; one can serve as a lookout to warn of potential hazards while the other does the main survey work
- Ensure that every survey crew member is in good physical condition, including sight and hearing
- Do not enter the active travel lanes at any time; no measurement requires crew members to be in the active travel lane
- Avoid working on wet pavement in an active traffic area; allow sufficient time for pavements to dry fully after rain before performing any surveys
- The survey vehicle must be parked off the road at an available parking spot close to the intersection, such as a gas station or storefront; turn off all lights, including headlights and emergency lights, if the parking spot is within 60 meters of the intersection
- Place safety signs or traffic cones in front of the parked vehicles facing oncoming traffic to help enhance the visibility of the survey crew
- All state-specific safety guidelines should be followed, including those outlined in the GDOT Automated Survey Manual

Operator Signature: ____________________

APPENDIX H. STANDARD OPERATING PROCEDURES FOR DRONE VIDEO DATA ANALYSIS
I. General
Given the successful application of computer-vision techniques to high-resolution drone video data to extract quantitative operational and safety parameters from roundabouts in previous research projects, it is reasonable to believe that this method has the potential to enter routine service and make important contributions to transportation system management and design, both by reducing data collection costs and by increasing parameter measurement accuracy. To help state DOTs effectively use high-resolution drone video collection and computer-vision data reduction as part of normal data collection activities, this standard operating procedure provides guidelines for extracting quantitative measurements of traffic conditions (e.g., traffic volumes, vehicle trajectories, etc.) from roundabouts and other innovative intersections through analysis of videos collected by a drone-mounted camera. The video data analysis is conducted in the DataFromSky TrafficSurvey ViewerTM (DFS Viewer) software, a desktop application that can detect objects, trajectories, and interactions based on raw video data. Users are also encouraged to explore other functions provided in DFS Viewer to produce quantitative measurements of traffic conditions based on specific data analysis needs.

II. Standardized Drone Video Dataset
For automatic processing of drone-captured video data, the TrafficSurvey ViewerTM developed by DataFromSky will be used. Certain requirements regarding the video data collection process and video formats should be satisfied to ensure the accuracy of the extracted traffic parameters.
A. Drone Video Recording Guidelines
For drone video recording, previous field tests found that if the incidence angle of the drone camera (measured from the vertical) exceeds 55 degrees, that is, when the horizontal distance between the drone's projection on the ground and the intersection center is substantially larger than the drone's altitude, there is a significant drop in vehicle localization accuracy, as shown in Figure 78. Therefore, for the most reliable vehicle localization results, the ideal drone position is directly above the intersection, which minimizes the dynamic and static occlusion between individual objects. If safety regulations or other unexpected conditions make it impossible to hover the drone directly above the intersection, fly the drone to the nearest possible position above the center of the intersection to obtain a near bird's-eye view.

Figure 78. Diagram. The Incidence Angle of the Drone Should Be Below 55 Degrees. Source: DataFromSky
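The 55-degree guideline can be checked from the planned hover position before takeoff. A minimal sketch, with the incidence angle measured from the vertical as in Figure 78; the altitude and offset values are hypothetical:

    import math

    def incidence_angle_deg(altitude_ft, horizontal_offset_ft):
        # Angle between the camera's line of sight to the intersection center
        # and the vertical, for a drone hovering at altitude_ft whose ground
        # projection is horizontal_offset_ft from the intersection center.
        return math.degrees(math.atan2(horizontal_offset_ft, altitude_ft))

    angle = incidence_angle_deg(altitude_ft=395, horizontal_offset_ft=250)
    print(f"Incidence angle: {angle:.1f} degrees")
    if angle > 55:
        print("Warning: above 55 degrees; expect degraded vehicle localization.")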
During recording, it is recommended to keep the drone's position and camera parameter settings unchanged to increase the precision of the automatic video analysis. Based on experience from previous research projects, the ideal condition for flying the drone for field data collection is clear skies with winds below 8 mph. Ambient wind speeds above 15 mph can cause the drone to dither in its position, which can result in the views of the captured videos constantly shifting, making automatic vehicle trajectory extraction impossible without correction for the shift. Figure 79 shows a comparison of vehicle trajectories identified from drone video data captured under high-wind and low-wind conditions.

Figure 79. Photo. Vehicle Trajectory Data Extracted from Drone Videos Captured under High Wind Conditions (left) vs. Low Wind Conditions (right)

When recording vehicle operations within an intersection, pilots should make sure that vehicles and other noteworthy objects are visibly identifiable within the camera frame. Avoiding parasitic optical phenomena (such as glare or lens flare caused by bright light sources) in the video recordings is also important for accurate data processing. For high-quality recordings, it is also better to use a camera with an ultra-wide-angle lens instead of a fisheye lens, which can cause high distortion at the image margins.

Figure 80. Photo. Three Exposure Situations: Too Bright (left, not OK), Ideal (center, OK), and Blurred Due to Low Exposure Time (right, not OK). Source: DataFromSky
B. Video Quality Requirements
For automatic video analysis, resolution is a key determinant of object detection and classification. The minimum size of an object for correct detection and classification is 30 x 30 pixels, and the recommended size is between 30 x 30 pixels and 150 x 150 pixels within the intersection. For objects within a distance of 394 ft (120 m) or less at oblique angles, FULL HD (1920 x 1200) resolution is sufficient, while for objects
at a distance between 394 ft (120 m) and 984 ft (300 m) at oblique angles, 4K video resolution is required. In addition to resolution, bitrate settings can also influence video quality. For FULL HD videos coded with the H.264 format, the bitrate should be at least 10 Mbit/s, and for 4K videos coded with H.264, a minimum of 20 Mbit/s is necessary. For best results, it is recommended to upload videos in their original format with a framerate around 25 FPS. It should be noted that videos with resolution below 512 x 512 are not supported by the software system. In terms of video formats, the software can process most existing video container formats, including MP4, AVI, MPEG, WMV, etc. The system also supports most existing video codecs, such as H.262, H.263, H.264, and other MPEG-4 video codecs.

III. DataFromSkyTM TrafficSurvey Viewer Functions
DataFromSkyTM TrafficSurvey Viewer is a license-free software package that provides computer-vision services for fully automated vehicle identification and trajectory extraction from traffic videos. Some of the basic functions provided in the software are:
- Visualization of detected objects and their trajectories within the processed video
- Classification of objects into up to 16 categories
- Origin-Destination matrix (OD matrix)
- Turning movement counts (TMCs)
- Calculation of headways
- Gap time and time-to-follow data
- Safety analysis (time to collision, post-encroachment time, heavy braking)
- Current speed, acceleration, and deceleration of any object
137

Color recognition License plates detection (optional) Position of the object within each millisecond of the video Interactions of objects within the video - distance measurement Travel and occupancy times Configurable gates for vehicle counting Configurable virtual lanes, traffic regions, or action regions Capacity evaluation To download the DataFromSky ViewerTM software, users can use this link and start the installation process following the provided guidelines. IV. Data Analysis Procedures The data analysis procedures of captured drone video data include uploading videos to DFS Viewer for object identification and trajectory extraction, then geocoding and configuring the processed video files based on specific data analysis needs, and exporting the corresponding data for further advanced analysis to measure traffic operational and safety parameters within the selected intersections. A. Process Drone Videos through DFS ViewerTM To gain access to the video data analysis, users will have to create an account first by signing up on the website ai.datafromsky.com. Then users can use this account to upload the captured drone videos to the DataFromSky AERIALTM platform to conduct video analysis. The analysis of objects detection and trajectory extraction will be performed in the background, users can check the processing status in their personal accounts under the Tasks section.

Once the analysis is complete, an email is sent to users with instructions and a link to download the processed results. The results are presented in the form of a tracking log, which is a data package containing information about the traffic analysis scene and the detected or annotated vehicle trajectory data. Note that because each tracking log is tied to the original video data, viewing or editing a tracking log requires opening it in the software together with the original video file.

B. Geo-Registration

To extract accurate speed or distance data, geo-registration is required for each tracking log, as everything in the original processed files is measured in units relative to the footage resolution. This step can be performed in DFS Viewer based on the known latitude and longitude of at least four reference points selected within the video frame. Examples of reference points include corners of splitter islands or yield signs within roundabouts. The steps to conduct geo-registration of a tracking log are as follows:
1. Open the tracking log through DFS Viewer and select Tracking Log -> Manual Geo-Registration, as shown in Figure 81. Then follow the instructions provided in the Manual Geo-Registration Wizard window (Figure 82) to select the region of interest.

Figure 81. Screenshot. Conduct Geo-Registration in DFS Viewer. Source: DataFromSky
Figure 82. Screenshot. Manual Geo-Registration Wizard in DFS Viewer

2. Draw blue polygons over the region of interest within which objects will be identified and analysis will be conducted. Figure 83 presents a typical example of an identified region of interest within a roundabout.

Figure 83. Photo. Regions of Interest Marked by Blue Polygons

3. Set at least four reference points near places where objects are expected to be detected, then obtain the real-world coordinates of the selected reference points either from Google Maps or from in-situ measurement, and enter the coordinates in either the UTM or WGS-84 system. Reference points should be visually distinct and stationary; common examples include roadway signs and edges of pavement markings within the intersection.
Figure 84. Photo. Reference Points Selected within a Roundabout

4. Confirm the geo-registration process with the Finish button and save the tracking log for subsequent analysis.

If the recorded videos are relatively unstable, users can also consider requesting the commercial service provided by DFS, which conducts geo-registration at each frame with special stabilization of the video to minimize the impacts of camera movement.
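For readers who want to see what geo-registration accomplishes mathematically, the Python sketch below fits a planar homography mapping four pixel locations to their real-world UTM coordinates, then projects an arbitrary pixel onto the ground plane. This is a minimal illustration of the underlying transform, not DFS Viewer's internal implementation, and all coordinate values are hypothetical.

import numpy as np
import cv2

# Pixel coordinates of four reference points picked in the video frame
# (hypothetical values for illustration only).
pixel_pts = np.array([[412, 318], [1501, 295], [1488, 842], [455, 870]],
                     dtype=np.float32)

# Corresponding real-world UTM coordinates (easting, northing in meters),
# e.g., read from Google Maps or surveyed in situ (hypothetical values).
utm_pts = np.array([[741210.4, 3738552.1], [741243.9, 3738553.6],
                    [741243.2, 3738536.8], [741211.7, 3738535.2]],
                   dtype=np.float32)

# A planar homography maps image pixels to ground-plane coordinates; four
# points give an exact fit, and additional points give a least-squares fit.
H, _ = cv2.findHomography(pixel_pts, utm_pts)

def pixel_to_utm(u: float, v: float):
    """Project one pixel location onto the ground plane."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

print(pixel_to_utm(960, 540))  # ground position under the frame center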

C. Video Files Configuration

DFS Viewer allows users to set up their own annotation configurations of tracking log files to measure and extract traffic parameters through virtual gates and regions based on vehicle trajectory data. Inside a tracking log file, users can select Tracking Log -> Manage Annotation Configurations to open the configurations window, as shown in Figure 85.
Figure 85. Screenshot. Manage Annotation Configurations of the Tracking Log. Source: DataFromSky
To create a new annotation configuration for specific data analysis needs, users can clone the existing configuration, select the newly cloned one as active, and click the Edit button to modify the configuration, as shown in Figure 86. During the configuration process, users can set gates, lanes, action regions, traffic regions, and anonymization regions based on analysis requirements. Once the configuration is finished, simply confirm the annotation redefinition and apply it to the video scene.

Figure 86. Screenshot. Edit or Activate the Annotation Configurations of the Tracking Log
Traffic Gates

Traffic gates are virtual lines used to count different types of objects that pass the gate in one or both directions. A gate takes the form of a polyline consisting of straight segments connecting points, so it requires at least two specified points to create. Once a gate is created, users need to set its characteristics; the steps in the gate setup process are as follows:
1. Choose the gate type. There are three types of gates available: entry gates, exit gates, and neutral gates. Entry gates (green) are usually placed at the entrance of each intersection approach, together with exit gates (red) placed at the exit of each approach, to provide origin-destination information. Neutral gates (blue) can be set at any place of interest within the identified region to record vehicle gate-crossing events.

2. Specify the gate direction. The gate direction can be set so that only vehicles passing in the specified direction are recorded, and it is displayed as a small arrow on the gate. The direction can be set to positive, negative, or bidirectional depending on the user's data analysis needs.
3. Adjust the angular sensitivity. The sensitivity of the gates can also be adjusted based on the angle of passing objects.
4. Select the allowed objects. The types of objects that can be tracked by the gates include car, medium vehicle, heavy vehicle, etc. Users can select which object types are recorded in the gate annotation window.
5. Set the gate tag. The gate tag is used to identify the gate and will be displayed in the configured tracking log file.
Figure 87. Photo. Set up Traffic Gates in One Intersection Approach. Source: DataFromSky
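Conceptually, a directional gate registers a crossing when consecutive trajectory points switch sides of the gate line. The Python sketch below illustrates this with the sign of a 2-D cross product; for brevity it treats the gate as an infinite line (a full implementation would also confirm that the crossing falls within the finite gate segment). This is an illustration only, not DFS Viewer's implementation, and all coordinates are hypothetical.

# Count directional crossings of a gate line from a sequence of positions.

def side(p, a, b):
    """> 0 if point p is left of line a->b, < 0 if right, 0 if collinear."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def crossings(track, gate_a, gate_b):
    """Yield +1 for positive-direction crossings and -1 for negative ones."""
    for p, q in zip(track, track[1:]):
        s1, s2 = side(p, gate_a, gate_b), side(q, gate_a, gate_b)
        if s1 > 0 > s2:
            yield -1    # crossed from the left side to the right side
        elif s1 < 0 < s2:
            yield +1    # crossed from the right side to the left side

gate = ((0.0, 0.0), (0.0, 10.0))                 # a vertical gate line
path = [(-3.0, 5.0), (-1.0, 5.0), (2.0, 5.0)]    # track moving left to right
print(list(crossings(path, *gate)))              # prints [-1]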

Once all the gates are set up at the desired locations within the video frame, confirm the gate settings with the Finish Editing button and apply the new configuration to the video analysis scene. The created gates can be modified later by moving or removing points in the configuration edit window and reapplying the configuration to the scene. The display format of the traffic gates can be changed by clicking View -> Show Traffic Gates or Switch Gate Label, as shown in Figure 88.
Figure 88. Screenshot. Modify Display Format of Traffic Gates. Source: DataFromSky
Action Regions

Action regions can be used to detect the presence and traveling speeds of objects within a certain area of the scene view. To define an action region, follow the steps below:
1. Create an action region. Click on the Add Action Region button or right-click in the scene view and select Add Action Region, then draw a polygon to define the action region. The action region is displayed as an orange polygon after three points have been added.

2. Edit an action region. To move already-placed points of the defined action region, simply drag the points to the desired locations. Points can also be removed by right-clicking and selecting Remove Control Point. Once editing of an action region is done, press the Finish Editing button in the Main Toolbar.

3. Set a presence alert. Check the Enabled button next to Presence Alert to display and record a presence alert whenever an object enters the defined action regions. Users can also select the object types for which the presence alert will be displayed and recorded.

4. Set a speeding alert. Check the Enabled button next to Speeding Alert to display and record a speeding alert whenever a vehicle exceeds a set speed within the defined action regions. Users can also set the maximum speed for individual object types.

Figure 89. Screenshot. Set up an Action Region (Orange Area) in One Intersection Approach. Source: DataFromSky
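The speeding alert described above amounts to a point-in-polygon test combined with a speed threshold. The Python sketch below illustrates the idea with a simple ray-casting test; the region polygon, track samples, and 40 km/h threshold are hypothetical and do not represent DFS Viewer internals.

# Flag samples that fall inside an action-region polygon above a speed limit.

def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

region = [(0, 0), (50, 0), (50, 30), (0, 30)]               # polygon in meters
samples = [(10, 12, 28.0), (20, 15, 47.5), (60, 10, 55.0)]  # (x, y, speed km/h)

for x, y, v in samples:
    if point_in_polygon(x, y, region) and v > 40.0:
        print(f"speeding alert at ({x}, {y}): {v} km/h")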
Traffic Regions

Traffic regions can be used to extract operating information (e.g., average speed and acceleration) for each object, or to detect stationary vehicles within a certain area of the scene view. To define a traffic region, follow the steps below:
1. Create a traffic region. Click on the Add Traffic Region button from the main menu or right-click in the scene view and select Add Traffic Region, then draw a polygon to define the traffic region, which will be displayed in green.
2. Set allowed types. Click the Select Object Types button next to Allowed Types to choose which types of objects the traffic region will detect.

3. Set the maximum stationary vehicle speed. Enter a value for the maximum stationary vehicle speed; a vehicle traveling below this speed is considered stationary.
4. Set the minimum stationary vehicle spell. Enter a value for the minimum stationary vehicle spell, i.e., the minimum duration for which a vehicle must remain below the maximum stationary vehicle speed to be considered stationary.

Figure 90. Screenshot. Set up a Traffic Region (Green Area) in One Intersection Approach. Source: DataFromSky
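The two parameters above define a simple rule: a vehicle is stationary when its speed stays below the maximum stationary speed for at least the minimum spell. The Python sketch below illustrates that rule on hypothetical per-second speed samples; the thresholds are illustrative, not DFS Viewer defaults.

# Find stationary intervals for one track from (time, speed) samples.

def stationary_intervals(samples, max_speed=3.0, min_spell=2.0):
    """samples: list of (time_s, speed_kmh) for one track, in time order."""
    intervals, start = [], None
    for t, v in samples:
        if v <= max_speed:
            start = t if start is None else start
            end = t
        else:
            if start is not None and end - start >= min_spell:
                intervals.append((start, end))
            start = None
    if start is not None and end - start >= min_spell:
        intervals.append((start, end))
    return intervals

track = [(0.0, 25.0), (1.0, 2.1), (2.0, 1.4), (3.0, 0.8), (4.0, 12.0)]
print(stationary_intervals(track))    # prints [(1.0, 3.0)]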
D. Data Export

After setting up the annotation configuration for the tracking log file, users can conduct operational and safety analysis based on the identified object trajectory data and export the corresponding results from DFS Viewer. The procedures to export raw trajectory data and gate/region crossing event data are discussed below.


1. Trajectory Data

The raw trajectory data of each detected object can be exported to a .CSV file, and information such as position, speed, and acceleration can be calculated for every millisecond of the video based on the extracted trajectories. To get the raw trajectory data, go to the toolbar and click Analysis -> Export Trajectories to CSV File, or click the corresponding icon on the main toolbar.


Figure 91. Screenshot. Export Raw Trajectory Data in DFS Viewer. Source: DataFromSky
The first eight columns of the extracted file contain the object track ID and object type, the entry gate ID and entry time, the exit gate ID and exit time, and the total travel distance (m) and average speed (km/h). Then a set of position-related columns is repeated for each frame of the video: x [deg], y [deg], Speed [km/h], Tangential acceleration [ms-2], Lateral acceleration [ms-2], Traveling time [s], Angle [rad], and Traffic regions (list).
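As an example of post-processing this export outside DFS Viewer, the Python sketch below walks the wide-format CSV and reports each track's maximum observed speed. The file name and exact column offsets are assumptions based on the layout described above and should be verified against the header row of a real export.

import csv

FIXED_COLS = 8    # leading columns: track ID, type, entry/exit gate info,
                  # travel distance, and average speed (per the layout above)
PER_FRAME = 8     # repeated block: x, y, speed, tangential acc., lateral acc.,
                  # traveling time, angle, traffic regions

# "trajectories.csv" is a hypothetical file name for a DFS Viewer export.
with open("trajectories.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    for row in reader:
        track_id, obj_type = row[0], row[1]
        frames = [row[i:i + PER_FRAME]
                  for i in range(FIXED_COLS, len(row), PER_FRAME)]
        # Index 2 within each per-frame block is Speed [km/h].
        speeds = [float(fr[2]) for fr in frames if len(fr) > 2 and fr[2]]
        if speeds:
            print(f"track {track_id} ({obj_type}): max {max(speeds):.1f} km/h")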
2. Gate Crossing Event Data

The gate crossing event data can be used to conduct analyses including the O-D matrix, gap time, time to follow, and average speed of objects between two gates, among others. To export information about one specific gate, choose the gate in the Traffic Analysis Objects section and click Show in the Detailed Info section, then click the Export button to export the crossing event information to .CSV format for subsequent analysis, as shown in Figure 92.
Figure 92. Screenshot. Export Crossing Events Information for One Specific Gate. Source: DataFromSky
To get the crossing events for all gates, go to the toolbar and click Analysis -> Export Gate Crossing Events to CSV File, as indicated in Figure 93, and the extracted file will be downloaded to a selected local folder. The extracted data include Gate ID, Track ID and Type, Image ID and Time [s], Speed, Tan. Acc. [ms-2], Lat. Acc. [ms-2], Headway [s], and Headway [m]. More advanced analyses can also be conducted based on traffic gates, such as O-D matrices, turning movement counts, safety analysis, and gap-acceptance behavior analysis.
Figure 93. Screenshot. Export Crossing Events Information for All Gates. Source: DataFromSky
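As one example of the advanced analyses mentioned above, the Python sketch below assembles a simple O-D matrix from the exported gate crossing events by pairing each track's first and last crossings. The file name and column labels are assumptions and should be checked against the header of an actual export.

import csv
from collections import defaultdict

# "gate_crossings.csv" and the column labels below are assumptions.
crossings = defaultdict(list)              # track ID -> [(time, gate ID), ...]
with open("gate_crossings.csv", newline="") as f:
    for row in csv.DictReader(f):
        crossings[row["Track ID"]].append((float(row["Time [s]"]),
                                           row["Gate ID"]))

# The first crossing is taken as the origin gate, the last as the destination.
od_matrix = defaultdict(int)
for events in crossings.values():
    events.sort()                          # order each track's crossings by time
    if len(events) >= 2:
        origin, destination = events[0][1], events[-1][1]
        od_matrix[(origin, destination)] += 1

for (o, d), count in sorted(od_matrix.items()):
    print(f"{o} -> {d}: {count} vehicles")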
3. Traffic Region Crossing Event Data

To export data from the defined traffic regions, go to the toolbar and click Analysis -> Export Traffic Regions Crossing Events to CSV File. For each object identified within a traffic region, the extracted data include the Traffic region ID, Track ID and object type, Entry time [s] and Exit time [s], Average speed [km/h], Average tangential acceleration [ms-2], Average lateral acceleration [ms-2], and Average total acceleration [ms-2] within the defined traffic region, as well as the Total stationary time [s] and Longest stationary time [s] measured within the traffic region.
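As a brief example of working with this export, the Python sketch below aggregates the per-vehicle records into per-region vehicle counts and mean dwell times. As with the other sketches, the file name and column labels are assumptions to be verified against an actual export.

import csv
from collections import defaultdict

# "region_crossings.csv" and the column labels below are assumptions.
totals = defaultdict(lambda: [0.0, 0])     # region ID -> [dwell-time sum, count]
with open("region_crossings.csv", newline="") as f:
    for row in csv.DictReader(f):
        dwell = float(row["Exit time [s]"]) - float(row["Entry time [s]"])
        acc = totals[row["Traffic region ID"]]
        acc[0] += dwell
        acc[1] += 1

for region, (dwell_sum, n) in sorted(totals.items()):
    print(f"region {region}: {n} vehicles, mean dwell {dwell_sum / n:.1f} s")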

Table 11. Procedure Checklist of Drone Video Data Analysis

Procedure Checklist of Drone Video Data Analysis

General Information
  Project Number: __________   Date: __________   Time: __________   Recorder: __________

Standardized Drone Video Dataset Collection (check each item; add comments as needed)
  [ ] Position the drone as near as possible above the center of the intersection to minimize dynamic and static occlusion between individual objects
  [ ] Keep the drone's position and camera parameter settings unchanged to increase the precision of automatic video analysis

Drone Video Recording Guidelines (check each item; add comments as needed)
  [ ] Avoid flying drones under high ambient wind conditions with speeds above 15 MPH
  [ ] Ensure vehicles and other noteworthy objects are visibly identifiable within the camera frame during the data collection process
  [ ] Avoid parasitic optical phenomena (such as glare or lens flare caused by bright light sources) in the video recordings
  [ ] Use a camera with an ultra-wide-angle lens to avoid high distortion at image margins
  [ ] Keep the size of captured objects between 30 × 30 pixels and 150 × 150 pixels within the intersection

Video Quality Requirements (check each item; add comments as needed)
  [ ] Use 4K video resolution for objects at a distance between 394 ft (120 m) and 984 ft (300 m) at oblique angles
  [ ] Use a bitrate of at least 10 Mbit/s for FULL HD videos coded with H.264 and at least 20 Mbit/s for 4K videos coded with H.264
  [ ] Upload videos in their original format with a framerate around 25 FPS

Drone Video Data Analysis (check each item; add comments as needed)

  Process drone videos through DFS Viewer™
    [ ] Create a user account by signing up on the website ai.datafromsky.com
    [ ] Use the created account to upload the captured drone videos to the DataFromSky AERIAL™ platform to conduct video analysis
    [ ] Download the corresponding tracking log files once the video processing is complete

  Geo-Registration
    [ ] Open the tracking log through DFS Viewer and select Tracking Log -> Manual Geo-Registration
    [ ] Draw blue polygons on the region of interest within which objects will be identified and analysis will be conducted
    [ ] Set at least 4 reference points near places where objects are expected to be detected, then input the real-world coordinates of the selected reference points in either the UTM or WGS-84 system
    [ ] Confirm the geo-registration process with the Finish button and save the tracking log for subsequent analysis

  Video Files Configuration
    [ ] Select Tracking Log -> Manage Annotation Configurations to open the configurations window
    [ ] Create traffic gates in the configuration to count different types of objects that pass the gates in one or both directions, if needed
    [ ] Create action regions to detect the presence and traveling speeds of objects within a certain area of the scene view, if needed
    [ ] Create traffic regions to extract operating information of each object or detect stationary vehicles within a certain area of the scene view, if needed

  Analysis Results Export
    [ ] Click Analysis -> Export Trajectories to CSV File, or click the icon on the main toolbar, to get the raw trajectory data, if needed
    [ ] Click Analysis -> Export Gate Crossing Events to CSV File to obtain the crossing events for all gates, if needed
    [ ] Click Analysis -> Export Traffic Regions Crossing Events to CSV File to export vehicle speed and stationary time data from the defined traffic regions, if needed

Operator Signature: ______________________


Locations