Welcome to GEOG 892 - Geospatial Applications of Unmanned Aerial Systems
Quick Facts about GEOG 892
- Instructor: Dr. Qassim Abdullah
- Course Structure: Online, 10 weeks. There are 9 online quizzes, 11 activities (including online discussions and report development), UAS data processing using Pix4D software, and 1 final project
Overview
Unmanned Aerial Systems (UAS), or drones, are developing rapidly, and many government and non-government agencies are considering acquiring such systems. This course focuses on the geospatial utilization of UAS. It will cultivate students' knowledge of the capabilities and limitations of UAS and of data post-processing systems. It introduces fundamental concepts of operating a UAS, such as strategies for selecting the right UAS, assessing its performance, managing resulting products (i.e., imagery), selecting appropriate commercially available processing software, assessing product accuracy, producing metric products from UAS data, and understanding the rules and regulations governing UAS operation in the United States.
Learn more about GEOG 892, Geospatial Applications of Unmanned Aerial Systems (2 minutes)
I'm Karen Schuckman, the lead faculty for the Remote Sensing and Earth Observation certificate program. Qassim and I have known each other for many years. We worked together at the same company, and we continue to work together through the American Society for Photogrammetry and Remote Sensing, and I feel we're extremely lucky to have him teaching this course for us here at Penn State. The topic of unmanned aerial systems is evolving very rapidly and getting very important, so you really need a place where you can navigate the changes: the evolving regulations of the FAA and the rapid changes in the technology. It's moving at a very rapid pace. The course, among other things, will give you a final project where you start from the beginning. A real project where you select your platform, the UAS, the sensor you use, and the remote sensing application. This course will help you succeed in your career and in your business, whatever you're pursuing in unmanned aerial systems for geospatial applications.
Want to join us? Students who register for this Penn State course gain access to assignments and instructor feedback and earn academic credit. For more information, visit Penn State's Online Geospatial Education Program website. Official course descriptions and curricular details can be reviewed in the University Bulletin.
This course is offered as part of the Repository of Open and Affordable Materials at Penn State. You are welcome to use and reuse materials that appear on this site (other than those copyrighted by others) subject to the licensing agreement linked to the bottom of this and every page.
Lesson 1: Introduction to the Unmanned Aerial System
Lesson 1 Introduction
Welcome to Lesson 1! In this lesson, you will become familiar with the history behind the use of UAS and with the current status of UAS development. In addition, you will be exposed to the different classes of UAV/UAS according to their size, weight, and missions.
At the end of this lesson, you will have a working knowledge of how unmanned aerial missions started, the current status of the technology, and the classes of UAV/UAS.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- Explain why aircraft classification is mature for manned aviation but still evolving for UAS.
- Apply common UAS classification approaches based on size, range/endurance, and mission performance.
- Interpret the U.S. Department of Defense (DoD) five-group UAS framework.
- Describe civil/commercial UAS classification practices used in industry.
- Treat FAA Part 107 as a regulatory overlay and map operational limitations and waiver pathways to UAS missions.
Lesson Readings
Course Textbooks
- Chapter 1 of the textbook: Unmanned Vehicle Systems for Geomatics: Towards Robotic Mapping
- Chapter 1 of the textbook: Barnhart et al., Introduction to Unmanned Aircraft Systems, 2nd edition
- Chapters 1 & 2 of the textbook: Fahlstrom et al., Introduction to UAV Systems (Aerospace Series), 4th edition
Web Articles
- Pacchioli, D., "Programming autonomous vehicles to fly like birds"
- Chiles, J., "Drones for Hire"
- Maksel, R., "Robot Reporters"
Google Drive (Open Access)
- Commercial UAV News report: "6 Predictions for 2016: UAV Experts Discuss Important Developments for Commercial Drone Applications"
- Parts I, II, & III of Collier Crouch, C., thesis "Integration of mini-UAVs at the tactical operations level: implications of operations, implementation, and information sharing" (pdf)
- Watts, et al., Remote Sensing
Lesson Activities
- Study Lesson 1 materials on Drupal and the textbook chapters assigned to the lesson
- Start your first post for the discussion on "Agreements and Differences in UAS Classification"
- Review Final Project Idea Assignment Details
- Attend the weekly call on Thursday evening at 8:00pm ET
- Study Quiz 1 materials
Background: Aircraft Classification in Aviation
Over the years, extensive experience with manned aircraft has enabled aviation authorities and industry experts to develop comprehensive and widely recognized systems for classifying these vehicles. In the United States, the Federal Aviation Administration (FAA) categorizes civil aircraft using an organized hierarchy that includes categories, classes, and types, illustrated in Table 1. This well-established structure is fundamental to key functions such as pilot training, certification, and ensuring the safe operation and oversight of aircraft within the National Airspace System.
While the classification system for traditional aircraft types is well-established and consistent, the landscape for UAS is markedly different. In the UAS realm, the rapid pace of technological advancements and the proliferation of diverse applications have led to significant variation in classification approaches among defense, civil, and commercial sectors.
To Read
- For more reading on the topic, consult the article "Aircraft Classifications & Regulations" by Embry-Riddle Aeronautical University.
UAS Classification Overview
Unlike traditional, piloted aviation, the field of Unmanned Aircraft Systems (UAS) lacks a universally recognized classification standard. This absence of global standardization is largely due to the rapid evolution of UAS technologies and their ever-growing spectrum of uses, ranging from military operations to commercial deliveries and recreational activities. As a result, classification approaches can vary significantly across different sectors and regions.
Various organizations and authorities have developed their own frameworks for categorizing UAS. For instance, defense agencies typically rely on structured, performance-based tiers that consider factors such as payload capacity, endurance, and operational ceiling. In contrast, civil aviation authorities and commercial operators often adopt more flexible categories, prioritizing criteria like physical size, operational risk, and the intended mission profile. These differences reflect the unique requirements and priorities of each sector.
Within this course, the terms "UAS" (Unmanned Aircraft System) and "UAV" (Unmanned Aerial Vehicle) are used interchangeably for simplicity, although some organizations make distinctions based on system components or operational context.
There are several key parameters commonly used to classify UAS:
- Size and Mass: This includes the physical dimensions and Maximum Takeoff Weight (MTOW). For reference, Table 2 presents a conceptual size spectrum, ranging from very small (insect-sized, less than 0.5 meters) to large, aircraft-scale systems.
- Range and Endurance: Classifications may take into account how far and how long a UAS can operate without refueling or recharging.
- Operating Altitude and Air Speed: Some systems are designed for low-altitude, slow-speed missions, while others can reach higher altitudes and faster speeds.
- Launch/Recovery Method and Airframe Type: UAS may be hand-launched, catapulted, or require runways, and can feature fixed-wing, rotary-wing, or hybrid airframes.
- Intended Mission and Operating Environment: The mission—such as surveillance, mapping, delivery, or search and rescue—and where the system operates (urban, rural, maritime, etc.) can influence its classification.
As UAS technologies continue to advance, classification schemes are likely to evolve further in response to new capabilities, regulatory changes, and emerging use cases. Stakeholders should consult the latest guidance from relevant authorities and industry groups for up-to-date information on UAS categorization.
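To make the parameter list above concrete, here is a minimal sketch of a record type bundling those attributes. The class name, field names, units, and the sample figures for the RQ-11 Raven are illustrative assumptions (approximate published values), not a standard schema:

```python
from dataclasses import dataclass

@dataclass
class UASProfile:
    """Hypothetical record of the classification parameters discussed above."""
    name: str
    max_dimension_m: float   # size: largest airframe dimension, metres
    mtow_kg: float           # Maximum Takeoff Weight
    range_km: float          # operating radius
    endurance_hr: float      # time aloft
    altitude_ft: float       # typical operating altitude
    airspeed_kt: float       # typical air speed
    airframe: str            # "fixed-wing", "rotary-wing", "hybrid", ...
    launch_method: str       # "hand", "catapult", "runway", "VTOL", ...
    mission: str             # "mapping", "surveillance", "delivery", ...

# Approximate published figures for a small hand-launched system.
raven = UASProfile("RQ-11 Raven", 1.4, 1.9, 10, 1.5, 500, 44,
                   "fixed-wing", "hand", "surveillance")
```

A record like this makes it easy to apply any of the classification schemes that follow, since each scheme keys off a different subset of these fields.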

Classification According to Size
A common and intuitive approach classifies UAS by physical size. While boundaries vary by source, size is often correlated with payload capability, endurance, launch/recovery infrastructure, and typical mission scope.
Very Small UASs
Very small UASs encompass platforms ranging from insect-sized designs up to those measuring approximately 30 to 50 centimeters in length. These UASs are exceptionally lightweight, making them suitable for operations over limited distances. Their compact size allows for diverse design approaches, including flapping-wing mechanisms, rotary-wing structures, and miniature fixed-wing configurations. Flapping-wing UASs excel in maneuverability, enabling them to perform agile movements and even perch or land on restricted surfaces. Rotary-wing variants stand out for their ability to hover steadily, which is especially useful in confined environments or for tasks requiring stationary flight.
Notable examples of very small UASs, Figure 1, include the IAI Malat Mosquito, which features a wingspan of roughly 35 centimeters and can remain airborne for about 40 minutes. Another representative is the Aurora Flight Sciences Skate, with a wingspan close to 60 centimeters and a length of approximately 33 centimeters. Additionally, the Cyber Technology CyberQuad Mini, with its square 42 by 42-centimeter footprint, and the larger CyberQuad Maxi, illustrate the variety in this UAS class. These systems highlight the technological advancements and versatility found within the very small UAS category.

Small UASs (Mini-UASs)
Small UASs (often called mini-UASs), Figure 2, typically include platforms with at least one dimension > 50 cm and up to ~2 m. Many are fixed-wing systems and are commonly hand-launched, enabling rapid field deployment without runway infrastructure. Rotary-wing designs are also present in this class, particularly for hovering and confined-area operations.
Representative examples (illustrative):
- RQ-11 Raven (approx. 1 m length; ~1.4 m wingspan).
- Bayraktar mini class systems (approx. 5 kg; data link range ~20 km), Figure 5.
- RQ-7 Shadow, Figure 3.
- AiRanger™ (crossover system spanning small to medium characteristics), Figure 4.




Medium UASs
Medium UASs, Figure 6, are characterized by their size and weight, which make them unsuitable for transport or launch by a single individual. These UASs are notably larger than small systems but still smaller than typical light manned aircraft. Their wingspans usually fall within the 5-to-10-meter range, and they are capable of carrying payloads that typically weigh between 100 and 200 kilograms. Medium UASs are often deployed for intelligence, surveillance, and reconnaissance (ISR) missions, which can involve various sensors and communication equipment. Due to their size and operational requirements, these UASs generally need dedicated launch and recovery systems, as well as specialized ground support equipment to facilitate their deployment and retrieval.
Representative examples:
- Hunter (wingspan ~10.2 m; length ~6.9 m; takeoff weight ~885 kg).
- UK Watchkeeper.
- RQ-2 Pioneer; BAE Systems Skyeye R4E; Boeing Eagle Eye.
- RS-20 (crossover small–medium characteristics).

Large UASs
Large UASs, Figure 7, are sizable aircraft-scale platforms designed for missions that require extended endurance and operation at high altitudes. These systems are frequently deployed in scenarios involving combat, surveillance, or intelligence gathering, where their ability to remain airborne for long durations is critical. Due to their substantial size and advanced capabilities, large UASs can function within airspace typically reserved for traditional manned aircraft. They are equipped with state-of-the-art payloads, including sophisticated sensors and targeting equipment, as well as robust long-range communication systems, enabling them to carry out complex and demanding operations over vast distances.
Representative examples:
- MQ-1 Predator A.
- MQ-9 Predator B (Reaper).
- RQ-4 Global Hawk (including NASA variants), Figure 8.
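The size thresholds sketched in the preceding subsections can be collected into a simple lookup. The boundaries below are the approximate, source-dependent figures quoted in this lesson (very small below ~0.5 m, small up to ~2 m, medium up to ~10 m wingspan, large beyond), so treat this as an illustrative sketch rather than a standard:

```python
def size_class(max_dimension_m: float) -> str:
    """Rough size-based class using the approximate boundaries described
    above; real classification schemes vary by organization and source."""
    if max_dimension_m < 0.5:
        return "very small"        # insect-sized up to ~30-50 cm
    if max_dimension_m <= 2.0:
        return "small (mini)"      # often hand-launched fixed-wing
    if max_dimension_m <= 10.0:
        return "medium"            # needs dedicated launch/recovery
    return "large"                 # aircraft-scale platforms

print(size_class(0.35))  # IAI Malat Mosquito wingspan -> "very small"
print(size_class(1.4))   # RQ-11 Raven wingspan -> "small (mini)"
```

Because the class boundaries overlap in practice (note the "crossover" systems listed above), a real classifier would also weigh MTOW, launch method, and mission.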


Classification According to Range and Endurance
Another commonly used approach classifies UASs by their operating radius (range) and time aloft (endurance). This is particularly common in military contexts because range/endurance are closely tied to mission capability and logistics.
Very Low-Cost, Close-Range UASs
Very low-cost, close-range Unmanned Aerial Systems (UASs) represent the most accessible tier of drone technology, designed for missions that require limited operational radius and short flight durations. These platforms typically offer an operational range of approximately 5 kilometers from their launch point, making them suitable for tasks within a confined geographic area. Their endurance is modest, generally allowing them to remain airborne for about 20 to 45 minutes per flight, which is adequate for brief reconnaissance, surveillance, or data collection missions.
As of 2024, the cost of these UASs is estimated to be around $12,000, positioning them as an economical solution for organizations or agencies with constrained budgets. This affordability makes them particularly attractive for entry-level users, training applications, or routine operations where the deployment of more sophisticated and expensive UAS platforms would be impractical or unnecessary.
Despite their low cost, these systems incorporate technology and features comparable in sophistication to advanced model aircraft, such as the Raven and Dragon Eye. They are typically equipped with lightweight airframes, basic autopilot systems, and simple sensor payloads—often including standard-definition cameras for visual observation. While they may lack the advanced capabilities of larger or longer-range drones, their ease of use, rapid deployment, and minimal logistical requirements make them highly effective for quick-response scenarios.
Very low-cost, close-range UASs are commonly used in military, law enforcement, and civilian applications where short-range aerial oversight is needed. Examples include tactical reconnaissance, perimeter security, search and rescue in localized areas, and environmental monitoring over small sites. Their compact size and straightforward operation allow operators to launch and recover these drones with minimal equipment and training, further enhancing their utility in field operations where agility and cost-effectiveness are priorities.
Overall, these UAS platforms provide a practical balance between affordability and functionality, enabling a wide range of users to leverage unmanned aerial technology for essential tasks within a limited operational envelope.
Close-Range UASs
Close-range Unmanned Aerial Systems (UASs) are designed to operate at distances of up to approximately 50 kilometers from their launch point. These platforms generally offer endurance ranging from 1 to 6 hours, allowing for extended operations compared to very low-cost, close-range systems. Close-range UASs are especially valuable for missions that require reliable performance over a moderate area and time frame, such as tactical reconnaissance and surveillance.
In military applications, these UASs are often deployed to gather real-time intelligence, monitor troop movements, or provide situational awareness for ground units. Their relatively longer endurance and increased range make them suitable for tasks that demand more persistent observation than very low-cost systems can provide, yet do not require the extensive coverage of short-range or mid-range platforms.
Close-range UASs typically feature advanced sensors, including high-resolution cameras and sometimes infrared or multispectral imaging equipment, enabling them to operate effectively both day and night. Their compact size and ease of deployment make them ideal for rapid response scenarios, where timely information is critical to mission success. Furthermore, these systems are often used in law enforcement, border patrol, disaster response, and environmental monitoring, supporting operations that benefit from aerial oversight but do not necessitate the capabilities of larger, more expensive UASs.
Examples of close-range UASs include platforms like the ScanEagle and Puma. These systems balance affordability, reliability, and operational flexibility, making them a popular choice for both military and civilian agencies seeking effective aerial solutions within a limited operational radius.
Short-Range UASs
Short-range Unmanned Aerial Systems (UASs) are designed to operate at distances of approximately 150 kilometers or greater from their launch point, offering a significant increase in operational scope compared to close-range platforms. These systems typically feature endurance capabilities ranging from 8 to 12 hours, enabling them to conduct missions over extended periods without the need for frequent recovery and relaunch. Such endurance and range make short-range UASs highly suitable for a variety of demanding applications.
In military and security contexts, short-range UASs are commonly deployed for reconnaissance and surveillance missions that require persistent monitoring over larger geographic areas. Their ability to remain airborne for up to half a day allows for continuous data collection, supporting real-time intelligence gathering, target tracking, and situational awareness for commanders and decision-makers. These platforms are equipped with advanced sensor suites, such as high-resolution electro-optical and infrared cameras, synthetic aperture radar, and communication relay systems, which enhance their utility in both day and night operations and under diverse weather conditions.
Beyond defense applications, short-range UASs are also valuable in civilian roles, including border security, search and rescue operations, environmental monitoring, and infrastructure inspection. Their extended range and endurance make them effective for covering wide areas, such as coastlines, forests, or remote industrial sites, where ground access may be limited or time-consuming. The robust design of these UASs often includes features for autonomous navigation, automated takeoff and landing, and secure data transmission, ensuring reliable performance during critical missions.
Examples of short-range UASs include platforms like the Hermes 450 and the RQ-7 Shadow, both of which are widely used by military and government agencies around the world. These systems offer a balance of operational flexibility, payload capacity, and mission duration, making them an essential asset for organizations that require sustained aerial observation and rapid deployment capabilities within a regional operational theater.
Mid-Range UASs
Mid-range Unmanned Aerial Systems (UASs) are advanced platforms specifically engineered to operate at distances of up to approximately 650 kilometers from their launch point. These high-speed systems are designed with the capability to cover substantial geographic areas, making them highly effective for missions that require both extended range and rapid deployment. Thanks to their powerful propulsion systems and aerodynamic designs, mid-range UASs can achieve greater speeds than close- and short-range counterparts, allowing for timely arrival at target locations and swift execution of mission objectives.
The endurance of mid-range UASs typically ranges from 12 to 24 hours, enabling them to sustain operations over lengthy periods without the need for frequent recovery. This extended operational window is especially valuable for missions involving continuous reconnaissance, persistent surveillance, and the collection of meteorological data across vast territories. Equipped with sophisticated sensor payloads—including high-resolution electro-optical cameras, infrared imaging devices, synthetic aperture radar, and atmospheric monitoring instruments—these platforms can gather comprehensive intelligence under diverse environmental conditions, both day and night.
In military contexts, mid-range UASs are frequently deployed to support battlefield surveillance, target acquisition, and intelligence gathering over regional theaters of operation. Their ability to transmit real-time data to command centers enhances situational awareness and improves decision-making during dynamic scenarios. Additionally, these systems are increasingly utilized for civilian applications, such as monitoring severe weather events, mapping environmental changes, and supporting disaster response efforts where rapid assessment over large areas is required.
Mid-range UASs often incorporate advanced features like autonomous navigation, automated takeoff and landing, secure communications, and multi-mission payload versatility. Their robust design and flexible operational profiles make them essential assets for organizations seeking aerial solutions that balance speed, range, and endurance. Notable examples within this category include platforms such as the MQ-9 Reaper and Heron, which are widely adopted by military and governmental agencies for their reliability and mission adaptability.
Endurance UASs
Endurance Unmanned Aerial Systems (UASs) represent a class of aerial platforms specifically engineered for missions requiring exceptional operational longevity and broad area coverage. These systems are capable of remaining airborne for up to approximately 36 hours without the need for refueling or recovery, making them ideally suited for persistent surveillance and intelligence-gathering operations. With a working radius of about 300 kilometers from their launch point, endurance UASs can effectively monitor vast regions, including remote or challenging environments that may be inaccessible or impractical for manned aircraft.
Endurance UASs typically operate at altitudes reaching up to 30,000 feet, allowing them to conduct long-duration reconnaissance missions above adverse weather conditions and outside the range of many ground-based threats. This high operational ceiling, combined with advanced avionics and robust propulsion systems, enables these platforms to maintain stable flight profiles and gather high-quality data over extended periods.
Equipped with state-of-the-art sensor suites—including multi-spectral cameras, synthetic aperture radar, signals intelligence packages, and secure communication relays—endurance UASs are indispensable for both military and civilian applications. In defense contexts, they are primarily deployed for strategic surveillance, border patrol, and target tracking missions, where continuous situational awareness and real-time intelligence are critical for informed decision-making. Their ability to loiter for prolonged periods ensures uninterrupted monitoring of areas of interest, supporting early warning systems and enhancing operational security.
In addition to military roles, endurance UASs are increasingly utilized in civilian sectors for applications such as maritime patrol, disaster response coordination, environmental monitoring, and infrastructure inspection. Their long endurance and extensive working radius make them valuable assets for tracking weather patterns, assessing damage following natural disasters, or conducting resource management surveys over large, remote territories.
The integration of autonomous navigation, automated takeoff and landing capabilities, and redundant safety systems further enhances the reliability and operational efficiency of endurance UASs. These features minimize crew workload and reduce operational risks, ensuring that missions can be conducted safely and effectively even in complex or dynamic environments. As technology continues to advance, endurance UASs are expected to play an increasingly vital role in supporting a wide range of long-duration aerial operations across both governmental and commercial domains.
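The five range/endurance tiers above can be summarized in a small lookup table. The figures are the approximate nominal values quoted in this lesson, not hard limits, and the data structure itself is just an illustrative sketch:

```python
# Approximate (max range in km, max endurance in hours) per tier described
# above. Note that the "endurance" tier trades working radius for time aloft,
# so the mapping is not monotonic in range.
RANGE_ENDURANCE_CLASSES = {
    "very low-cost, close-range": (5, 0.75),
    "close-range":                (50, 6),
    "short-range":                (150, 12),  # text says "~150 km or greater"
    "mid-range":                  (650, 24),
    "endurance":                  (300, 36),
}

for name, (range_km, endurance_hr) in RANGE_ENDURANCE_CLASSES.items():
    print(f"{name}: ~{range_km} km radius, up to ~{endurance_hr} h aloft")
```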
U.S. Department of Defense (DoD) UAS Group Classification
The United States Department of Defense (DoD) organizes unmanned aircraft systems (UASs) into five distinct groups, as illustrated in Table 3 and Figure 9. This classification is determined by several key criteria: the system's maximum gross takeoff weight (MGTW), its typical operating altitude, and its air speed. Importantly, if a UAS possesses any attribute that falls within a higher group (such as exceeding the MGTW, flying at a higher altitude, or reaching a greater airspeed), it is assigned to that higher classification group, regardless of its other characteristics. This tiered approach ensures each UAS is categorized according to its most advanced operational capability.
*AGL = Above Ground Level; MSL = Mean Sea Level. Source: U.S. Army Roadmap for UAS 2010–2035 (as cited in the course notes).
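The "highest attribute wins" rule above lends itself to a short sketch. The thresholds below are the commonly published DoD group boundaries (weight in pounds, altitude in feet, airspeed in knots); confirm them against Table 3 before relying on this:

```python
def dod_group(mgtw_lb: float, altitude_ft: float, airspeed_kt: float) -> int:
    """Assign a DoD UAS group (1-5): evaluate each attribute independently,
    then take the highest resulting group, per the rule described above."""
    if mgtw_lb <= 20:
        by_weight = 1
    elif mgtw_lb <= 55:
        by_weight = 2
    elif mgtw_lb <= 1320:
        by_weight = 3
    else:
        by_weight = 4           # Groups 4 and 5 are both > 1,320 lb
    if altitude_ft < 1200:      # AGL for Groups 1-2; MSL for Groups 3-5
        by_altitude = 1
    elif altitude_ft < 3500:
        by_altitude = 2
    elif altitude_ft <= 18000:
        by_altitude = 3
    else:
        by_altitude = 5         # above 18,000 ft MSL implies Group 5
    if airspeed_kt < 100:
        by_speed = 1
    elif airspeed_kt <= 250:
        by_speed = 3
    else:
        by_speed = 4            # Groups 4-5 permit any airspeed
    # Any single attribute in a higher group assigns the higher group.
    return max(by_weight, by_altitude, by_speed)
```

For example, a 4 lb hand-launched system at 500 ft and 50 kt falls in Group 1, while a platform over 1,320 lb cruising above 18,000 ft MSL is Group 5.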

Summary and Final Tasks
Summary
We have now concluded the materials for Lesson 1, which walked us through the early history of UAS development. As with many emerging modern technologies, the U.S. defense program was behind UAS development and its introduction to the civilian market. In addition, we learned about the different classifications for UAS, as well as the current status and the different applications of UAS.
One thing I would like to emphasize here is that there is no civilian owner of a large-size UAS (such as those used by the military, which can be the size of a Boeing 737). In other words, there is a large gap between the size and sophistication of UAS used by the military and those used by civilians, which are smaller and less sophisticated. I believe the reason behind this gap is the strict regulation surrounding UAS operation in the National Airspace System (NAS). This gap will diminish once civilian UAS gain access to the NAS.
As for this lesson’s readings, try to read as much as you can through the materials available on the Internet, as it is a great resource. There is no one good textbook available so far on the subject. That is why I recommend buying, if you can, the two supplementary references listed under the course requirements in addition to the designated textbook.
(Note: Unless it is an online quiz or assignment, all deliverables should be organized and submitted in a Word document. Figures should be scanned and inserted in the document.)
Final Tasks
| 1 | Study Lesson 1 materials on Drupal and the textbook chapters assigned to the lesson |
|---|---|
| 2 | Start your first post for the discussion on "Agreements and Differences in UAS Classification". Complete your participation in the discussion forum detailed in Classification of the Unmanned Aerial Systems by the end of Lesson 2 |
| 3 | Review the final project details in Canvas. |
| 4 | Study Quiz 1 materials. Complete the Quiz by the end of Lesson 2 |
Civil and Commercial UAS Classification (Industry Practice)
Unlike the military sector, which primarily organizes unmanned aircraft systems (UAS) based on mission profiles, operational altitude, and platform capabilities, the civil and commercial arenas employ a different set of criteria for classification. These non-military sectors focus heavily on practical concerns, reflecting the diverse nature of UAS applications in business, research, public safety, and recreation. Rather than concentrating on combat missions or strategic roles, civil and commercial classification emphasizes how the UAS is actually used in real-world settings.
Key factors in these classifications include the method of deployment, the payload capacity, and the intended business or operational application. For example, a UAS designed for aerial photography will have different payload and operational requirements than one designed for package delivery or infrastructure inspection. In addition, regulatory considerations play a crucial role, shaping the boundaries between different categories of UAS. These regulations often dictate aspects such as maximum allowable weight, operational restrictions, and safety requirements, which in turn influence how manufacturers and operators classify their systems.
Although there is no single, universally accepted standard for UAS classification within the commercial and civil sectors, several conventions have become widely adopted by industry stakeholders. One of the most prevalent is classification based on maximum takeoff weight (MTOW), which directly correlates with both the risk profile of the aircraft and the regulatory requirements it must meet. This approach makes it simpler for operators to determine the appropriate usage scenarios, training needs, and compliance obligations for each type of UAS.
Over time, these conventions have evolved to reflect the rapid growth and diversification of UAS technology. Industry groups, manufacturers, and regulators have worked together to define categories such as micro, small, and large UAS, with each class serving specific niches within the broader market. These categories help facilitate safe integration into the national airspace, support regulatory compliance, and guide the development of new business models and technological innovations.
In summary, civil and commercial UAS classification is characterized by its focus on practical deployment, payload, business application, and regulatory compliance. The absence of a single universal standard has led to the development of several broadly recognized conventions that shape the way UAS are designed, operated, and regulated in industry practice.
Weight-Class Conventions Used in Commercial Practice
In the commercial sector, the classification of unmanned aircraft systems (UAS) relies heavily on the maximum takeoff weight (MTOW) as the principal criterion. This approach is widely adopted because MTOW is a critical determinant of an aircraft's potential kinetic energy, which directly impacts the risk it poses to people and property in the event of an accident. Furthermore, MTOW serves as a practical metric for aligning UAS with various regulatory frameworks, as most aviation authorities use weight thresholds to define operational categories and associated safety requirements.
Industry standards have established several distinct weight-based categories to help manufacturers, operators, and regulators clearly differentiate between types of UAS, Table 4. One of the most commonly recognized categories is the micro-class, which includes drones that weigh less than 250 grams (approximately 0.55 pounds). These micro-class UAS are designed to be extremely lightweight and portable, making them ideal for indoor operations, close-range inspections, educational purposes, and as training platforms. Due to their low mass, micro-class drones generally present minimal risk and are often subject to fewer regulatory restrictions, especially when operated in controlled environments.
The next category is the small UAS (sUAS), defined by an MTOW of less than 55 pounds (25 kilograms). This category represents the dominant class of commercial drones and encompasses a wide range of applications, including aerial mapping, photogrammetry, infrastructure inspection, agriculture, environmental monitoring, and public safety missions such as search and rescue or incident response. The 55-pound threshold aligns with key regulatory definitions, such as those established by the U.S. Federal Aviation Administration (FAA) under Part 107, which sets the operational and certification requirements for small unmanned aircraft in the United States.
UAS that exceed the 55-pound (25-kilogram) MTOW threshold are classified as large UAS. These systems are typically more complex, capable of carrying heavier payloads, and suited for specialized commercial or industrial applications such as long-endurance surveillance, cargo delivery, or infrastructure development. Due to the increased risks associated with their size and capabilities, large UAS are subject to more stringent operational restrictions, certification procedures, and oversight by aviation authorities. Operators of large UAS must often demonstrate advanced pilot qualifications, implement comprehensive safety management systems, and adhere to additional airspace integration requirements.
By using MTOW as the primary basis for classification, the commercial UAS industry is able to standardize risk assessments, streamline regulatory compliance, and facilitate the safe integration of drones into national and international airspace systems. These conventions also provide clear guidance for manufacturers during product development and for end-users when selecting platforms that best match their operational needs and compliance obligations.
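The three MTOW thresholds above can be captured in a few lines of code. This is an illustrative sketch only; the function name and class labels are assumptions, not part of any FAA or industry definition.

```python
# Hypothetical sketch: classifying a UAS by maximum takeoff weight (MTOW),
# using the thresholds described above (micro < 250 g, small < 55 lb, else large).
# The function name and labels are illustrative, not from any standard.

LB_PER_KG = 2.20462  # pounds per kilogram

def classify_by_mtow(mtow_kg: float) -> str:
    """Return the weight class for a given MTOW in kilograms."""
    if mtow_kg < 0.250:                 # micro-class: under 250 grams
        return "micro"
    elif mtow_kg * LB_PER_KG < 55:      # sUAS: under 55 lb (25 kg), per FAA Part 107
        return "small (sUAS)"
    else:
        return "large"

print(classify_by_mtow(0.249))  # a sub-250 g quadcopter -> micro
print(classify_by_mtow(6.3))    # a typical mapping drone -> small (sUAS)
print(classify_by_mtow(90.0))   # a cargo platform        -> large
```

Note that the 55-pound threshold includes everything attached to the aircraft at takeoff, payload included, which is why MTOW rather than empty weight is the input here.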
Airframe-Based Categories Relevant to Commercial Operations
Commercial UAS operators frequently categorize platforms by airframe type due to its significant impact on operational capabilities such as endurance, launch and recovery methods, payload stability, and overall mapping workflow. For example, multirotor drones are characterized by their vertical takeoff and landing (VTOL) abilities and exceptional maneuverability, making them ideal for close-range inspections and mapping smaller areas. In contrast, fixed-wing drones deliver superior endurance and can efficiently cover larger expanses, which makes them well-suited for corridor mapping and extensive survey missions. Hybrid VTOL airframes blend the advantages of vertical takeoff and landing with the efficient cruising performance of fixed-wing designs, enabling operations in settings where traditional runway infrastructure is unavailable or limited.
Mission-Driven Categories in Geospatial Workflows
Within geospatial mapping applications, how unmanned aircraft systems (UAS) are classified tends to revolve around the specific requirements of each mission rather than relying on fixed categories. Operators generally base their platform choices on several mission-critical factors, including the target ground sampling distance (GSD), the size of the area to be mapped, expected flight duration, the type of sensor payload needed (such as RGB, multispectral, thermal, or LiDAR sensors), and the desired level of mapping accuracy. As a result, commercial operators often approach classification as a flexible, decision-based process, adapting their selections to the unique demands of each project, instead of adhering to a strict or static taxonomy.
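For example, the target GSD mentioned above follows directly from camera geometry and flying height. The sketch below is the generic photogrammetric calculation; the function name and sensor parameter values are illustrative assumptions, not tied to any specific platform.

```python
# Illustrative sketch of the mission-planning arithmetic described above:
# ground sampling distance (GSD) from camera geometry and flying height.
# GSD = (sensor pixel size x flying height) / focal length.
# All parameter values below are hypothetical examples.

def gsd_cm(pixel_size_um: float, focal_mm: float, height_m: float) -> float:
    """Ground sampling distance in cm/pixel."""
    pixel_mm = pixel_size_um / 1000.0            # pixel size in mm
    gsd_mm = pixel_mm * (height_m * 1000.0) / focal_mm  # similar triangles
    return gsd_mm / 10.0                          # mm -> cm

# A 4.4-micron pixel behind an 8.8 mm lens, flown at 120 m AGL:
print(round(gsd_cm(4.4, 8.8, 120.0), 1))  # -> 6.0 cm/pixel
```

In practice the operator works this formula backward: given a required GSD and a known camera, solve for the flying height, then check that height against airspace limits.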
FAA Part 107 as a Regulatory Overlay (Not a Classification Replacement)
The FAA's Part 107 rule governs routine, civil (non-recreational) operations of small unmanned aircraft systems (UAS) within the United States. Rather than serving as a classification system for UAS based on attributes like size, range, or endurance, Part 107 acts as a regulatory framework layered on top of existing engineering categories. This means that while manufacturers and operators may classify drones by their physical and performance characteristics, Part 107 imposes additional operational constraints (Table 5) on how these aircraft can be used within the National Airspace System. Specifically, Part 107 applies to "small unmanned aircraft," which are defined as any unmanned aircraft weighing less than 55 pounds (25 kg) at takeoff, including all components and payloads attached to the aircraft.
Waivers and Authorizations: Expanding the Operating Envelope
Under FAA Part 107, operators can apply for waivers that permit unmanned aircraft systems (UAS) to conduct missions that exceed standard regulatory limits—such as flying at night, operating over people, or beyond visual line of sight (BVLOS). Additionally, operations within controlled airspace require prior authorization from the FAA. The waiver application process is designed to ensure that applicants present robust risk mitigation strategies and detailed procedural controls, demonstrating their ability to maintain the highest safety standards during expanded operations (Figure 10).

Practical Mapping Use: Connecting UAS Classes to Geospatial Missions
When conducting geospatial mapping and photogrammetry with unmanned aircraft systems (UAS), selecting the appropriate platform involves balancing multiple factors. These include the engineering class of the aircraft (such as its size and flight endurance), the type and capabilities of the onboard sensors, and the required accuracy for the mission. All of these considerations must be aligned with FAA Part 107 regulatory requirements and the specific airspace environment of the site.
- For large-area mapping, fixed-wing UAS are often more efficient due to their longer endurance. However, these missions may be restricted by visual line of sight (VLOS) limitations, potentially necessitating segmented flight planning or securing a Beyond Visual Line of Sight (BVLOS) waiver.
- Multirotor platforms excel at capturing high-resolution data over smaller areas and offer greater maneuverability for precise coverage. However, they may require several flights and frequent battery changes to achieve full area coverage.
- When operating near critical infrastructure or within controlled airspace, operators must obtain FAA preflight authorizations and implement robust procedural controls, regardless of the UAS platform used, to ensure regulatory compliance and safety.
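The coverage trade-offs above translate directly into flight-line planning arithmetic: wider image footprints and lower sidelap mean fewer parallel passes over a site. A minimal sketch, with all names and values illustrative rather than taken from any flight-planning software:

```python
# Rough sketch of the coverage trade-off described above: how many parallel
# flight lines a mapping mission needs for a given area width, image footprint,
# and sidelap. Names and values are illustrative, not from any flight planner.
import math

def flight_lines(area_width_m: float, footprint_width_m: float, sidelap: float) -> int:
    """Number of parallel flight lines needed to cover the area width."""
    spacing = footprint_width_m * (1.0 - sidelap)  # distance between adjacent lines
    return math.ceil(area_width_m / spacing)

# A 1 km wide site, 300 m image footprint, 70% sidelap -> 90 m line spacing:
print(flight_lines(1000, 300, 0.70))  # -> 12 lines
```

A fixed-wing platform might fly all twelve lines in one sortie, while a multirotor would likely split them across several battery cycles, which is the practical difference the bullets above describe.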
References and Suggested Reading
- Federal Aviation Administration (FAA). Small Unmanned Aircraft Systems (sUAS) Regulations (Part 107) overview page.
- Electronic Code of Federal Regulations (eCFR). 14 CFR Part 107—Small Unmanned Aircraft Systems (definitions and operating rules).
- FAA. Part 107 Waivers guidance page.
To Read
- Chapter 2 of textbook 2, Introduction to UAV Systems (Aerospace Series)
- Crouch, C. Integration of Mini-UASs at the Tactical Operations Level: Implications of Operations, Implementation, and Information Sharing
To Do
- Read the paper "Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use" by Watts et al. Highlight agreements and differences between the UAS classification system adopted in the paper and the one given in this lesson. Post your opinion in the discussion forum for Lesson 1. Respond to at least one posting from your peers. (3 points)
Lesson 2: Unmanned Aerial System Elements
Lesson 2 Introduction
Welcome to Lesson 2! In this lesson, you will become familiar with the elements that combine to create an operational Unmanned Aerial System (UAS). Most UASs consist of an Unmanned Aerial Vehicle (UAV), human elements, payload, control elements (for a larger system it will be a ground control station (GCS) or mission planning and control station (MPCS)), and a data link communication unit (Figure 2.1). Military versions of the UAS will have an additional weapon system platform and supporting soldiers as part of the human element.

In addition, you will understand and develop knowledge about the different acquisition and auxiliary aerial sensors that are usually carried on board the UAS payload. Finally, at the end of this lesson, you will have a working knowledge of the different components forming a UAS and how the different components relate and interact with one another, the data acquisition sensors, and the auxiliary sensors that accompany a UAS mission, such as GPS and IMU.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- describe and identify the different elements of an Unmanned Aerial System (UAS);
- understand the functionality of each element making up the UAS;
- explain how the different elements of a UAS complement each other;
- understand the basics of aerial vehicle design;
- describe different payloads;
- identify the different miniaturized sensors used for remote sensing;
- understand the fundamentals of digital cameras and LiDAR;
- understand the basic principles of GPS and IMU.
Lesson Readings
Course Textbooks
- Chapter 3 of the textbook: Unmanned Vehicle Systems for Geomatics: Towards Robotic Mapping
- Chapter 3 of the textbook: Barnhart et al., Introduction to the Unmanned Aircraft Systems, 2nd edition
- Chapters 9, 10, 11, 12, 13, 17, and 18 of the textbook: Fahlstrom, et al., Introduction to UAV Systems (Aerospace Series), 5th edition
Google Drive (Open Access)
- Chapter II from Crouch, C. thesis, “Integration of mini-UAVS at the tactical operations level: Implications of operations, implementation, and information sharing.”
- Section 2.6 of the report "Eyes of the Army, U.S. Army Roadmap for Unmanned Aircraft Systems 2010-2035."
- Chao, H. et al., "Towards Low-cost Cooperative Multispectral Remote Sensing Using Small Unmanned Aircraft Systems."
- Williams, K., Human Factors Implications of Unmanned Aircraft Accidents: Flight-Control Problems, FAA report
- Takanmaki, I, et al., "How and why Unmanned Aircraft Vehicles can improve Real-time awareness?"
Lesson Activities
- Study lesson 2 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Complete your discussions on "Agreements and Differences in UAS Classification."
- Complete quiz 1
- Complete quiz 2
- Install Pix4D Software
- Attend the weekly call and Exercise 1 training on Thursday evening at 8:00 pm ET
The Air Vehicle
The air vehicle is the airborne part of the system. The air vehicle here means the aircraft, in conjunction with the payload, that forms an Unmanned Aerial System (UAS). In general, the unmanned aircraft is usually called an Unmanned Aerial Vehicle (UAV) and can be either a fixed-wing or rotary-wing aircraft that flies without a human on board.
The UAV is a complicated system including structures, aerodynamic elements such as wings and control surfaces, propulsion systems, control systems, communication elements, and launch and recovery subsystems. Larger UAVs use fuel-powered engines to attain flight, while smaller UAVs typically use either small gasoline engines or electric motors. When the UAV carries sensors and payloads, it is customarily called an Unmanned Aerial System (UAS). In this course, the terms UAS and UAV will be used interchangeably. Due to the inclusion of the word "unmanned," there has been some resistance in recent years to the names Unmanned Aircraft and Unmanned Aerial Vehicle. There is a push to adopt the term Remotely Piloted Aircraft (RPA) or Remotely Piloted Vehicle (RPV) because of the crucial human involvement in the operation of the system. UAVs come in all different sizes and shapes; however, the following are the major factors to be considered in designing a UAV:
- take-off weight
- wing span
- length
- endurance speed
- endurance time
- weight of fuel
- engine power
- engine capacity
- engine weight
- airframe weight
To Read
- Chapter 3 of the textbook: Introduction to the Unmanned Aircraft Systems, 2nd edition
- Chapter II from: Collier C. Crouch, 2005, “Integration of mini-UAVS at the tactical operations level: Implications of operations, implementation, and information sharing,” master’s thesis, Naval postgraduate school, Monterey, California.
- Section 2.6 of the report "Eyes of the Army, U.S. Army Roadmap for Unmanned Aircraft Systems 2010-2035." It is a good reading about UAS definitions.
The Communication Data Link
The term data link is used to describe how commands are communicated back and forth between the ground control system and the autopilot. The data link is a key subsystem for any UAS, as it provides two-way communication to ensure that missions are executed safely and according to plan. A typical data link is illustrated in Figure 2.2:
- The uplink, illustrated in Figure 2.2 below, operates with a bandwidth of a few kHz and is used to send data that controls the UAS flight path and communicates with the payload;
- The downlink from the UAS to the ground control station uses a low data rate to acknowledge commands and to send status information about the air vehicle, and a high data rate (1-10 Mb/s) for sending payload sensor data, such as video, down to the ground control station. The downlink signal can also be used to locate and measure the position of the air vehicle (range and azimuth) relative to the ground antenna and to improve the overall accuracy of target locations measured by the payload sensors.

This schematic drawing illustrates the workings of a data link system. The system is composed of four elements:
- the Ground Antenna (which looks like a truck with an antenna extended from the back),
- the Ground Control Station (a satellite dish mounted on an unattached, flat-bed truck trailer),
- the UAS (airborne above the Ground System, which is the Ground Antenna and Ground Control Station together) and
- a Satellite, floating in space.
Between the Ground Antenna and the Ground Control Station is a two-way arrow containing the words "Wire, fiber optics, RF"
An arrow points from the Ground Control Station to the UAS. It contains the words "Uplink commands - few kbs"
An arrow points from the UAS to the Ground Control Station. It contains the words "Status - few kbs; Sensor data - about 10Mbs."
A two-directional arrow between the UAS and the Satellite contains the words "Alternate via satellite or other relay."
A two-directional arrow between the Ground Control Station and the Satellite contains the words "Alternate: satellite or other relay."
There are two different modes for operating a UAS. Those are:
- The radio frequency line of sight (LOS) operation, where a direct communication link is established between the UAS and the ground station, and
- The beyond line of sight (BLOS) operation. This mode of operation is used when the UAS is controlled from far distances beyond the LOS capability. Communications satellites are usually used in this mode of communication.
More details on the two operating modes will be covered in lesson 4.
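To put the uplink and downlink rates quoted above in perspective, here is a back-of-the-envelope sketch. All figures are illustrative, not from any particular system.

```python
# Back-of-the-envelope sketch of the data-link rates quoted above: how long a
# payload image takes to send at a ~10 Mb/s sensor-data rate versus a few-kb/s
# status-grade channel. Figures are illustrative only.

def transfer_seconds(size_mb: float, rate_mbps: float) -> float:
    """Seconds to send `size_mb` megabytes over a link of `rate_mbps` megabits/s."""
    return size_mb * 8.0 / rate_mbps  # 8 bits per byte

# A 25 MB aerial image over the 10 Mb/s payload downlink:
print(transfer_seconds(25, 10))     # -> 20.0 seconds
# The same image over a 0.01 Mb/s (10 kb/s) status-grade channel:
print(transfer_seconds(25, 0.01))   # ~20000 seconds (about 5.5 hours)
```

This several-orders-of-magnitude gap is why the downlink is designed with two distinct rates: a narrow channel suffices for acknowledgments and vehicle status, while imagery needs the wide one.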
To Read
- Section 3.4 of chapter 3 of textbook 1 "Introduction to the Unmanned Aircraft Systems"
- Chapter 13 of textbook 2 "Introduction to UAV Systems (Aerospace Series)"
The Command and Control Element
The command and control element is the nerve center for the UAS operation. It controls the following tasks:
- launching the vehicle,
- flying the vehicle,
- recovery of the vehicle,
- receiving and processing data from internal sensors of the flight system,
- receiving and processing data from external sensors of the payload,
- controlling the operations of the payload, and
- providing the interfaces between the UAS and the outside world.
The command and control element utilizes several subsystems to accomplish its missions. They are:
- UAV status and controls,
- payload data display and control,
- map displays for mission planning and for monitoring the flight path,
- autopilot to provide the ability for the UAV to execute its mission based on preprogrammed instructions without operator intervention,
- ground terminal for two-way communication with the UAV and the payload,
- computer(s) to:
- provide an interface between the UAV and operator,
- control the data link and the data flow between the UAV and the command and control station,
- perform the navigation functions for the system,
- perform necessary computations for the autopilot and the payload control functions,
- communication links to other organizations for command and control and for dissemination of information gathered by the UAV.
The most important parts of the command and control element are the Autopilot and the ground control station, as described in the following subsections:
Autopilot
The autopilot is the subsystem that enables partial or fully autonomous flight. A UAV can be operated completely by remote control, where an operator steers the air vehicle the entire time, or it can be flown autonomously, where a pre-programmed path is fully executed from takeoff to landing by the autopilot subsystem without any pilot intervention. Small, lightweight autopilots are readily available and are made by a few manufacturers. Besides guiding the air vehicle along the pre-set flight path, the autopilot also executes a "lost link" routine if the UAV loses contact with the ground control station. The lost link procedure guides the UAV to a known waypoint, where contact with the ground control station can again be established. The following scenario was developed for a typical emergency procedure based on loss of link between the Yamaha RMAX UAS and the Ground Control Station:
Emergency Procedures for Yamaha RMAX UAS
The RMAX utilizes a redundant communication system to ensure constant contact between the aircraft and the remote pilot. The ground control station provides real-time data regarding aircraft location, altitude and flight characteristics. The pilot constantly monitors the flight information provided to the ground control station, and through the assistance of a trained observer, maintains a visual line of sight to the aircraft. In the event of a loss of link between the aircraft and the ground control station, the subsequent procedures are followed:
- Preflight Actions - Prior to any flight, and as part of the mission preparation, the mission operator will insert appropriate lost link settings to allow the RMAX to safely return to the predetermined landing location. The settings are stored on the aircraft so that in the event of a lost link, the RMAX is able to continue operations under autonomous control.
The mission operator will identify a safe altitude and a location for the aircraft to fly to once the RMAX detects a lost link. Once the aircraft reaches the specified GPS location, it will begin an auto descent and shut off the rotors upon landing.
- In the Air - The RMAX continuously monitors the status of communication with the ground station. When the RMAX detects a loss of link with the ground station, it starts a timer. This timer value (typically 5 seconds) is set by the operator in the mission settings page. When this timer expires, the RMAX goes into lost communication mode and will command the vehicle to an operator-indicated lost communication waypoint at a predetermined altitude. The aircraft then commands a 20-second descent until touchdown. Once the aircraft lands, the aircraft automatically turns the rotors off.
| Problem | Sign of Problem | Monitored throughout? | Solution |
|---|---|---|---|
| Low Signal | Vehicle is slow to respond to manual commands or PCC commands. Autopilot terminates steering mode. Audible and warning light alarms. | Yes, signal strength displayed in percentage and packet update rate. | Turn Autopilot on and abandon manual flight. Initiate auto-land. |
| Loss of Communication | Autopilot terminates manual control or fails to respond to PCC commands. Audible communication alarm and warning light. | Yes. | The vehicle returns to the lost communication waypoint, hovers until elapse of the flight timer, then commences the auto-land procedure. |
| Loss of GPS | First indication is poor altitude hold performance, also poor position hold during hover. | Yes, indicated by the number of satellites tracked and GPS Quality PDOP. | Assume manual control of aircraft and land. |
| Low Power Avionics | Lower than nominal voltage displayed. | Yes. | Land Immediately. |
| Engine Failure | Noise level or RPM changes, engine loses power. | Yes, monitored by rotor RPM through the RPM sensor. | Return and land immediately. If the engine dies, initiate autorotation procedure. |
| Tail Rotor Failure | Loss of tail control. | No. | Switch to manual control and initiate autorotation procedure. |
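The lost-link countdown described above can be sketched as a tiny state machine: a timer starts when communication drops, and when it expires the aircraft diverts to the preset waypoint and begins its auto-descent. This is illustrative only; the function name, state labels, and default timer are assumptions, not Yamaha's flight code.

```python
# Minimal sketch of the lost-link logic described in the RMAX scenario above:
# a countdown starts when the link drops; when it expires, the aircraft
# diverts to a preset waypoint (then auto-descends on arrival).

LOST_LINK_TIMEOUT_S = 5.0  # operator-set timer, typically 5 seconds

def lost_link_state(link_up: bool, seconds_since_loss: float) -> str:
    """Return the flight mode implied by the current link status."""
    if link_up:
        return "NORMAL"
    if seconds_since_loss < LOST_LINK_TIMEOUT_S:
        return "WAITING"                  # timer running; hold current behavior
    return "GOTO_LOST_LINK_WAYPOINT"      # divert, then auto-descend on arrival

print(lost_link_state(True, 0.0))    # -> NORMAL
print(lost_link_state(False, 2.0))   # -> WAITING
print(lost_link_state(False, 6.0))   # -> GOTO_LOST_LINK_WAYPOINT
```

The key design point, reflected in the preflight actions above, is that the waypoint and timer are stored on the aircraft before takeoff, so the behavior works even with the ground station completely unreachable.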
Ground Control Station
The ground control station (GCS) is the site from which the pilot controls the UAV during flight. The GCS size and sophistication depend on the category of the UAS/UAV. Some large UASs require a formal facility with multiple workstations and personnel, while a GCS for a small UAS can be a handheld transmitter. Most UASs used by the geospatial community are small UASs that do not require a dedicated GCS.
To Read
- Section 3.3 of chapter 3 of the textbook: Introduction to the Unmanned Aircraft Systems, 2nd edition
- Chapter 9 of the textbook: Introduction to UAV Systems (Aerospace Series), 5th edition
- The paper "Towards Low-cost Cooperative Multispectral Remote Sensing Using Small Unmanned Aircraft Systems"
Payload
Payload refers to air vehicle (aircraft) cargo. It is also defined as the amount of cargo weight an air vehicle can safely carry. Carrying a payload on board is the sole purpose for most UASs. Payloads come in a variety of sizes, weights, and functions. In our business of geospatial remote sensing, we focus on remote sensing sensors and the necessary navigation systems accompanying them. A UAS dedicated to remote sensing and mapping missions is usually equipped with one or more of the following sensors.
Remote Sensing Sensors for the Unmanned Aerial System
Electro-Optical Sensors: such as cameras (still and video). Aerial imaging is considered one of the most popular applications for UASs. Modern aerial cameras are all digital rather than film. Digital aerial cameras are categorized as follows:
- Large Format Cameras: These are mainly used on board manned aircraft and large UASs. They are very heavy for small and medium-sized UASs. Large format cameras are used to cover large areas, such as entire counties or states. Figure 2.3 illustrates a few of the most common aerial cameras used today.
Figure 2.3 Large Format Aerial Cameras. Source: as cited by each image.
- Medium Format Cameras (not compact): These are cameras that are smaller than large format cameras and more suitable for medium and large UASs. Cameras in this class are still too heavy for small UASs but are widely used on manned aircraft.
- Small and Medium Miniaturized Format Cameras: This class of cameras is similar to the cameras we own at home and use for recreational purposes, but compact enough to be suitable for UAS use. Miniaturized cameras are the newest development in the field of digital cameras; they are developed mainly for small UASs. Examples of miniaturized cameras are the Imperx Bobcat 2 camera (Figure 2.4), which has a 16-megapixel array, weighs only 369 grams (13 ounces), and is 53 mm (2 inches) long, and the iXA camera system from Phase One Industrial (Figure 2.5). More recently, Phase One released the P3 DJI payload for UAS, which mounts an iXA on a DJI M300 using DJI mounting hardware and app. Here is additional information about the Phase One suite of sensors for UAS.
- Infrared Sensors: An infrared sensor operates in the infrared range of the electromagnetic spectrum. Infrared sensors for remote sensing are designed to operate in two regions of the electromagnetic spectrum (EMS), those are:
- The Near Infrared Region (NIR): NIR lies just beyond the red end of the visible region of the EMS; see Figure 2.6. When the NIR band is combined with the red and green bands of visible light and displayed in the order NIR, R, G instead of the usual R, G, B, it forms a false color infrared (CIR) image. False color infrared imagery is effective in studying vegetation indices and vegetation health and condition. PrecisionHawk runs successful applications for the precision agriculture industry. A few sensors designed for UAS provide multispectral imagery, including the NIR band; the PARROT SEQUOIA+ is one of those affordable sensors (Figure 2.7).
- Thermal Infrared Region (TIR): A TIR sensor records sensed heat and displays it as an image. There are two different technologies used in building such sensors: the uncooled sensor is usually less expensive and less sensitive than the heavier cooled sensors. Thermal infrared sensors are widely used to survey and inventory buried hot water and steam pipes and to inspect heat loss from these pipes. They are also employed for roof inspection, looking for water leaks and heat loss. An example of an infrared sensor is the FLIR A6700SC, shown in Figure 2.8. Recently, FLIR offered a suite of smaller thermal infrared cameras suitable for small UAS, such as the VUE PRO (Figure 2.9). In a teaming agreement with DJI, new payloads by DJI such as the Zenmuse XT simultaneously carry an RGB camera and a FLIR thermal sensor.
- Laser Sensors: These sensors use laser light for range finding. Laser ranging, when combined with the necessary auxiliary sensors such as GPS and an Inertial Measurement Unit (IMU) (see details below), makes a laser-based terrain mapping system called Light Detection and Ranging (LiDAR). LiDAR systems map the terrain by generating a point cloud, which can be used to precisely model the terrain below the path of the UAS. Recently, miniaturized laser-based systems started making their way into UAS payloads. Examples of compact LiDAR systems suitable for UAS are the Riegl VUX-1 (Figure 2.10) and Velodyne (Figure 2.11). To learn more about a few different UAS-based lidar systems and how people are evaluating the quality of their products, I encourage you to watch this video about UAS-based lidar systems evaluation.
- SAR Sensors: Synthetic Aperture Radars are usually employed by the military for reconnaissance purposes. They require large-sized UASs, as they are heavy. We should not expect a civilian UAS with a SAR system as part of the payload in the foreseeable future.
Auxiliary Sensors for the Unmanned Aerial System
Auxiliary sensors here mean the navigation sensors that are necessary to determine the location and orientation of the UAS and the remote sensors mentioned earlier in this section. For the position of the UAS and the onboard sensors, the Global Positioning System (GPS) is used; for the attitude or orientation of the UAS and the onboard sensors, the Inertial Measurement Unit (IMU) is used.
Global Positioning System (GPS)
The GPS needs no introduction, as everyone is familiar with it. It is the same GPS that you might use to drive around town. However, GPS data used to determine a remote sensor's position usually undergoes post-processing to enhance the accuracy of the position.
UASs are offered with two grades of GPS accuracy. The most common is the single-frequency GPS receiver, as it is cheaper and does not require post-processing or a real-time correction service. Such receivers provide location accuracy of around 1 to 2 meters. For generating more accurate geospatial products, the more accurate dual-frequency receivers and precise services are needed. The latter receivers offer two modes of operation, both of which yield positional accuracy of 1 to 3 cm with little or no ground control required for the project. UAS vendors are fielding systems with two operational modes:
- The real-time kinematic GNSS (RTK) mode: This mode of operation allows the UAS to receive GPS position corrections in real time from GPS correction services. This mode of operation has particular requirements:
- RTK requires a GNSS base station equipped with a transmitter with a reliable link to a fairly dynamic moving platform such as a UAS.
- The rover (on the UAS) itself requires a dedicated receiver for the corrections.
- The post-processed kinematic (PPK) mode: This mode of operation does not require real-time GPS position corrections, as the acquired GPS data can be post-processed at a later date. RTK operations not only require a stationary base station during the UAS mission, but the location of that base station must be surveyed before the UAS flies the project, something that may complicate the deployment of the mission in some circumstances. Although PPK requires a base station as well, the base station's precise location can be determined later, after leaving the project area.
In principle, both RTK and PPK promise positional accuracies at the 1-3 cm level. The main purpose of RTK and PPK is to minimize or eliminate the need for ground control points, thereby reducing cost. For more details on GPS, please visit GPS Defined.
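For intuition, the correction idea behind both RTK and PPK can be sketched as follows: a base station at a surveyed location observes its own GPS error, and that error is removed from the rover (UAS) position. Real RTK/PPK solutions operate on carrier-phase observables rather than final coordinates, so this toy example, with all names and values hypothetical, only illustrates the principle.

```python
# Toy sketch of the base-station correction idea behind RTK/PPK: the base's
# observed GPS error (measured minus known position) is assumed to be shared
# by the nearby rover and is subtracted out. Real systems work on
# carrier-phase observables; this simplification is for intuition only.

def differential_correction(base_known, base_measured, rover_measured):
    """Apply the base station's observed error to the rover position (x, y)."""
    err = (base_measured[0] - base_known[0], base_measured[1] - base_known[1])
    return (rover_measured[0] - err[0], rover_measured[1] - err[1])

base_known    = (1000.00, 2000.00)   # surveyed base coordinates (m)
base_measured = (1001.20, 1999.10)   # base GPS fix with ~1 m error
rover_raw     = (1500.70, 2500.40)   # UAS fix, sharing roughly the same error

corrected = differential_correction(base_known, base_measured, rover_raw)
print(tuple(round(v, 2) for v in corrected))  # -> (1499.5, 2501.3)
```

The RTK/PPK distinction is purely about *when* this correction happens: RTK radios the base observations to the rover during flight, while PPK records both data streams and applies the correction on the ground afterward.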
Inertial Measurement Unit (IMU)
An inertial measurement unit, or IMU, is an electronic device that measures and reports an aerial vehicle's velocity, orientation, and gravitational forces using accelerometers and gyroscopes. IMUs are typically used to control and maneuver manned aircraft, unmanned aerial vehicles (UAVs), and satellites. Another important use of the IMU is that it helps IMU-enabled GPS devices maintain positioning information when GPS signals are unavailable, such as in tunnels, inside buildings, or when electronic interference is present.
The IMU is the main component of the inertial navigation systems (INS) used in aircraft, spacecraft, watercraft, and guided missiles, as well as in geospatial mapping activities. The data collected from the IMU's sensors allow us to determine the orientation of the sensor, which is an important aspect in geolocating each pixel of the sensor on the ground. The IMU, like other components necessary for the operation of UASs, has been miniaturized in weight and size to fit on small UASs. An example of these small IMUs, designed mainly for UASs, is the SBG 500E, illustrated in Figure 2.12.
For more details on the IMU, you can visit the IMU Wikipedia page.
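To illustrate how an IMU's two sensor types complement each other, here is a classic complementary-filter sketch: the gyroscope gives a smooth but drifting angle, while the accelerometer gives a noisy but drift-free gravity reference. This example is not from the course materials or any specific IMU, and production inertial navigation systems use Kalman filtering instead; all names and values are illustrative.

```python
# Classic complementary-filter sketch of gyro/accelerometer fusion in an IMU.
# The gyro integrates smoothly but drifts; the accelerometer's gravity vector
# is noisy but drift-free. Blending the two keeps the estimate bounded.
import math

def fuse_pitch(prev_pitch_deg, gyro_rate_dps, accel_x_g, accel_z_g, dt, alpha=0.98):
    """One complementary-filter update of the pitch estimate (degrees)."""
    gyro_pitch = prev_pitch_deg + gyro_rate_dps * dt               # integrate gyro
    accel_pitch = math.degrees(math.atan2(accel_x_g, accel_z_g))   # gravity vector
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Level flight (accelerometer says 0 deg) while the gyro reports a small drift:
pitch = 0.0
for _ in range(100):                                 # 1 second of updates at 100 Hz
    pitch = fuse_pitch(pitch, 0.5, 0.0, 1.0, 0.01)   # 0.5 deg/s gyro drift
print(round(pitch, 2))  # stays small: the accel reference bleeds off the drift
```

Without the accelerometer term (alpha = 1.0), the same loop would accumulate the full 0.5 degrees of drift every second, which is exactly why an IMU pairs the two sensors.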
To Read
- Section 3.5 of Chapter 3 of the textbook: Introduction to the Unmanned Aircraft Systems, 2nd edition
- Chapter 10, 11, and 12 of the textbook: Introduction to UAV Systems (Aerospace Series), 5th edition
- Chapter 3 of the textbook: Fundamentals of capturing and processing drone imagery and data
Launch and Recovery
The launch and recovery element is the area that requires the most human interaction. Some UASs require elaborate launching procedures, while others can be hand thrown toward the sky. Some large UASs require long runways and other field support equipment such as fuel trucks, ground power units, and ground tugs. Similarly, the requirements for recovery procedures vary widely. Most small UASs used for geospatial projects require simple procedures and can be launched by hand or with the use of a catapult.
Some UASs, such as target drones, are air-launched from fixed-wing aircraft. Usually, large UASs are equipped with wheels for takeoff and landing and do not need special equipment, while smaller UASs need a variety of launch and recovery strategies depending on the complexity of the system.
A truck driven at a speed of 60 mph can be used to launch a small UAS, assuming the launching site provides a smooth surface for the truck to use. In this launching method, the UAS is held in a cradle above the truck cab with its nose pitched up along the launch path (Figure 2.13). Once the speed is sufficient for takeoff, the UAS is released and climbs away on its takeoff path.
Many small and medium-sized UAS launch systems have a requirement to be mobile, or in other words, to be mounted on a truck or a trailer. Such mobile launchers fall within one of the following types:
- Rail Launchers: The UAS is held fast to a guide rail while it is accelerated to launch speed.
- Pneumatic Launchers: Compressed air or gas is used to provide the necessary force for launching the UAS.
- Hydraulic/Pneumatic Launchers: Compressed gaseous nitrogen is used as the power source for launch.
- Zero Length Rocket Assisted (RATO) Launching: There is no rail or track used in this mode of launching. The UAS rises directly from a holding mechanism, and it will be in free flight once the rocket is fired.
For more details on these launchers, refer to chapter 17 of the supplemental textbook Introduction to UAV Systems, 4th edition.
To Read
- Section 3.6 of Chapter 3 of the textbook: Introduction to the Unmanned Aircraft Systems, 2nd edition
- Chapter 17 of the textbook: Introduction to UAV Systems (Aerospace Series), 5th edition
The Human Element
Like any other technology that requires human intervention for safe operation, human involvement is considered the most important element for the successful and safe operation of the UAS. Even with autonomous flights using an autopilot, the human role during launch and recovery is crucial to the operation of the UAS. As navigation technology develops further, the human role in operating a UAS will diminish dramatically.
The human element is key in almost all operational aspects of any UAS and plays a great role in the success and survival of its operation. Starting with mission planning, humans have to design and arrange a concept of operation in order to guarantee success. Equally important is the human role in the flight control process. Autopilot can do only so much without the guidance and intervention of the operator.
The role of the pilot and the observer cannot be overestimated, as without them the flight will not occur. This is true even of the most sophisticated drones, such as the Predator. Even the Predator, with all its built-in sophistication and automation, needs a pilot to fly it. The human element is involved in all of the following aspects of operating a UAS:
- Mission planning and control: has to be performed by an operating team.
- Launch and recovery procedure: has to be performed by an operating team.
- Payload management and control: has to be managed by an operating team.
- Data links monitoring: has to be managed by an operating team.
- Ground support equipment coordination and management: has to be performed by an operating team.
Automation in operating a UAS results in less human intervention, but it will never eliminate the role of the human in such an operation. Imagine that an airline invites you to be on board an airplane flown solely by autopilot. There are no pilots on board. Would you accept such an invitation? I am certain your answer would be a big NO. Using the same analogy, could you imagine operating a UAS, which is less sophisticated than a jetliner, without a pilot and without an observer? That is how important the human role is in operating a UAS. That is at least true for the time being. Who knows what the future may bring to this field.
To Read
- Chapter 6 of the textbook: Introduction to the Unmanned Aircraft Systems, 2nd edition
- Chapter 9 of the textbook: Introduction to UAV Systems (Aerospace Series), 5th edition
- Williams, K. Human Factors Implications of Unmanned Aircraft Accidents: Flight-Control Problems, FAA report, DOT/FAA/AM-06/8 Office of Aerospace Medicine, Washington, DC 20591- April 2006
Summary and final tasks
Summary
Congratulations! You've finished most of the Lesson 2 material. I hope you have learned all you need to know about the different elements that form a UAS. The payload section is especially valuable to individuals with a background in geospatial mapping, as it surveys the different sensors utilized by the industry today. Understanding the functionality of each UAS element will help you in the coming lessons, where we are going to talk about Concepts of Operation (CONOP), risk assessment, and Certificates of Authorization (COA). Therefore, please make sure that you understand the different topics of this lesson, and do not hesitate to ask questions.
Final Tasks
| Task | Description |
|---|---|
| 1 | Study lesson 2 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson |
| 2 | Complete Lessons 1 & 2 Quizzes |
| 3 | Complete your discussions on "Agreements and Differences in UAS Classification" |
| 4 | Install Pix4D software. Pix4D is the data processing software you will use to process UAS imagery. Follow the instructions in Canvas. |
Lesson 3: Concept of Operation (CONOP) and Risk Assessment for UAS
Lesson 3 Introduction
Welcome to Lesson 3! In this lesson, you will become familiar with the concept of operating a UAS and how to design a Concept of Operation (CONOP). The CONOP focuses on the pre-flight description of the mission that a UAS operation will go through. You will also learn how to analyze risks surrounding UAS operations and how to assess and mitigate the impact of such risks.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- understand the concept of operation design strategy;
- understand risk assessment;
- design a CONOP and Risk assessment for a UAS mission.
Lesson Readings
Course Textbooks
- Read Chapter 7, "Safety Assessment," of textbook 1: Barnhart et al., Introduction to Unmanned Aircraft Systems, 2nd edition
Web Articles
- Lamb, G., et al., "Air Combat Command"
Google Drive (Open Access)
- Gebre-Egziabher, D., et al., "Analysis of Unmanned Aerial Vehicles Concept of Operations in ITS Applications."
- FAA document “Integration of Unmanned Aircraft Systems into the National Airspace System Concept of Operations.”
- CONOP developed by NOAA’s National Weather Service: "River Forecast Center (RFC) Analysis and Gridded Forecast Editor Improvement."
Lesson Activities
- Study lesson 3 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Start working on the "CONOP and Risk Assessment" report assignment
- Complete quiz 3
- Submit preliminary project idea
- Start your first post for the discussion on "SWOT Analysis."
- Start UAS Data Processing Using Pix4D for Exercise 1
- Download and practice Mission Planner Software
- Attend the weekly call on Thursday evening at 8:00 pm ET
CONOP Elements
The term CONOP refers to a complete description of the mission that a UAS operation goes through, from launch to recovery. The CONOP includes a procedure for carrying out the mission to achieve the mission objectives. The procedure depends on the system configuration and capabilities. UAS capabilities, which are determined by components such as sensors, guidance, endurance (in time), weather limitations (ceilings, wind speeds, etc.), and navigation and control, play a key role in defining the CONOP. The CONOP may also depend on other factors, such as safety considerations for the UAS and for lives and property along the flying path of the mission. The procedure also includes weather conditions such as wind speed and visibility, as the mission may be halted or terminated if favorable weather conditions are not met.
The FAA expects the provided CONOP:
- to give the FAA a clear understanding of the proposed operations;
- to include:
- description of UAS;
- details of intended use;
- proposed area of operations;
- intended classes of Airspace.
- to enable or to include the development of the Operational Risk Assessment (ORA).
There are many ways to design a CONOP, one of which is described in the final report published by the ITS Research Institute of the University of Minnesota entitled “Analysis of Unmanned Aerial Vehicles Concept of Operations in ITS Applications”. In that report, the process comprises the following main elements of the design:
- CONOP Definition
- Identification of CONOP Safety Risks
- Designing Fault Detection Strategies
Figure 3.1 illustrates a flow chart of the design process for a small UAS concept of operation (CONOP).
Many details need to be identified to complete CONOP development, such as:
- What are the known elements and the high-level capabilities of the system?
- What are the geographical and physical extents of the mission under execution?
- What is the time-sequence of activities that will be performed?
- What resources are needed to design, build, or retrofit the system?
- What kinds of risks are associated with the different components of the system?
The block diagram in Figure 3.2 illustrates the components that make up most of the UASs available in the market today. Such a diagram is very beneficial to CONOP analysis and development, as it lists all the sub-systems included in a UAS. As you can see below, the main components that concern us in this course are the mission sensor payload, airborne data link, and navigation and control sensors. The mission sensor payload represents the highest priority for geospatial data users. Types and quality of sensors within the mission sensor payload block are directly linked to the end user's needs and expectations. The other two blocks, the airborne data link and navigation and control sensors, mainly concern the FAA and its regulations. Main FAA concerns lie in the quality of the communications and the navigational systems that steer and control the aircraft.

At the center of this schematic drawing is a rectangle containing the words “Flight Computer.” Arrows leave the Flight Computer from the left, the right, and from its top and bottom.
From the left, an arrow points to a smaller rectangle with rounded corners that contains the words “Motor Controller.” A subsequent arrow leaves the left side of the Motor Controller and points to a labeled drawing of the engine.
From the right of the Flight Computer, an arrow points to a smaller rectangle with rounded corners that contains the words “Control Servo.” The Control Servo is attached with a thin line to a drawing of an aircraft, which is labeled “Control Surface.”
From the bottom of the Flight Computer, an arrow points to a rectangle entitled “Airborne Data Link Transceiver.” From the Airborne Data Link Transceiver, an arrow points back to the Flight Computer. The Airborne Data Link Transceiver has a small “Data Link Antenna” attached.
One arrow leaves the top of the Flight Computer. It points to a rectangle entitled “Mission Sensor Payload.” In addition, an arrow flows from the Mission Sensor Payload back to the Flight Computer.
A second arrow points to the top of the Flight Computer. This arrow originates in a rectangle labeled “Navigation and Control Sensors.”
To Read
The report "Analysis of Unmanned Aerial Vehicles Concept of Operations in ITS Applications," discusses in detail the Concept of Operations for the UAS.
To Do
Review the FAA document “Integration of Unmanned Aircraft Systems into the National Airspace System Concept of Operations”. When reading through the FAA document, focus on the components that the FAA considered when developing these Concepts of Operations. These considerations are key and beneficial to the development of your CONOP and Risk Assessment analysis that follows later in this lesson.
UAS Risk Assessment
In this section, you will explore potential risks surrounding UAS operations. The block diagram provided in section 3.1 illustrates the UAS system components, each of which carries its own operational risk. In addition, there are many external risks in the operational environment, such as weather and other aircraft sharing the same airspace. Review the document assigned for this section to learn about the different types of risk associated with UAS operations.
To Read
Read chapter 7 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition. While you are reading through the chapter, focus on hazard recognition and risk assessment, as you will need them for the section "Concept of Operation (CONOP) and Risk Assessment for UAS."
To Do
Review the report "Analysis of Unmanned Aerial Vehicles Concept of Operations in ITS Applications" that was provided to you in the section "CONOP Elements." Focus on the discussions concerning risk assessment.
Activity: Development of CONOP and Risk Assessment
It's time to develop your own CONOP and Risk Assessment methodology for the UAS you selected for the SWOT analysis in Lesson 2, by doing the following:
- List all potential risks that surround such an operation and classify them into detectable or undetectable risks.
- Select three of the risks that you identified and assess the consequences of each, should it occur, on each of the following areas:
- Project Cost (budgetary)
- Project Schedule (time)
- Safety (personnel, vehicle, facilities, etc.)
- Describe your mitigation strategy for each of these three risks, should they occur.
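One common way to organize the consequence assessment described above is a likelihood × severity scoring matrix. The sketch below is purely illustrative; the 1–5 scales, the category thresholds, and the sample risks are assumptions for demonstration, not requirements of the assignment.

```python
# Illustrative risk scoring: likelihood (1-5) x severity (1-5).
# The scales, thresholds, and sample risks are example assumptions only.

def risk_score(likelihood, severity):
    """Combined score: higher means the risk deserves more attention."""
    return likelihood * severity

def risk_category(score):
    """Bucket a score into a mitigation category (thresholds assumed)."""
    if score >= 15:
        return "high"      # mitigate before flight
    if score >= 8:
        return "medium"    # mitigate or monitor closely
    return "low"           # accept and monitor

# Hypothetical risks: (likelihood, severity)
risks = {
    "GPS signal loss near towers": (3, 4),
    "Battery failure mid-flight": (3, 5),
    "Schedule slip due to weather": (4, 1),
}
for name, (likelihood, severity) in risks.items():
    print(name, "->", risk_category(risk_score(likelihood, severity)))
```

A table like this can feed directly into the report's risk identification and mitigation sections, with one row per risk and its assessed impact on cost, schedule, and safety.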
Your document, at a minimum, should include the following sections:
- Cover page
- Table of contents
- Complete technical description of your system, element by element, with necessary illustrations and images. It should include at a minimum the following system items:
- UAS and its components
- Ground Control Stations
- Data link and communications frequency and protocol
- Payload (cameras, other sensors, and auxiliary systems such as GPS and IMU)
- Batteries
- Your project description and geographic site parameters. At a minimum, you should include the following:
- Geo-referenced topographic map of the project area, showing the project boundary
- Geo-referenced thematic map of the project area showing the project boundary and surrounding features, i.e., Google map
- Sectional chart showing project boundary, airspace class, restricted flight areas, airports, towers/obstacles, etc.
- An operational plan, i.e., project plan, budget, schedule, flight plan, etc. Your flight plan should be executed in two ways:
- Manual computations according to the formulas provided in lesson 4, provide all details of computations, not just the final computed value.
- From the Mission Planner software, provide a screen capture and details of the flight parameters.
- Besides the details for the two flight plan design scenarios, provide at a minimum the following information:
- Flying altitude
- Number of flight lines
- GSD
- Ground control points layout
- Illustration on a map showing flight lines, take-off and landing spots, etc.
- The project budget should include the following cost items:
- System acquisition or lease
- Personnel travel and per diem
- Man-hours for field operations during data acquisition, including the RPIC (remote pilot in command) and perhaps a visual observer
- Ground control points survey
- Equipment insurance
- Spare batteries and other supplies, if any
- Data processing using Pix4D
- Etc.
- Identification and classification of all possible risks, classified as detectable and undetectable risks.
- Analysis of the impact of three of these risks on project cost, schedule, and the safety of the surroundings.
- Mitigation strategies for the three risks when they occur.
- Summary and Conclusions.
Submit your report in a Word document, NOT a PDF, to the drop box. The report should not exceed 3500 words or 15 pages (single line spacing). You will have 3 weeks to complete this assignment. (7 points or 7%)
The deadline for this assignment is at the end of lesson 5.
To Read
- Air Combat Command is a good read about UAV CONOP in general
- River Forecast Center (RFC) Analysis and Gridded Forecast Editor Improvement, CONOP developed by NOAA’s National Weather Service.
To Do
- Complete your CONOP and Risk assessment analysis as described above. Drop your completed MS Word document in the drop box in Lesson 5. (7 points)
Summary and final tasks
Summary and final tasks sxr133Summary
Congratulations! You have finished Lesson 3, CONOP and Risk Assessment for UAS. As you may have noticed from the different materials provided in this lesson, the development and completion of a CONOP is an important milestone for the successful operation of UASs, especially here in the United States, where the FAA imposes strict rules and regulations on operating UASs. Without a CONOP, the mission may end in chaos with disastrous results. You also noticed that recognizing the risks surrounding a UAS operation and mitigating them not only results in a safe operation, but also pleases the FAA and encourages it to issue the required permissions to operate a UAS in the National Airspace System (NAS).
Final Tasks
| Task | Description |
|---|---|
| 1 | Study lesson 3 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson |
| 2 | Complete the Lesson 3 Quiz. |
| 3 | Submit preliminary project idea/proposal in the "Preliminary project idea" dropbox. |
| 4 | Start UAS Data Processing Using Pix4D for exercise 1. Pix4D is the data processing software you will use to process UAS imagery. Follow the instructions in Canvas. |
| 5 | Download and practice Mission Planner Software, following these instructions. |
| 6 | Start working on your CONOP and Risk assessment analysis. Submit your completed MS Word document to the drop box in Lesson 5. (7 points) |
| 7 | Start your first post for the discussion on "SWOT Analysis." |
Lesson 4: UAS Mission Planning and Control
Lesson 4 Introduction
Welcome to Lesson 4! In this lesson, you will practice planning and designing a UAS mission. We will focus on imaging sensors (digital cameras), as they are widely used for geospatial projects. Successful execution of any mapping project requires a tremendous amount of planning prior to mission execution. Planning must be done by an experienced person who is familiar with all aspects of mapping. Mission planning includes the following categories:
- defining product specifications;
- studying area maps;
- planning the aerial imagery;
- planning the ground controls;
- selecting procedures, personnel, and production instruments;
- estimating costs;
- developing a delivery schedule.
You will understand and become familiar with the main parameters to consider when selecting a UAS for geospatial business activities. You will also recognize the main manufacturers of UAVs, aerial acquisition sensors, and processing software. There is not much material in the course textbooks that directly deals with these subjects, but one can indirectly derive some information from them. In addition, several research studies on the status of the market and future predictions have been conducted by private and public groups.
Unmanned Aerial Vehicles (UAVs) have become a highly dynamic growth sector; based on a research study conducted by the Teal Group Corporation, the global UAV market is expected to top US $54 billion in the next decade or so.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- understand basic requirements for mission planning;
- understand sensor internal geometry;
- describe factors affecting flight plans such as way points, product resolution and accuracy, aircraft speed, etc.;
- practice flight planning for a UAS mission;
- understand calibration requirements for imaging sensors and auxiliary systems;
- understand the major considerations in selecting a UAS for geospatial business;
- differentiate between the main providers of UAS;
- discriminate between the main providers of aerial sensors for UAS;
- recognize the main providers of software for UAS data processing.
Lesson Readings
Course Textbooks
- Chapters 3, 11, 12, and 18 of the textbook: Elements of Photogrammetry with Applications in GIS, 4th edition
- Chapters 4 and 8 of the textbook: Fundamentals of capturing and processing drone imagery and data
Google Drive (Open Access)
- Yastikli, N., et al., "In-Situ Camera and Boresight Calibration With Lidar Data"
- Merchant, D., et al., "USGS/OSU Progress With Digital Camera In Situ Calibration Methods"
- Drone Analyst's "Five Things to Consider when Adopting Drones for Your Business"
Lesson Activities
- Study lesson 4 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Complete quiz 4
- Complete your discussions for the assignment on "SWOT Analysis."
- Continue working on the "CONOP and Risk Assessment" report assignment
- Practice Mission Planner software
- Submit your Pix4D processing materials for exercise 1
- Attend the weekly call and the Mission Planner software training on Thursday evening at 8:00 pm ET
Studying Area Maps
In this section, you will learn the value of studying area maps of a project prior to developing the flight plan.
Flight planners should acquaint themselves with the project area through two types of maps before proceeding with further steps of the design; those are U.S. Topo Quadrangle Maps and Sectional Aeronautical Charts.
U.S. Topo Quadrangles Map
The U.S. Topo Quadrangle Map, mainly a topographic map, shows the details of the contours of the land (terrain elevation). See Figure 4.1. This type of map reveals all the information a planner needs about the topography of the project area. Topography affects flight plan parameters such as flight line spacing and image spacing. Quad maps can be downloaded from the USGS. You can also review a sample of such maps for the State College area.
Sectional Aeronautical Chart
Sectional Aeronautical Charts, which are also called VFR charts (Figure 4.2), are described as “the primary navigational reference medium used by the VFR pilot community. The 1:500,000 scale Sectional Aeronautical Chart Series is designed for visual navigation of slow to medium speed aircraft. The topographic information featured consists of the relief and a judicious selection of visual checkpoints used for flight under visual flight rules. The checkpoints include populated places, drainage patterns, roads, railroads, and other distinctive landmarks. The aeronautical information on Sectional Charts includes visual and radio aids to navigation, airports, controlled airspace, restricted areas, obstructions, and related data. These charts are updated every six months, most Alaska Charts annually.” To better understand these charts, review the FAA “Aeronautical Chart User Guide”. You can also watch this YouTube video on learning how to read sectional charts.
The acronym VFR stands for “Visual Flight Rules,” under which a pilot relies on the visual see-and-avoid rule during flight. To download such charts, visit the FAA site.
The topographic map and the aeronautical chart provide an overview of the area and the contents of the ground cover (both natural and man-made), restricted airspace such as airport approaches, high towers, etc.
Visualize FAA On Line Data and Charts
No less important than reviewing a sectional chart is utilizing the online FAA sites and other services, which allow you to zoom in to your geographic location to determine the airspace status and the allowed flight ceiling. Here are a couple of the free services available to the public:
To Read
- Chapter 4 and 8 of the textbook: Fundamentals of capturing and processing drone imagery and data
- Section 18-10 of Chapter 18 of Elements of Photogrammetry with Applications in GIS, 4th edition
Sensors Characteristics
Focal Plane and CCD Array
The focal plane of an aerial camera is the plane where all incident rays coming from the object are focused. The focal plane is where the film of a film-based camera is placed. With the introduction of digital cameras, the focal plane is occupied by the CCD array, replacing the film.
A digital camera like the ones we use at home is called a “digital frame” camera just to distinguish it from other designs of digital cameras such as “push broom” cameras. Digital frame cameras have the same geometric characteristics as the film camera that employs the film as the recording medium.
A digital frame camera consists of a sensor that is a two-dimensional array of charge-coupled device (CCD) elements (a CCD element is also called a pixel). The sensor is mounted at the focal plane of the camera. When an image is taken, all CCDs of the sensor are exposed simultaneously, thus producing a digital frame. Figure 4.3 (from Wolf, page 75) illustrates how a digital camera captures an area on the ground that falls within the lens' field of view (FOV).
The size of a digital camera is measured by the size of its sensor. The higher the number of CCDs (pixels) in the sensor, the bigger and more expensive the camera. If a camera has a sensor of 4,000 pixels by 4,000 pixels, it is called a 16-megapixel camera, because it has 16,000,000 pixels. UAS imaging productivity, i.e., how many acres the UAS can cover in an hour, depends on the sensor size, battery life, and the lens focal length. The article "DJI Phantom 4 RTK vs. WingtraOne" clearly illustrates the difference in UAS productivity based on sensor and UAS capabilities. In that article, you will also learn about some fundamental capabilities that we usually expect from a mapping drone.
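The megapixel arithmetic above is easy to check: the count is simply the pixel dimensions multiplied together. A minimal sketch:

```python
# Megapixel count is width x height in pixels, divided by one million.
def megapixels(width_px, height_px):
    return width_px * height_px / 1_000_000

print(megapixels(4000, 4000))  # 16.0 -> a "16-megapixel" camera
```

Note that very different array shapes can share the same megapixel count, which is why the array dimensions (not just the megapixel figure) matter for flight planning.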
Lens Cone
The lens for a mapping camera usually contains compound lenses put together to form the lens cone. The lens cone also contains the shutter and diaphragm.
Compound Lens
The lens is the most important and most expensive part of a mapping aerial camera. Cameras on board the UAS are not of that level of quality, as they were not manufactured to be used as mapping cameras. Mapping cameras, called metric cameras, are built so that the internal geometry of the camera holds its characteristics despite harsh working conditions and changing operational environments. Lenses for cameras on board the UAS are smaller and lighter, and they are also less expensive than standard mapping camera lenses. Lenses for mapping cameras should be calibrated to determine the accurate value of the focal length and the lens distortion (imperfection) characteristics.
Shutters
Shutters are used to limit the passage of light to the focal plane. The shutter speed of aerial cameras typically ranges between 1/100 and 1/1000 of a second. Shutters are of two types: focal-plane shutters and between-the-lens shutters; the latter is the most common shutter used in aerial cameras. Most digital camera shutters are designed according to one of two mechanisms: the leaf shutter (also called a mechanical or global shutter, or the dilating aperture shutter) or the electronic rolling shutter (curtain or sliding shutter). The leaf shutter exposes the entire sensor array at once, while the rolling shutter exposes one line of pixels at a time. For aerial imaging from a moving platform such as a UAS, a leaf shutter is recommended because it minimizes image blur. To understand the shortcomings of the rolling shutter, watch this video.
It is important to know which shutter your camera uses, as most processing software, including Pix4D, provides a correction for the rolling shutter effect. However, the software does not apply the correction automatically; you will need to activate that option before you start processing the imagery.
More information on different types of shutter mechanisms can be found on Wikipedia's Shutter (photography) page.
To Read
- Chapter 3 of the textbook: Elements of Photogrammetry with Applications in GIS, 4th edition
Geometry of Vertical Image
In order to understand mission flight planning, you need to understand the geometry of the image as it is formed within the camera. The size of the CCD array and the lens focal length, coupled with the flying altitude (above ground), determine the image scale or the ground resolution of the image. Therefore, it is essential that the flight planner understand and have all of this information available before starting to design a mission.
In photogrammetry, we usually deal with three types of imagery (photography). They are defined in terms of the angle that the camera optical axis makes with the vertical (nadir). Those are:
- true vertical photography: ±0º from nadir
- tilted or near-vertical photography: greater than 0º but less than ±3º off nadir (the most used)
- oblique photography: between ±35º and ±55º off nadir
For the purpose of this course, we will focus only on the first two types, and those are vertical and near-vertical photography.
Figure 4.3 illustrates the basic geometry of a vertical photograph or image. By vertical photograph or image, we mean an image taken with a camera that is looking down at the ground. As the aircraft moves, so does the camera, and this makes it impossible to take a true vertical image. Therefore, vertical image definition allows a few degrees of deviation from the nadir (the line connecting the lens's frontal point and the point on the ground that is exactly beneath the aircraft). In summary, a vertical image is an image that is either looking straight down to the ground or is looking a few degrees to either side of the aircraft.

Scale of Vertical Image
As the sun's rays hit the ground, they reflect back toward the camera, and some actually enter the camera through the lens. This physical phenomenon enables us to express the ground-image relation using trigonometric principles. In Figure 4.3, ground point A is projected at image location a' and ground point B is projected at image location b' on the film. From such geometry, the film's four corners, a', b', c', d', cover an area on the ground represented by the square ABCD. Such relations not only enable us to compute the ground coverage of a photograph (image) but also enable us to compute the scale of such a photograph or image.
The scale of an image is the ratio of a distance on the image to the corresponding distance on the ground. In Figure 4.4, the ground distance AB is projected onto the image as line ab; therefore, the image scale can be computed using the following formula:
Equation 1: S = ab / AB, where ab is the measured distance on the image and AB is the corresponding distance on the ground.
Analyzing the two triangles (the small triangle with base ab and the large triangle with base AB) of Figure 4.4, one can also conclude, using the similarity of triangles principle, that the scale is also equal to:
Equation 2: S = f / H′, where f is the lens focal length and H′ is the flying height above the ground.
Scale is expressed either in a unitless ratio such as 1/12,000 (or 1:12,000) or in pronounced units ratio such as 1 in. = 1,000 ft (or 1" = 1,000’).

Examples of Scale Computations
The following two examples will walk you step by step through the process of computing scales for imagery produced by a film-based camera and by a digital camera. In digital cameras, scale does not define image quality the way it does with film-based cameras. With digital cameras, we use the Ground Sampling Distance (GSD) to describe the resolution quality of the image, while with film-based cameras we use the film scale.
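The GSD mentioned above follows the standard photogrammetric relation GSD = pixel size × flying height ÷ focal length (all lengths in consistent units). The sketch below is a minimal illustration; the 3 µm pixel, 120 m altitude, and 8 mm focal length are assumed values for demonstration, not specifications from the text.

```python
# Ground Sampling Distance (GSD) from the standard relation
#   GSD = pixel_size * flying_height / focal_length
# All lengths must be in the same units (meters here).
# The values below are illustrative assumptions only.

def gsd(pixel_size_m, flying_height_m, focal_length_m):
    return pixel_size_m * flying_height_m / focal_length_m

# e.g., a 3-micrometer pixel, 120 m above ground, 8 mm lens:
print(round(gsd(3e-6, 120.0, 0.008) * 100, 2), "cm/pixel")  # 4.5 cm/pixel
```

The relation shows why flying lower or using a longer focal length produces a smaller (finer) GSD.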
Scale from Film Camera
Aerial photographs were acquired from an altitude of 6,000 ft AMT (Above Mean Terrain) using a film-based aerial camera with a lens focal length of 6 inches. Determine the scale of the resulting photography.
Solution:
From Figure 4.4 and equations 1 & 2,
Scale = f / H = 6 in. / 6,000 ft = 0.5 ft / 6,000 ft = 1/12,000
OR
Scale = 1:12,000 or 1" = 1,000'
Scale from Digital Camera
Scale is meaningless in digital mapping products, as the scale concept was created to represent measured distances on old-day maps plotted on paper. However, people are still using scale, and it will take time before the new generation of mappers embraces the digital representation of the new geospatial products.

Digital camera manufacturers provide information on the sensor used in their cameras. Some express it only as a pixel count, e.g., 16 megapixels, which could be a square array of 4,000 × 4,000 pixels or a rectangle with any width/height ratio, such as 8,000 × 2,000 pixels (a width/height ratio of 4). Some camera manufacturers provide the sensor array size in pixels and in millimeters, and some provide a combination of the number of pixels and sensor size in inches, leaving you wondering about the physical size of the CCD; see Figure 4.5.

Figure 4.6 illustrates camera information where you need to dig deep into the provided data to obtain what you want. Figure 4.6 represents the information provided for the multi-spectral camera on board the DJI Phantom 4 agricultural UAS. The sensor dimensions are not provided directly: the CCD (pixel) size of 3 um appears only inside the focal length information, and the array size in pixels must be figured out from the two values provided for the optical center. The optical center, or the origin of the image coordinates at (0,0), is usually located in the middle, i.e., the center of the array; therefore, the total width of the array is equal to 800 pixels × 2 = 1,600 pixels, while the sensor height is equal to 650 pixels × 2 = 1,300 pixels. Knowing the number of pixels in the width direction (1,600) and the pixel size of 3 micrometers, the sensor width can be derived as 1,600 × 0.003 = 4.8 mm; similarly, the sensor height is 1,300 × 0.003 = 3.9 mm.
The following is an example of calculating the scale for digital imagery acquired using a digital camera:
Aerial imagery was acquired with a digital aerial camera with a lens focal length of 100 mm and a CCD size of 0.010 mm (or 10 microns). The resulting imagery had a ground resolution of 30 cm (1 ft). Determine the scale of the resulting imagery.
Solution:
From Figure 4.4 and equation 1, assume that the distance ab represents the physical size of one pixel or CCD, which is 0.010 mm, and the distance AB is the ground coverage of the same pixel, or 30 cm.
Therefore,
Scale = ab / AB = 0.010 mm / 300 mm = 1/30,000
OR
Scale = 1:30,000 or 1" = 2,500'
Practice Scale Computation Example:
Aerial imagery was acquired with a digital aerial camera with lens focal length of 50 mm and CCD size of 0.020 mm (or 20 microns). The resulting imagery had a ground resolution of 60 cm (2 ft). Determine the scale of the resulting imagery.
Solution:
Scale = ab / AB = 0.020 mm / 600 mm = 1/30,000, i.e., Scale = 1:30,000 or 1" = 2,500'
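Both digital-camera examples follow the same arithmetic: the scale denominator is the ground distance covered by one pixel (the GSD) divided by the physical pixel size, expressed in the same units. A minimal Python sketch (the function name is illustrative):

```python
def scale_denominator(ccd_size_mm, gsd_mm):
    """Scale = ab / AB for a single pixel: the denominator of 1:N is the
    ground distance (GSD) over the image distance (physical CCD size)."""
    return gsd_mm / ccd_size_mm

# Worked example: 10-micron CCD, 30 cm (300 mm) GSD  ->  1:30,000
print(scale_denominator(0.010, 300))
# Practice example: 20-micron CCD, 60 cm (600 mm) GSD  ->  1:30,000
print(scale_denominator(0.020, 600))
```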
Data for the table shown in Figure 4.6.
| Calibrated Focal Length | 1913.3333 | float | pixel | 5.74[mm] / 3.0[um/pixel] = 1913.333... |
| Calibrated Optical Center X | 800 | float | pixel | X-axis coordinate of the designed position of optical center |
| Calibrated Optical Center Y | 650 | float | pixel | Y-axis coordinate of the designed position of optical center |
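The sensor-size derivation described above, doubling the optical-center coordinates to get the array size in pixels and multiplying by the pixel pitch, can be sketched as follows (assuming, as the text does, that the optical center sits at the middle of the array; the function name is illustrative):

```python
def sensor_dimensions(opt_center_x_px, opt_center_y_px, pixel_size_um):
    """Derive the sensor array size (pixels) and physical size (mm) from
    the optical center coordinates, assuming the center sits in the
    middle of the array."""
    width_px = 2 * opt_center_x_px       # full array width in pixels
    height_px = 2 * opt_center_y_px      # full array height in pixels
    width_mm = width_px * pixel_size_um / 1000.0   # um -> mm
    height_mm = height_px * pixel_size_um / 1000.0
    return width_px, height_px, width_mm, height_mm

# Values from the table above (Figure 4.6): center at (800, 650), 3 um pixels
print(sensor_dimensions(800, 650, 3.0))  # -> (1600, 1300, 4.8, 3.9)
```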
Imagery Overlap
Imagery acquired for photogrammetric processing is flown with two types of overlap: Forward Lap and Side Lap. The following two subsections will describe each type of imagery overlap.
Forward Lap
Forward lap, also called end lap, is a term used in photogrammetry to describe the amount of image overlap intentionally introduced between successive photos along a flight line (see Figure 4.7). Figure 4.7 illustrates an aircraft equipped with a mapping aerial camera taking two overlapping photographs. The centers of the two photographs are separated in the air by a distance B, also called the air base. Each photograph in Figure 4.7 covers a distance on the ground equal to G. The overlapping coverage of the two photographs on the ground is what we call forward lap.
This type of overlap is used to form stereo-pairs for stereo viewing and processing. The forward lap is measured as a percentage of the total image coverage. The typical value for photogrammetric work is 60%. Because of the light weight of a UAS, we expect substantial air dynamics and therefore substantial rotations of the camera (i.e., crab); therefore, I recommend a forward lap of at least 70%.

Side Lap
Side lap is a term used in photogrammetry to describe the amount of overlap between images from adjacent flight lines (see Figure 4.8). Figure 4.8 illustrates an aircraft taking two overlapping photographs from two adjacent flight lines. The distance in the air between the two flight lines (W) is called line spacing.
This type of overlap is needed to make sure that there are no gaps in the coverage. The side lap is measured as a percentage of the total image coverage. The typical value for photogrammetric work is 30%. However, because of the light weight of a UAS, we expect substantial air dynamics and therefore substantial rotations of the camera (i.e., crab); therefore, I recommend using at least 40% side lap.

Image Ground Coverage
Ground coverage of an image is the area on the ground (the square ABCD of Figure 4.3) covered by the four corners of the photograph a'b'c'd' of Figure 4.3. Ground coverage of a photograph is determined by the camera internal geometry (focal length and the size of the CCD array) and the flying altitude above ground elevation.
Example on Image Ground Coverage:
A digital camera has an array size of 12,000 pixels by 6,000 pixels (Figure 4.9). If the physical CCD size is 0.010 mm (10 um), how much area in acres will each image cover on the ground if the resulting ground resolution (GSD) of a pixel is 1 foot?

Solution:
Ground coverage across the width (W) of the array = 12,000 pixels × 1 ft/pixel = 12,000 ft
Ground coverage across the height (L) of the array = 6,000 pixels × 1 ft/pixel = 6,000 ft
Covered area per image = 12,000 ft × 6,000 ft = 72,000,000 sq ft = 72,000,000 / 43,560 ≈ 1,653 acres
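The area computation above can be sketched in Python (1 acre = 43,560 sq ft; the function name is illustrative):

```python
SQFT_PER_ACRE = 43560  # 1 acre = 43,560 square feet

def coverage_acres(width_px, height_px, gsd_ft):
    """Ground footprint of one image, in acres: each array dimension in
    pixels times the GSD gives the ground dimension in feet."""
    return (width_px * gsd_ft) * (height_px * gsd_ft) / SQFT_PER_ACRE

# Example above: 12,000 x 6,000 array at 1 ft GSD
print(round(coverage_acres(12000, 6000, 1.0), 1))  # -> 1652.9
```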
To Read
- Chapters 6 and 18 of the textbook Elements of Photogrammetry with Applications in GIS, 4th edition
Designing a Flight Route
In this section, we start the practical work for flight planning an imagery mission. By the end of this section, you should be able to develop a flight plan for an aerial imagery mission. Successful execution of any photogrammetric project requires thorough planning prior to the execution of any activity in the project.
The first step in the design is to decide on the scale of imagery or its resolution and the required accuracy. Once those two requirements are known, the following processes follow:
- planning the aerial photography (developing the flight plan);
- planning the ground controls;
- selecting software, instruments, and procedures necessary to produce the final products;
- cost estimation and delivery schedule.
For the flight plan, the planner needs to know the following information, some of which he or she ends up calculating:
- focal length of the camera lens;
- flying height above a stated datum or photograph scale;
- size of the CCD;
- size of CCD array (how many pixels);
- size and shape of the area to be photographed;
- the amount of end lap and side lap;
- scale of flight map;
- ground speed of aircraft;
- other quantities as needed.
Geometry of Photogrammetric Block
Figure 4.8 shows three overlapping image squares with light rays entering the camera at the lens focal point. Successive overlapping images along a flight line form what we usually call a "strip" or "flight line." A photogrammetric strip (Figure 4.8) is therefore formed from multiple overlapping images along a flight line, while a photogrammetric block (Figure 4.9) consists of multiple overlapping strips (or flight lines).


Flight Plan Design and Layout
Once we compute the ground coverage of the image, as discussed in the "Geometry of Vertical Image" section, we can compute the number of flight lines, the number of images, the aircraft speed, and the flying altitude, and draw the layout on the project map (Figure 4.10).

Before we start the computations of the flight lines and image numbers, I would like you to understand the following helpful hints:
For a rectangularly shaped project, always distribute your flight lines across the smallest dimension of the project area (i.e., fly parallel to the longest dimension). This results in fewer flight lines and therefore fewer turns between flight lines (Figure 4.11). In Figure 4.11, the red lines with arrowheads represent flight lines or strips, while the black dashed lines represent the project boundary.
Figure 4.11 Correct flight lines orientation. Source: Dr. Qassim Abdullah © Penn State University is licensed under CC BY-NC-SA 4.0.
If you have a digital camera with a rectangular-shaped CCD array, always choose the largest dimension of the CCD array of the camera to be perpendicular to the flight direction (Figure 4.12). In Figure 4.12, the blue rectangles represent images as taken by a camera with a rectangular CCD array. The wider dimension of the array is always configured to be perpendicular to the flight direction (which is the east-west direction in this figure).
Figure 4.12 Correct camera orientation. Source: Dr. Qassim Abdullah © Penn State University is licensed under CC BY-NC-SA 4.0.
Flight Lines Computations

Now, let us figure out how many flight lines we need for the project area illustrated in Figure 4.13. Figure 4.13 shows rectangular project boundaries (black dashed lines) with length equal to LENGTH and width equal to WIDTH, designed to be flown with 6 flight lines (red lines with arrowheads). To determine the number of flight lines needed to cover the project area, we go through the following computations:
- Compute the coverage on the ground of one image (along the width of the camera CCD array (or W)) as we discussed in section 4.3.
- Compute the flight line spacing as follows:
Line spacing or distance between flight lines (SP) = Image coverage (W) × (100 − amount of side lap)/100.
- Number of flight lines (NFL) = (WIDTH / SP) + 1.
- Always round up the number of flight lines, i.e., 6.2 becomes 7.
- Start the first flight line at the east or west boundary of the project.
In Figure 4.13, you may have noticed that the flight direction alternates between North-to-South and South-to-North from one flight line to the adjacent one. Flying the project in this manner increases the aircraft's fuel efficiency, so the aircraft can stay in the air longer.
Number of Image Computations
Once we determine the number of flight lines, we need to figure out how many images will cover the project area. To do so, we need to go through the following computations:
- Compute the coverage on the ground of one image (along the height of the camera CCD array (or L)) as we discussed in section 4.3.
- Compute the distance between two consecutive images, or what we call the “airbase,” B, as follows: Airbase or distance between two consecutive images (B) = Image coverage (L) × ((100 – amount of end lap)/100).
- Number of images per flight line (NIM) = (LENGTH / B) + 1.
- Always round up the number of images, i.e., 20.2 becomes 21.
- Add two images at the beginning of the flight line before entering the project area and two images upon exiting it (Figure 4.14); these are needed to ensure continuous stereo coverage, i.e., a total of 4 additional images per flight line, so the number of images per flight line = (LENGTH / B) + 1 + 4.
- Total number of images for the project = NFL x NIM.
Figure 4.14 is the same as Figure 4.13 with added blue circles that represent photo centers of the designed images. The circles are only given to one flight line, and I will leave it to your imagination to fill all the flight lines with such circles.
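The flight line and image counts described above can be sketched as two small helper functions (names and the example numbers are illustrative); note that both results are rounded up, as the text instructs:

```python
import math

def flight_lines(project_width_ft, image_width_ft, sidelap_pct):
    """NFL: spacing SP = W * (100 - sidelap)/100, then
    NFL = project width / SP + 1, rounded up."""
    sp = image_width_ft * (100 - sidelap_pct) / 100
    return math.ceil(project_width_ft / sp + 1)

def images_per_line(project_length_ft, image_height_ft, endlap_pct):
    """NIM: airbase B = L * (100 - endlap)/100, then
    NIM = project length / B + 1 + 4 (the 4 extra frames cover
    line entry and exit), rounded up."""
    b = image_height_ft * (100 - endlap_pct) / 100
    return math.ceil(project_length_ft / b + 1 + 4)

# Illustrative numbers: 68,640 ft wide project, 12,000 ft image, 30% side lap
print(flight_lines(68640, 12000, 30))      # -> 10
# 105,600 ft long project, 7,000 ft image, 60% end lap
print(images_per_line(105600, 7000, 60))   # -> 43
```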

Flight Altitude Computations
Flying altitude is the altitude above a stated datum at which the UAS flies during data acquisition. The two main datums used are the average (mean) ground elevation and mean sea level. Figure 4.15 illustrates the relationship between the aircraft and the datums and how the two systems relate to each other. In Figure 4.15, the aircraft is flying at 3,000 feet above the average (mean) ground elevation, which is represented by the blue horizontal line and is itself situated at 600 feet above mean sea level. Therefore, the flying altitude can be expressed in two ways:
- if we want to use the terrain as a reference, we will express it as flying altitude = 3,000 feet above mean terrain, or AMT;
- if we use the sea level, we will express it as Flying Altitude = 3,600 feet above mean sea level (ASL or AMSL).

We now need to determine at what altitude the project should be flown. To do so, we go back to the camera's internal geometry and scale, as we discussed in section 4.3. Assume that the imagery is to be acquired with a camera with a lens focal length of f and with a CCD size of b. We also know in advance what the imagery ground resolution, or GSD, should be. The flying altitude will be computed as follows:
b / GSD = f / H
From which, H can be determined:
H = (f / b) × GSD
Here, we need to make sure that both f and b are converted to the same linear unit, in which case the resulting altitude will be in the same linear unit as the GSD. If we assume, for example, values consistent with the result below: f = 100 mm, b = 10 um (0.010 mm), and GSD = 15 cm (0.15 m):
The flying altitude will be:
H = (100 mm / 0.010 mm) × 0.15 m = 10,000 × 0.15 m = 1,500 m
The flying height is 1,500 meters above ground level.
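The altitude formula H = (f / b) × GSD can be checked numerically; the values below are illustrative assumptions chosen to be consistent with the 1,500 m result stated above:

```python
def flying_altitude(focal_length_mm, ccd_size_mm, gsd):
    """H = (f / b) * GSD. With f and b in the same unit (mm here),
    H comes out in the same linear unit as the GSD."""
    return (focal_length_mm / ccd_size_mm) * gsd

# Assumed values: f = 100 mm, b = 10 um (0.010 mm), GSD = 0.15 m
print(flying_altitude(100, 0.010, 0.15))  # -> 1500.0 (meters above ground)
```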
Aircraft Speed and Image Collection
Controlling the aircraft speed is important for maintaining the forward (end) lap expected for the imagery. Fly the aircraft too fast, and you end up with less forward lap than anticipated; fly it too slowly, and you get too much overlap between successive images. Both situations are harmful to the anticipated products and/or the project budget: too little overlap reduces the capability of using the imagery for stereo viewing and processing, while too much overlap results in many unnecessary images that may affect the project budget negatively. In the previous subsections, we computed the airbase, or the distance between two successive images along one flight line, that satisfies the amount of end lap necessary for the project. Computing the time between exposures is a simple matter once the airbase is determined and the aircraft speed is decided upon.
Computing the time between two consecutive images
When the camera exposes an image, we need the aircraft to move a distance equal to the airbase before it exposes the next image. If the aircraft ground speed is v, the time t between two consecutive images is calculated from the following equation: t = B / v
For example, if we computed the airbase to be 1,000 ft and we used an aircraft with a speed of 150 knots (150 × 1.68781 ≈ 253.2 ft/sec), the time between exposures is equal to: t = 1,000 / 253.2 ≈ 3.95 seconds
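The exposure-interval arithmetic, with the knots-to-ft/sec conversion made explicit (1 knot ≈ 1.68781 ft/sec; the function name is illustrative):

```python
FT_PER_SEC_PER_KNOT = 1.68781  # 1 knot = 1.68781 ft/sec

def time_between_exposures(airbase_ft, speed_knots):
    """t = B / v, with the ground speed converted from knots to ft/sec."""
    return airbase_ft / (speed_knots * FT_PER_SEC_PER_KNOT)

# Example above: 1,000 ft airbase at 150 knots
print(round(time_between_exposures(1000, 150), 2))  # -> 3.95 seconds
```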
Waypoints
In the navigation world, waypoints are defined as “sets of coordinates that identify a point in physical space.” Close to this definition is the one used by mapping professionals, which involves using sets of coordinates to locate the beginning point and the end point of each flight line. Waypoints are important for the pilot and camera operator in executing the flight plan. In manned aircraft imagery acquisition, waypoints are usually located a couple of miles outside the project boundary on both sides of the flight line (i.e., a couple of miles before approaching the project area and a couple of miles after exiting it); for UAS operations, it would be a couple of hundred meters before approaching the project area and a couple of hundred meters after exiting it. The pilot uses waypoints to align the aircraft with the flight line before entering the project area. In UAS operation, a "waypoint" marks the beginning or the end of a flight line, where the UAS either positions itself before starting to take pictures or ends taking pictures on a given flight line.
Example of Flight Plan Design and Layout
A project area is 20 miles long in the east-west direction and 13 miles in the north-south direction. The client asked for natural color (3 bands) vertical digital aerial imagery with a pixel resolution or GSD of 1 ft using a frame-based digital camera with a rectangular CCD array of 12,000 pixels across the flight direction (W) and 7,000 pixels along the flight direction (L) and a lens focal length of 100 mm. The array contains square CCDs with a dimension of 10 microns. The end lap and side lap are to be 60% and 30%, respectively. The imagery should be delivered in TIFF file format with 8 bits (1 byte) per band or 24 bits per color for three bands (RGB). Calculate:
- the number of flight lines necessary to cover the project area if the flight direction was parallel to the east-west boundary of the project. Assume that the first flight line falls right on the southern boundary of the project;
- the total number of digital photos (frames);
- the ground coverage of each image in acres;
- the storage requirements in gigabytes aboard the aircraft required for storing the imagery;
- the flying altitude;
- the time between two consecutive images if the aircraft speed was 150 knots.
Solution:
Looking into the project size (20 × 13 miles) and the one-foot GSD requirements, a mission planner should realize right away that the image acquisition task for such a project size and specifications can only be achieved using a manned aircraft.
The camera should be oriented so the longer dimension of the CCD array is perpendicular to the flight direction (see Figure 4.12).
Number of flight lines necessary to cover the project area:
Line spacing or distance between flight lines (SP) = 12,000 ft × (100 − 30)/100 = 8,400 ft
Number of flight lines (NFL) = (13 miles × 5,280 ft/mile) / 8,400 ft + 1 = 68,640 / 8,400 + 1 = 9.2
NFL = 10 (with rounding up)
Total number of digital photos (frames):
Airbase or distance between two consecutive images (B) = 7,000 ft × (100 − 60)/100 = 2,800 ft
Number of images per flight line (NIM) = (20 miles × 5,280 ft/mile) / 2,800 ft + 1 + 4 = 105,600 / 2,800 + 5 = 42.7, rounded up to 43
Total number of images for the project = NFL × NIM = 10 × 43 = 430 images
Ground coverage of each image in acres:
Ground coverage of each image = 12,000 ft × 7,000 ft = 84,000,000 sq ft = 84,000,000 / 43,560 ≈ 1,928 acres
The storage requirement for the RGB (color) images:
Storage requirement for 1 band = 12,000 × 7,000 pixels × 1 byte = 84 MB per image
Each pixel needs one byte per band; therefore, each of the three (R, G, B) bands needs to be accounted for.
Total storage requirement = 84 MB × 3 bands × 430 images ≈ 108 GB
Flying Altitude (H) = (f / b) × GSD = (100 mm / 0.010 mm) × 1 ft = 10,000 ft above mean terrain
Time between acquiring images: t = B / v = 2,800 ft / (150 knots × 1.68781 ft/sec) ≈ 11.1 seconds
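The whole worked example can be verified end to end with a short script; the variable names are illustrative, and storage assumes 1 GB = 10^9 bytes:

```python
import math

MI_TO_FT = 5280
SQFT_PER_ACRE = 43560
FT_PER_SEC_PER_KNOT = 1.68781

# Project and camera parameters from the example
length_ft = 20 * MI_TO_FT      # 105,600 ft east-west
width_ft = 13 * MI_TO_FT       # 68,640 ft north-south
w_px, l_px = 12000, 7000       # across / along the flight direction
gsd_ft = 1.0
focal_mm, ccd_mm = 100.0, 0.010
endlap, sidelap = 60, 30

w_cov = w_px * gsd_ft          # 12,000 ft coverage across the flight line
l_cov = l_px * gsd_ft          # 7,000 ft coverage along the flight line

sp = w_cov * (100 - sidelap) / 100        # 8,400 ft line spacing
nfl = math.ceil(width_ft / sp + 1)        # 10 flight lines
b = l_cov * (100 - endlap) / 100          # 2,800 ft airbase
nim = math.ceil(length_ft / b + 1 + 4)    # 43 images per line
total = nfl * nim                         # 430 images

acres = w_cov * l_cov / SQFT_PER_ACRE           # ~1,928 acres per image
storage_gb = w_px * l_px * 3 * total / 1e9      # 3 bytes/pixel (RGB)
h_ft = focal_mm / ccd_mm * gsd_ft               # 10,000 ft above mean terrain
t_s = b / (150 * FT_PER_SEC_PER_KNOT)           # ~11.1 s between exposures

print(nfl, nim, total, round(acres, 1), round(storage_gb, 1),
      h_ft, round(t_s, 1))
```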
Cost estimation and delivery schedule
Past experience with projects of a similar nature is essential in estimating cost and developing a delivery schedule. In estimating cost, the following main categories of efforts and materials are considered:
- labor
- materials
- overhead
- profit
Once quantities are estimated as illustrated in the above steps, hours for each phase are established. Depending on the project deliverables requirements, the following labor items are considered when estimating costs:
- aerial photography
- ground control
- aerial triangulation
- stereo-plotting (# of models = # photos -1)
- map editing
- ortho production
- LiDAR data cleaning
The table in Figure 4.16 provides an idea of going market rates for geospatial products that can be used as guidelines when pricing a mapping project using manned aircraft operation with a metric digital camera and lidar. The industry needs to develop a comparable table based on unmanned operations. There is no good pricing model established for UAS operation, as standards and product quality vary widely depending on who offers such services and whether they fall strictly under the "Professional Services" designation.
| Product | GSD ft | Price per sq mile | Comments |
|---|---|---|---|
| Ortho | 0.5 | $150-$200 | Based on large projects |
| Ortho | 1.0 | $80-$100 | Based on large projects |
| Ortho | 2.0 | $30-$60 | Based on large projects |
| lidar | 3.2 | $100-$500 | Depends on accuracy, terrain, and required details |
Delivery Schedule
After the project hours are estimated, each phase of the project may be scheduled based on the following:
- number of instruments or workstations available
- number of trained personnel available
- amount of other work in progress and its status
- urgency of the project to the client
The schedule will also consider the constraints on the window of opportunity due to weather conditions. Figure 4.17 illustrates the number of days, per state/region, available annually for aerial imaging campaigns. Areas like the state of Maine have only 30 cloudless days per year that are suitable for aerial imaging activities.

To Read
Chapter 18 of Elements of Photogrammetry with Applications in GIS, 4th edition
To Do
For practice, develop two flight plans for your project, one by using manual computations and formulas as described in this section and one by using "Mission Planner" software. Compare the two.
Sensors Calibration and Boresighting
In this section, we will discuss the topics of camera calibration and sensor boresighting.
Camera Calibration
Most existing UASs that are dedicated to photogrammetric imaging carry on board less expensive cameras that we call nonmetric cameras. Nonmetric cameras are cameras with variable interior geometry (i.e., unknown focal length) and with relatively large lens distortion. In order to conduct photogrammetric mapping from the resulting imagery from such cameras, we need to determine to a known accuracy all interior camera parameters such as the focal length and the coordinates of the principal point, and to model the lens distortion.
The principal point of a camera is the point where the lines from opposite corners of the CCD array, or the lines connecting the opposite mid-way points of the CCD array sides, intersect (Figure 4.18). However, when the lens is fitted on the camera body, it is practically impossible to perfectly align the center of the lens with the principal point described above, resulting in offset distances xp and yp as illustrated in Figure 4.18. Those two values are determined in the process of camera calibration and need to be represented in the photogrammetric mathematical model during computations.
Mapping film camera calibration was usually performed in special laboratories dedicated to this task, such as the USGS calibration lab for film cameras, which was shut down permanently on April 1, 2017 after decades of service to the mapping community. However, with the advancements in the analytical computational models of photogrammetry, we can now determine the camera parameters analytically through a process called camera self-calibration from within the aerial triangulation process. Most UAS data processing software, such as the software used in this course, supports camera self-calibration.

Sensors Boresighting
The term “boresighting” is usually used to describe the process of determining the differences between the rotational axes of the sensor (such as a camera) and the rotational axes of the Inertial Measurement Unit (IMU), which is usually bolted to the camera body. The IMU is a device containing gyros and accelerometers, used in photogrammetry and lidar to sense and measure sensor rotations and accelerations. In photogrammetry, where the IMU is mounted on an imaging camera, the boresight parameters are determined by flying over a well-controlled site (a site with accurate ground controls) and then conducting aerial triangulation on the resulting imagery.
The aerial triangulation process computes the six exterior orientation parameters (X, Y, Z, omega, phi, kappa), while the IMU measures the three orientation parameters: roll, pitch, and heading (or yaw). By comparing the two sets of camera orientation angles, as computed by the aerial triangulation and as measured by the IMU, one can establish the differences in the rotations of the camera with reference to the inertial system (from the IMU). These differences (or offset values) are then used to correct all future IMU-derived orientations, converting the rotation angles from the inertial to the photogrammetric system so they can be utilized in the mapping process.
A similar process is followed to determine the offset values for the IMU used in a lidar system. For lidar offset determination, no aerial triangulation is used, as lidar follows different processing steps. To determine the boresight offset values in lidar, the lidar has to be flown in a certain configuration over a well-controlled site. Figure 4.19 represents an ideal design for lidar boresight determination. In the figure, there are two lines flown in the east-west direction (one flight line flown due east and the other flown in the opposite direction, due west) from a certain altitude, and two flight lines flown in the perpendicular direction (north-south) from an altitude that is nearly double the altitude of the east-west flight lines.

To Read
- Sections 3-9, 3-10, 3-11, 3-12 of Chapter 3 and sections 11-12 of Chapter 12 of Elements of Photogrammetry with Applications in GIS, 4th edition
- Chapter 3 of the textbook: Fundamentals of capturing and processing drone imagery and data
- In-Situ Camera and Boresight Calibration with Lidar Data
- USGS/OSU Progress with Digital Camera in Situ Calibration Methods
Basic Considerations for Selecting UAS
In this section, you will understand the requirements for selecting a UAS. Selecting a UAS depends on many factors that are closely related to the intended use of the UAS. Such use requirements will determine the size and weight of the UAS, and its endurance and range of flight, among other factors. In the following sections, we will briefly discuss each of these factors.
Size and weight
Size and weight play a great role in determining payload size and weight and in limiting the UAS's range and endurance. Large UASs have the capability of carrying a larger and heavier payload, including the power source. The larger the UAS, the more fuel or battery power it can carry on board; the more power it can carry, the better its range and endurance.
Range and Endurance
The range of a UAS is an important performance characteristic. It depends on a number of basic aircraft parameters and the weight of the payload. Maximum UAS range and endurance can be achieved with high propeller efficiency, low fuel consumption, and large onboard fuel (or battery) capacity. A project that requires long hours in the air will need a larger UAS. However, most UASs employed for geospatial mapping purposes nowadays have an endurance of 90 minutes and a maximum range of around 50 miles.
Stability
In physical mechanics, stability refers to the tendency of an object to stay in its present state of rest or motion despite small disturbances. An aircraft must be stable in order to remain in flight. The forces acting on the aircraft, such as thrust, weight, and aerodynamic forces, have to act in certain directions in order to restore the aircraft to its original equilibrium position after it has been disturbed by wind or other forces. An aircraft has three angular degrees of freedom: rotation around the X-axis (roll), rotation around the Y-axis (pitch), and rotation around the axis vertical to the ground (yaw). The aircraft has to remain stable around each of these axes. The most critical rotation is pitch, and stability about it is called longitudinal stability. Some instability can be tolerated in roll and yaw.
Stability is essential for aerial data such as imagery acquisition in order to achieve gap-free imaging results. The use of a gyro-stabilized mount for the camera or the imaging sensor is preferred for mapping missions, as it results in uniform coverage free of gaps.
Cost
UAS costs play a great role in the decision to acquire one. The price of a large UAS sometimes exceeds the price of a typical manned aircraft, such as various models of Cessnas, used for aerial imaging. However, the cost of a UAS is justified by the type of jobs expected for it. Smaller UAS-based aerial imaging jobs are only justified through the use of a small UAS that costs under $100,000. It is worth mentioning here that, due to strict FAA regulations on flying UAS, there are currently no large jobs for UAS within the geospatial mapping community. No one can commercially utilize UASs for money-making projects; therefore, only smaller UASs are utilized by the mapping community. Once the FAA eases the regulations, we should expect larger demand for medium or large UASs.
Payload Capacity
The maximum weight that a UAS can carry on board also plays an important role in UAS selection. Different applications require different sensors and therefore different payload capacities. Current UASs used by the mapping community can carry payloads varying in weight from a few pounds to 100 lbs. The payload capacity directly affects the cost of the UAS, as it limits the range and endurance of the UAS. UASs with longer range and endurance cost more than those that fly a maximum distance of 35 miles for a period of 60 minutes.
To Read
Read the article "Five Things to Consider when Adopting Drones for Your Business" by Drone Analyst.
To Do
Practice with the use of Pix4D software to process the sample data.
UAS Market Survey
In this section, you will gain an understanding of the different brands and makers of UAVs, payload sensors, and processing software.
Market survey of the Air Vehicle (UAV)
Large UASs, used mainly for defense purposes, have been around for a long time and have sophisticated technologies built into them. Examples of manufacturers of such UASs are AAI Corporation, AeroVironment, Aurora Flight Sciences, BAE Systems, Boeing, Elbit Systems, General Atomics Aeronautical Systems, Inc., Israel Aerospace Industries, Northrop Grumman, Raytheon, Rotax, Sagem, Selex Galileo, and many others. Within the last decade, many startup companies started manufacturing low-cost UASs that are mainly used for civilian purposes. Examples of those manufacturers are Trimble, Altavian, Sensefly Ltd, American Aerospace Advisors, Prioria, Uconsystem, Idetec, and many more.
The following four resources contain good information on existing systems and manufacturers:
- Unmanned Aerial System (UAS) Survey
- GIM International Volume 28 Spring 2014
- UAS Suppliers
- 24th annual edition of Shephard Media’s "Unmanned Vehicles Handbook"
Market survey of Payload Sensors
The sensors required for UASs utilized for mapping purposes are mainly limited to cameras (visible, near-infrared, and thermal infrared). The second resource provided in the previous section offers a list of sensor manufacturers used for UAS payloads. UAS payloads used by the mapping community mainly include imaging cameras. Such cameras offer a variety of spectral bands such as visible (red, green, blue), near-infrared (NIR), and thermal infrared. There is only one lidar system developed mainly for UAS, the VUX-1 manufactured by Riegl, which was described in Lesson 2. The most obvious providers of digital cameras (without endorsing any of them) that are small enough to fit within UAS payloads are the following:
- Phase One, with their multiple models of aerial cameras.
- Imperx, with their latest model of Bobcat cameras
- Nikon, with their multiple models of cameras
- MicaSense, with their multiple models of multi-spectral cameras, or Parrot, for their Sequoia camera
Market survey of Processing Software
For image-based mapping products generation, users will need efficient photogrammetric processing software. Such software should be capable of performing the following operations, among others:
- organizing the input imagery, camera calibration reports, GPS-derived camera position, IMU-derived camera orientation angles, and ground controls data in a simple database;
- having a user-friendly graphical user interface (GUI);
- having good data viewers, e.g., for orthos and DSMs;
- handling tens of thousands of images per project in TIFF or JPEG formats;
- image coverage verification through rapid data processing mode;
- performing automatic aerial triangulation processing using simultaneous bundle block adjustment with viewing and manual editing capability;
- accepting GPS-derived camera position and IMU-derived camera orientation;
- camera self-calibration;
- modeling GPS shift and drift;
- producing quality control reports;
- exporting exterior orientation parameters for photogrammetric work station;
- performing ortho rectification;
- automatic DSM generation through auto-correlation;
- performing image mosaic and capability to edit mosaic lines;
- exporting ortho tiles according to a user-defined layout in shapefile format;
- performing color balancing and radiometric enhancement;
- distributed processing (parallel processing) using a computing farm;
- batch processing or scripting.
Among the most obvious data processing software packages on the market that are optimized for UAS data processing (without endorsing any of them) are the following:
- Agisoft Metashape
- Pix4DMapper
- Menci APS
- Correlator3D™ by SimActive
- Trimble Inpho UASMaster
Each of these five software packages meets most of the capabilities listed above. However, some of them may be more suitable than others, depending on the situation and the nature of the project.
To Do
- Practice more with the use of Pix4D software to process the sample data. Produce an ortho photo and a DSM, and send me screenshots of the products.
Summary and Final Tasks
Summary
Congratulations! You have just finished Lesson 4, UAS Mission Planning and Control. I hope that you appreciate the importance of this lesson material in relation to the Concept of Operation for any UAS. UAS projects based on poor planning mean nothing but guaranteed failure and/or poor-quality derived products. The computations may seem complicated, but I tried to walk you through the different steps in detail. However, if you feel overwhelmed by the design concepts, please do not hesitate to write to me.
Final Tasks
| 1 | Study Lesson 4 materials and the textbook chapters assigned to the lesson |
|---|---|
| 2 | Complete the Lesson 4 Quiz. |
| 3 | Complete your discussions for the assignment on "SWOT Analysis" |
| 4 | Continue working on the "CONOP and Risk Assessment" report assignment |
| 5 | Practice Mission Planner software |
| 6 | Submit your Pix4D processing materials for exercise 1 |
| 7 | Attend the weekly call and the Mission Planner software training on Thursday evening at 8:00pm ET |
Lesson 5: Geospatial Mapping and Maps Production
Lesson 5 Introduction
Welcome to Lesson 5! In this lesson, you will become familiar with the photogrammetric process, the processing systems, and data generation from an image-based UAS. Most applications of the UAS today include one form or another of a camera system (video or still camera), from which different interpretations and therefore different applications evolve. You will also develop an understanding of processes such as aerial triangulation and ortho rectification, which are the backbone of any image processing facility. The photogrammetric textbook Elements of Photogrammetry with Applications in GIS will be your companion, beside the lesson notes, in understanding the topic.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- understand the concept of sensor and product geolocation;
- understand the concept of direct geo-referencing;
- understand the concept of aerial triangulation;
- outline complete UAS data processing workflow;
- distinguish between different products obtainable from different UAS payload sensors.
Lesson Readings
Course Textbooks
- Chapters 1, 11, 16, and 17 of the textbook: Elements of Photogrammetry with Applications in GIS, 4th edition
- Chapters 2 and 9 of the textbook: Fundamentals of capturing and processing drone imagery and data
- Chapter 2 of the textbook: Unmanned vehicle systems for geomatics: towards robotic mapping
Lesson Activities
- Study lesson 5 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Start your first post for the discussion on "Human Elements of UAS."
- Submit your "CONOP and Risk Assessment" assignment report
- Complete quiz 5
- Start UAS Data Processing Using Pix4D for Exercise 2
- Submit final project idea
- Attend the weekly call and Exercise 2 training on Thursday evening at 8:00 pm ET
The Photogrammetric Process
In this section, you will learn about the photogrammetric process and the different steps imagery goes through in order to develop an ortho photo or a digital elevation model.
Figure 7.1 illustrates the different steps of processing that imagery from a UAS is subject to in order to produce a mapping product such as an ortho photo or digital elevation model.

Figure 7.1 Process flow of the photogrammetric processing
As we learned in Lesson 4, the process starts with the mission planning process. Once all the parameters and requirements are defined for the mission, a flight plan is developed and aerial imagery is acquired according to the project specifications. The resulting imagery will be reviewed to assure the expected quality. Following the image QC, the field work will be conducted to survey the necessary ground controls. The ground controls survey can be conducted either before the imagery acquisition, or after it is completed.
Once the imagery acquisition and the ground control survey are completed, work can begin on the process of aerial triangulation. Aerial triangulation, as will be described in section 7.2, is performed to determine the position and the orientation of the camera at the moment of exposure of each image. It involves a few processing concepts, such as interior and exterior orientation, relative orientation, and absolute orientation. Aerial triangulation is achieved through processing software built on rigorous mathematical models using least squares. Once the aerial triangulation is completed, the imagery is ready for further processing steps such as ortho rectification and digital elevation modeling.
To Read
- Chapter 1 of Elements of Photogrammetry with Applications in GIS, 4th edition.
Imagery Geo-location
In this section, you will learn about the concept of geo-referencing imagery; without it, no further photogrammetric processing of the imagery can take place.
In order to utilize the photogrammetric mathematical model, i.e., the collinearity condition, for the production of any mapping products, the following information needs to be made available:
- The exterior orientation parameters for every image: Six parameters at the moment of image exposure: the camera attitude or orientation, represented by the three rotational angles omega, phi, and kappa; and the camera position, represented by the three coordinates Easting, Northing, and Elevation.
- The camera interior geometry parameters: The calibrated lens focal length, the principal point coordinates, and the lens distortion, as discussed in Lesson 6.
- The size of the CCD array: The number of pixels contained in the CCD array along the width and the height of the array.
- The physical size of the CCD pixels: Usually provided in microns, such as 14 μm (1 mm is equal to 1,000 μm).
- Ground Controls: A ground control is a feature in the imagery with known accurately surveyed coordinates. Depending on the required accuracy of the final products, ground controls can be omitted in some situations.
In this section, we will focus on the process of determining the six exterior orientation parameters. The camera position can be measured accurately using the airborne GPS technique with a GPS antenna on board the UAS. The three camera position coordinates can also be computed through the process of aerial triangulation, as we will discuss soon. However, there are two methods for determining the camera attitude or orientation: the aerial triangulation process and direct measurement from the IMU, as we discussed in Lesson 6.
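The interior geometry parameters listed above (pixel size, focal length) combine with the flying height to set the ground sample distance (GSD) and the image footprint through simple similar-triangle scale. A minimal sketch; the camera values below are illustrative, not from the course:

```python
def ground_sample_distance(pixel_size_um, focal_length_mm, flying_height_m):
    """GSD from similar triangles: image scale = flying height / focal length,
    so GSD = pixel size * (flying height / focal length)."""
    return (pixel_size_um * 1e-6) * flying_height_m / (focal_length_mm * 1e-3)

# Illustrative values: 4.6 um pixels, 15 mm lens, flying 120 m above ground.
gsd = ground_sample_distance(4.6, 15.0, 120.0)   # metres per pixel
footprint_w = gsd * 4608                          # assuming a 4608-pixel-wide CCD
print(f"GSD = {gsd * 100:.1f} cm, image footprint width = {footprint_w:.0f} m")
```

The same relation run in reverse is how a flying height is chosen to meet a required product GSD.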
Aerial Triangulation and Bundle Block Adjustment
Aerial triangulation is usually performed on a photogrammetric block (Figure 7.2), which consists of all the imagery acquired over the project area. Figure 7.2 illustrates a photogrammetric block of imagery consisting of three strips, each of which has multiple overlapping images. Also shown are the different types of image overlaps. The top and middle strips contain images with 60% forward lap, while the bottom strip contains imagery with 80% forward lap. You may also notice in the figure that the middle and the bottom strips overlap by 30%. Such overlap is called side lap.

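The overlap percentages in Figure 7.2 translate directly into the spacing between consecutive exposures (the air base) and between adjacent flight lines. A small sketch under a flat-terrain assumption, with an illustrative 100 m square image footprint:

```python
def exposure_spacing(footprint_along_m, forward_lap):
    """Air base B between consecutive exposures for a given forward lap fraction."""
    return footprint_along_m * (1.0 - forward_lap)

def flight_line_spacing(footprint_across_m, side_lap):
    """Distance between adjacent strips for a given side lap fraction."""
    return footprint_across_m * (1.0 - side_lap)

# Illustrative 100 m x 100 m footprint, using the overlaps shown in Figure 7.2:
b60 = exposure_spacing(100.0, 0.60)      # 60% forward lap -> 40 m air base
b80 = exposure_spacing(100.0, 0.80)      # 80% forward lap -> 20 m air base
w30 = flight_line_spacing(100.0, 0.30)   # 30% side lap -> 70 m line spacing
print(b60, b80, w30)
```

Higher forward lap (shorter air base) means more images per strip, which is why the 80% strip in Figure 7.2 contains more exposures than the 60% strips.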
In the last section (the photogrammetric process), we mentioned a few terms related to aerial triangulation. We will briefly describe these terms in the following sub-sections:
Relative Orientation
Relative Orientation is the process of orienting images relative to one another (i.e., it recreates the “relative” position and attitude of the images at the instants of exposure), as illustrated below. Figure 7.3 shows four images that are connected to each other in space through the aircraft/GPS trajectory but are not necessarily connected to the ground datum (i.e., they are floating in space).

Relative orientation is an important process that must be performed before we scale the imagery to the ground datum through the process of absolute orientation, which will be discussed in the next section. To form a cohesive block, all images in the block should be relatively oriented with respect to each other through the process of relative orientation.
Absolute Orientation
Absolute orientation is the process of leveling and scaling the stereo model (formed from two images) with respect to a reference plane or datum using ground control points, as shown in Figure 7.4. Figure 7.4 represents the same four images as Figure 7.3, but this time the block is tied to the ground datum through the use of seven ground control points (represented by the black stars).

Without performing the absolute orientation process, the generated map would not be associated with a specific location in space. Generating maps that carry geo-location information, such as a datum and coordinate system, can only happen after the process of absolute orientation is performed following relative orientation.
Exterior Orientation
Exterior orientation of a photograph defines its position and orientation in the object space. There are six elements of exterior orientation, X, Y, and Z of the exposure station position, and the three angles that define the angular orientation: ω, φ, and κ. The six elements of exterior orientation are not known and must be computed through a process called space resection within the aerial triangulation process. Here is the definition of the three orientation angles illustrated in Figure 7.5:
- Omega (ω): Rotation about the x axis. It is equivalent to the angle Roll of the navigation system.
- Phi (φ): Rotation about the y axis. It is equivalent to the angle Pitch of the navigation system.
- Kappa (κ): Rotation about the z axis. It is equivalent to the angle Yaw of the navigation system.

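The three angles above combine into a single 3×3 orientation matrix whose elements appear later as the m terms in the collinearity equations. Rotation order and sign conventions vary between software packages; the sketch below uses the common sequential omega-phi-kappa convention as one illustrative choice:

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Photo orientation matrix M = M_kappa @ M_phi @ M_omega (angles in radians),
    one common photogrammetric convention."""
    Mw = np.array([[1, 0, 0],
                   [0, np.cos(omega), np.sin(omega)],
                   [0, -np.sin(omega), np.cos(omega)]])   # rotation about x (roll)
    Mp = np.array([[np.cos(phi), 0, -np.sin(phi)],
                   [0, 1, 0],
                   [np.sin(phi), 0, np.cos(phi)]])        # rotation about y (pitch)
    Mk = np.array([[np.cos(kappa), np.sin(kappa), 0],
                   [-np.sin(kappa), np.cos(kappa), 0],
                   [0, 0, 1]])                            # rotation about z (yaw)
    return Mk @ Mp @ Mw

M = rotation_matrix(0.0, 0.0, np.pi / 2)   # pure 90-degree kappa (yaw) rotation
print(np.round(M, 3))
```

Because each factor is a rotation, M is always orthonormal (M times its transpose gives the identity), a useful sanity check on any implementation.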
Knowing the six exterior orientation parameters for an image is necessary for any photogrammetric processing aimed at creating products from such an image. Whether you perform map compilation on a stereo plotter or generate an ortho image, the six exterior orientation parameters need to be computed before you start the production process.
Space Resection
Space resection is the process of determining ray intersections in space to derive the camera position. See Figure 7.6. The method of space resection is a purely numerical method using the collinearity equations to simultaneously yield all six elements of exterior orientation (X, Y, Z, omega, phi, and kappa). Once these elements are known, a stereo plotter can measure the photo coordinates (x, y) of any point in a photo, and the ground coordinates can be computed. Ortho rectification software also utilizes space resection for ortho-rectifying an image. Figure 7.6 illustrates six images, each with rays from the ground entering the camera through the lens. The intersection of the rays entering the camera at point "O" represents the photo center location, which is important for the determination of the exterior orientation parameters described earlier.

Aerial triangulation
Aerial triangulation can be defined as the process of densification of a sparsely distributed horizontal and vertical control network through:
- measurements performed on overlapping aerial photographs,
- known ground control points coordinates on the ground, and
- mathematical modeling and solution.
A conventional (film based) aerial triangulation process consists of the following steps:
- preparation
- point marking (for tie points and pass points marking)
- measurement
- computation
Data Preparation: Using a stereoscope, three points are selected down the center of each photo, approximately 1” from the top and bottom and at the center. These points are also marked on every overlapping photo on which they occur. They are often called “pass points” along strips and “tie points” between strips. See Figure 7.7. Ideally, pass points are selected in flat areas of high contrast that are free of obstructions and shadows.
Figure 7.7 represents three overlapping photos that are used to extract pass points between them. Notice that the three middle points for the middle photo (a, b, c) were located and marked on the same locations in the overlapping right and left image. This process is called point marking.

Point Marking: A good point marking device is characterized by:
- precise optics for stereo viewing;
- variable zoom - 6X to 25X;
- laser beams, hot needles, or mechanical or electric drills that remove emulsion from the diapositive;
- the ability to create a very precise circular mark, typically from 40 to 80 microns in diameter.
One of the earliest commercially successful point marking devices was the P.U.G., manufactured by Wild Heerbrugg Instruments, Inc. See Figure 7.8. Over time, pass points marked on diapositives became known simply as pug points.

Point Measurement: A skilled technician using analytical stereo plotting instruments records the location of each previously marked pass point and tie point on each photograph.
Numerical Computation of Aerial Triangulation: Here is a summary of the steps taken within the processing software:
- Processing numerical observations of individual photographs to build a cohesive block.
- Forming individual photos into strips by successive, relative orientations, using the common primary pass points between overlapping photos.
- Computing horizontal and vertical coordinates for each strip.
- Converting strip coordinates to ground coordinates using the ground control contained within a given strip.
- Applying simultaneous polynomial equations (horizontal and vertical) to produce final adjusted values for all points.
- Calculating exterior orientation elements for each photo to be used as input to a bundle adjustment program.
Unlike the aerial triangulation of the past, which was performed on film-based imagery using optical-mechanical instruments, today aerial triangulation is performed on digital imagery using a complete softcopy approach called softcopy aerial triangulation. In softcopy aerial triangulation, all the manual work of point marking and measurement is left to the automation of the software. It is more efficient and more accurate.
Mathematical Model for Aerial Triangulation
The backbone of the computational model in photogrammetry is a pair of equations called the collinearity equations, which express the collinearity condition: the camera perspective center, an image point, and the corresponding ground point all lie on a single straight line. The two collinearity equations are:

$$x = x_0 - f\,\frac{m_{11}(X - X_c) + m_{12}(Y - Y_c) + m_{13}(Z - Z_c)}{m_{31}(X - X_c) + m_{32}(Y - Y_c) + m_{33}(Z - Z_c)}$$

$$y = y_0 - f\,\frac{m_{21}(X - X_c) + m_{22}(Y - Y_c) + m_{23}(Z - Z_c)}{m_{31}(X - X_c) + m_{32}(Y - Y_c) + m_{33}(Z - Z_c)}$$

Where,
Xc, Yc, Zc = camera perspective center position
X, Y, Z = ground point position
x, y = point position on the image
mij = elements of the photo orientation matrix
f = camera lens focal length
x0, y0 = principal point of autocollimation
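The collinearity equations are straightforward to evaluate once the exterior orientation is known. A minimal sketch that projects a ground point into photo coordinates; the camera values are illustrative, and a vertical photo (identity orientation matrix) is assumed for simplicity:

```python
import numpy as np

def collinearity_project(ground_pt, cam_pos, M, f, x0=0.0, y0=0.0):
    """Project a ground point (X, Y, Z) to photo coordinates (x, y) using the
    collinearity equations. M is the 3x3 photo orientation matrix, f the
    focal length; all linear units must match."""
    d = np.asarray(ground_pt, float) - np.asarray(cam_pos, float)
    u, v, w = M @ d              # rotate the ground vector into the image frame
    x = x0 - f * u / w
    y = y0 - f * v / w
    return x, y

# Vertical photo (M = identity), camera 100 m above the ground, 15 mm lens,
# ground point 10 m east of the nadir point:
x, y = collinearity_project([10.0, 0.0, 0.0], [0.0, 0.0, 100.0],
                            np.eye(3), f=0.015)
print(x, y)   # x is 1.5 mm off the principal point; y stays on the axis
```

Aerial triangulation software runs these same equations in reverse: given many measured (x, y) observations, a least squares bundle adjustment solves for the unknown orientation parameters and ground coordinates.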
Direct Geo-referencing
In the last two decades, navigation technologies have advanced to the point of enabling manufacturers of Inertial Navigation Systems (INS), usually used for missile and submarine navigation, to produce an Inertial Measurement Unit (IMU) that accurately measures the orientation of airborne sensors such as cameras and LiDAR. The IMU, which we briefly described in Lessons 2 and 6, is used either to replace the process of aerial triangulation or to assist its solution. Most UAS, including small ones, carry on board a GPS unit and an IMU. Unfortunately, most of the miniaturized, low-cost IMUs used in UAS are not accurate enough to replace aerial triangulation; such low-accuracy IMUs are usually used to navigate the UAS but not to support the aerial triangulation. On the other hand, the GPS antenna in most UAS is of survey-grade quality and can receive signals from both GPS and GLONASS. Some UAS can receive signals from OMNISTAR with real-time corrections.
To Read
- Chapters 11 and 17 of the textbook: Elements of Photogrammetry with Applications in GIS, 4th edition
Ground Control Requirement
In this section, we will discuss a topic important to any photogrammetric work: ground controls.
A ground control, which we introduced in the last section, is a target in the project area with known coordinates (X,Y,Z). Accurate, well-placed ground controls are essential elements for any photogrammetric project utilizing aerial triangulation.
There are two standard types of ground control points (Figure 7.9), those are:
- Photo Identifiable (Photo ID): This could be any feature on the ground, such as a manhole or parking stripe (the right two images of Figure 7.9). This type of control does not need to be surveyed before the UAS flies the project, as it can be surveyed later on.
- Pre-marked (Panels): This type is generated by marking or painting certain figures or symbols on the ground before the UAS flies the project (the left two images of Figure 7.9). This type of control also does not need to be surveyed before the UAS flies the project, as it can be surveyed later on; however, if temporary markers that can be disturbed or moved are used, they should be surveyed ahead of time.
Many projects make use of one type or the other, or a combination of the two.

The leftmost image in Figure 7.9 represents a pre-marked control point set on black and white fabric, while the image next to it represents a pre-marked control point that is spray-painted on a sidewalk. The rightmost images represent different types of photo identifiable ground control points. On these images, the user can pick any visible ground feature (such as a parking stripe or the edge where the concrete meets the asphalt pavement on a bridge) to use as a control point.
There are two techniques for surveying ground control points. The most common is RTK GPS, as it is the fastest and least expensive; RTK surveys result in a horizontal accuracy of about 2 cm and a vertical accuracy of about 3 cm, and RTK is widely used for mapping projects. The second technique, which is much more expensive, is differential leveling for height determination combined with static GPS for the horizontal survey. Differential leveling results in around 1 cm vertical accuracy. Here in the United States, surveying a point using RTK GPS usually costs between $150 and $300, depending on the location and terrain; differential leveling costs around $1,000 to $2,000 per point, again depending on location and terrain. Selecting one surveying technique over another depends on the expected accuracy of the mapping products. Consult the American Society for Photogrammetry and Remote Sensing (ASPRS) Positional Accuracy Standards for Digital Geospatial Data and Chapter 9 to determine the ground control accuracy requirement based on product accuracy.
Ground control requirements vary from one project to another depending on the project specifications and its geographic extent. Projects with high geometrical accuracy requirements require more ground controls. Figure 7.10 illustrates typical distribution of ground controls in a rectangular shaped project when the aircraft does not carry on board a GPS antenna, resulting in a non-GPS supported aerial triangulation, or what is usually called “conventional aerial triangulation.”

However, most aerial triangulation today is solved with airborne GPS data. Having GPS data in the aerial triangulation process saves a tremendous number of ground controls. Figure 7.11 illustrates the low density of ground controls required for GPS-based aerial triangulation.

Despite having ground controls only at the edges of the flight lines as shown in Figure 7.11, having a few additional controls along the interior of the block (see Figure 7.12) is a wise strategy, especially when high accuracy is expected from the aerial triangulation. Savings can be made in the control survey by replacing most of the ground control points at the edges of flight lines with imagery taken along a flight line perpendicular to the project flight lines at each end of the block (see Figure 7.13). Such additional flight lines, which are perpendicular to the normal project flight lines, are called “cross flight lines.”

Adding two cross flights (strips), one at each edge of the photogrammetric block, not only saves on the number and cost of the ground control points but also strengthens the mathematical model within the bundle block adjustment computations. It helps in modeling and solving GPS and IMU problems.

To summarize the subject of ground control requirements for a block, we start with Figure 7.10, which represents the most control-consuming case: conventional aerial triangulation, where no GPS is used on the camera during imagery acquisition. Then comes the most efficient method of aerial triangulation, GPS-based aerial triangulation. Figures 7.11 through 7.13 represent different distributions of ground controls for GPS-based aerial triangulation. Each case has its strengths and weaknesses; however, the configuration in Figure 7.13 represents the most economical approach when it comes to reducing the ground control requirement.
To Read
- Chapter 16 of the textbook: Elements of Photogrammetry with Applications in GIS 4th edition
Products Generation
In this section, we will discuss products generated from an image-based UAS. Although imagery collected by UAS can be used in a variety of remote sensing applications, we will focus in this lesson on two main mapping products: the ortho photo and the digital elevation model.
Digital Ortho Photo (Ortho Map)
Digital ortho, ortho photo, orthographic image, and ortho map are different names for the same thing. An ortho photo, the term I use most of the time, is an image that has been corrected (through the process of ortho-rectification) for the effects of terrain relief and sensor tilt, converting it to a map of uniform scale. Raw images taken over variable terrain have different scales at different locations in the image: a pixel covering the ridge of a mountain covers a smaller ground spot, as the ridge is closer to the sensor (aircraft), than a pixel covering a valley.
Performing the process of ortho-rectification resamples all these pixels so that each pixel covers exactly the same ground resolution, or GSD, regardless of where it falls in the image or from which terrain it originated. In other words, ortho-rectification means reprocessing the raw digital image to eliminate the scale variation and image displacement caused by terrain relief and sensor (camera) tilt.
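The scale variation described above can be quantified: a pixel's ground footprint shrinks as the terrain beneath it rises toward the camera, which is exactly the effect ortho-rectification removes. A small illustrative calculation with assumed camera values:

```python
def pixel_footprint(pixel_size_m, focal_length_m, flight_alt_m, terrain_elev_m):
    """Ground footprint of one pixel over terrain at a given elevation:
    footprint = pixel size * (height above the terrain) / focal length."""
    return pixel_size_m * (flight_alt_m - terrain_elev_m) / focal_length_m

# Illustrative: 4.6 um pixels, 15 mm lens, flying at 500 m altitude.
valley = pixel_footprint(4.6e-6, 0.015, 500.0, 0.0)    # valley floor at 0 m
ridge = pixel_footprint(4.6e-6, 0.015, 500.0, 200.0)   # ridge top at 200 m
print(f"valley pixel: {valley * 100:.1f} cm, ridge pixel: {ridge * 100:.1f} cm")
```

The same raw image thus mixes roughly 15 cm pixels in the valley with 9 cm pixels on the ridge; resampling everything to one uniform GSD is the heart of ortho-rectification.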
Because ortho photos are geometrically corrected, they can be used as map layers in GIS for overlay, management, update, analysis, or display operations. This is a great advantage of the ortho photo as compared to the raw imagery.
The five primary ingredients for the ortho photo generation are the following:
- digital imagery;
- digital elevation model or topographic dataset;
- exterior orientation parameters from aerial triangulation or IMU;
- camera calibration report;
- photogrammetric processing software that utilizes collinearity equations.
An ortho photo produced using a digital elevation model for the bare earth (no buildings or trees in it) is usually called “ground ortho.” In ground ortho, the building lean is not removed in the process of ortho rectification, and buildings will appear to lean radially away from the center of the image, as you can see in the image of the World Trade Center in Baltimore on the left side of Figure 7.14. On the other hand, "true ortho" is an ortho where the buildings look as if they are erected straight up or as if you are looking at them from right above the roofs, as is illustrated in the right image of Figure 7.14. True ortho is very useful in urban areas, such as downtowns with tall buildings, as it reveals all the information in the streets and pathways surrounding the buildings. True ortho is computationally intensive and needs three-dimensional models of all buildings in the image, which makes it more costly than ground ortho.

It is very important to evaluate the quality of ortho-rectification, as it may cause some defects. Examples of such common defects are the following:
- Image Completeness
  - Root cause: image not adequately covered by the DEM.
- Image Smearing
  - Root causes: anomalies or spike errors in the DEM; excessive relief.
- Double image on adjacent ortho sheets
  - Root causes: improper camera orientation; inaccurate DEMs.
- Missing Image
  - Root causes: improper camera orientation; inaccurate DEMs.
- Mismatch of two adjacent orthos
  - Root causes: inaccurate camera position and orientation; inaccurate DEMs.
Digital Terrain Data
Similar to LiDAR, stereo imagery can be used to generate accurate digital elevation models. Most software used for UAS data processing includes image matching techniques that produce fine-quality elevation models usable for the ortho rectification process and other terrain modeling purposes. The main ingredients for digital terrain data generation are:
- digital imagery;
- exterior orientation parameters from aerial triangulation or IMU;
- camera calibration report;
- photogrammetric processing software that utilizes the image matching technique.
Until recently, users did not trust the poor quality of auto-correlated digital terrain data. In the last couple of years, however, software development companies adopted a new algorithm called “Semi-Global Matching,” or SGM, that produces fine-quality elevation data that in some ways competes with the elevation models generated by LiDAR. This has made users excited again about using imagery for the development of fine-quality digital elevation data. SGM is an image matching approach that originated in the computer vision community; it aggregates per-pixel matching costs in a way that was not possible with the older auto-correlation algorithms.
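Production SGM implementations aggregate matching costs along eight or sixteen image directions; the sketch below is a deliberately simplified one-dimensional, one-direction version of the core cost-aggregation recurrence, not any commercial package's actual implementation. The penalty values P1 and P2 and the synthetic signal are illustrative:

```python
import numpy as np

def sgm_aggregate(cost, P1=0.5, P2=2.0):
    """One-directional SGM cost aggregation:
    L(p, d) = C(p, d) + min(L(p-1, d),
                            L(p-1, d-1) + P1, L(p-1, d+1) + P1,
                            min_k L(p-1, k) + P2) - min_k L(p-1, k)."""
    n, dmax = cost.shape
    L = cost.copy()
    for i in range(1, n):
        prev = L[i - 1]
        best = prev.min()
        for d in range(dmax):
            c = min(prev[d],
                    (prev[d - 1] + P1) if d > 0 else np.inf,
                    (prev[d + 1] + P1) if d < dmax - 1 else np.inf,
                    best + P2)
            L[i, d] = cost[i, d] + c - best
    return L

# Synthetic 1-D "stereo pair": the right signal is the left shifted by 3 pixels.
rng = np.random.default_rng(0)
left = rng.random(60)
right = np.roll(left, -3)

# Absolute-difference matching cost for disparities 0..7, then aggregate
# and take the winner (lowest aggregated cost) per pixel.
dmax, n = 8, 60 - 8
cost = np.stack([np.abs(right[:n] - left[d:d + n]) for d in range(dmax)], axis=1)
disp = sgm_aggregate(cost).argmin(axis=1)
print(disp)   # recovers the true 3-pixel shift at every pixel
```

The P1/P2 penalties are what give SGM its characteristic smoothness: small disparity changes are cheap, large jumps are penalized, yet depth discontinuities remain possible.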
As with ortho photo production, digital elevation data needs to be evaluated to verify its quality.
A few terms are used in the geospatial community to describe digital terrain data; these are:
- Digital Surface Model (DSM): It is also called the reflective surface. Such a surface represents the original LiDAR data before any features such as buildings and trees are removed from it. It also represents the elevation model generated from the image auto-correlation process in photogrammetry. Both LiDAR and image auto-correlation collect data on top of natural surfaces such as terrain and trees and man-made structures such as buildings (Figures 7.15 and 7.16 below).


- Digital Terrain Model (DTM): DTM is a term usually associated with digital elevation models of just the ground (trees and man-made structures are removed). A DTM is sometimes augmented with 3-D modeling of abrupt changes in the terrain using 3-D lines called break lines. A DTM usually contains arbitrarily distributed elevation points (not equally spaced or gridded), called mass points, plus break lines.
- Digital Elevation Model (DEM): DEM is a term usually associated with a gridded digital terrain model, in which points are distributed at equal intervals (a grid).
- Triangulated Irregular Network (TIN): The term TIN describes the method most software uses to model digital terrain data and present it on the screen. A TIN surface is a set of adjacent, non-overlapping triangles computed from irregularly spaced data points with x, y horizontal coordinates and z vertical elevations (Figure 7.17).

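A TIN can be built from scattered mass points with a Delaunay triangulation, and elevations anywhere on the surface then follow by linear interpolation within the containing triangle. A minimal illustration using SciPy, with five hypothetical mass points (four corners of a square plus its raised center):

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

# Five surveyed mass points: x, y locations and their elevations (metres).
points = np.array([[0.0, 0.0], [100.0, 0.0], [100.0, 100.0], [0.0, 100.0],
                   [50.0, 50.0]])
elev = np.array([10.0, 12.0, 14.0, 12.0, 20.0])

tin = Delaunay(points)            # triangulate the irregularly spaced points
print(len(tin.simplices), "triangles")

# Piecewise-linear surface over the TIN; query an elevation between points.
surf = LinearNDInterpolator(tin, elev)
z = surf(np.array([[25.0, 25.0]]))[0]   # halfway along the 10 m -> 20 m edge
print(z)
```

This is essentially what terrain software does when it shades a TIN on screen or drapes contours over it: every query point is located in one triangle and linearly interpolated from that triangle's three vertices.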
To Read
- Chapter 13 of the textbook: Elements of Photogrammetry with Applications in GIS, 4th Edition
Summary and Final Tasks
Summary
Congratulations! You have just completed Lesson 5. I hope that you appreciate the value of UAS imagery in producing geospatial data suitable for many applications in our day-to-day life. The ortho photo and the digital elevation model are indispensable tools used in many environmental and engineering projects. Without them, we would have to put many boots on the ground to survey the terrain and provide the necessary data for engineering and planning. Practicing with the processing software Pix4D, which I selected for the course, will help you tremendously in appreciating the quality and value of the digital ortho photo and digital elevation model.
Final Tasks
| 1 | Study Lesson 5 materials and the textbook chapters assigned to the lesson |
|---|---|
| 2 | Start your first post for the discussion on "Human Elements of UAS." Participate in the "Human Elements of UAS" Discussion Forum: post your opinion on the following topic and respond to at least two of your peers' postings. Considering all elements that make a functioning UAS, one may think that the human element is the most important element of a UAS implementation. The human element complements and interacts in one way or another with most other UAS elements, such as the aerial vehicle, command and control, payloads, data and communication links, and launch and recovery. With the rapid pace of advancement in technology, one may expect that the importance of the human element will diminish as UAS technology matures and the UAS becomes more advanced. (3 points or 3%) The due date for this assignment is at the end of Lesson 6. |
| 3 | Submit your "CONOP and Risk Assessment" assignment report |
| 4 | Complete Lesson 5 Quiz |
| 5 | Start UAS Data Processing Using Pix4D for exercise 2 |
| 6 | Submit final project idea |
| 7 | Attend the weekly call and Exercise 2 training on Thursday evening at 8:00pm ET |
Lesson 6: Fundamentals of Unmanned Aerial System Operations
Lesson 6 Introduction
Welcome to Lesson 6! In this lesson, you will become familiar with all aspects of operating a UAS, starting with the obstacles facing the UAS and its operations and moving to subjects such as guidelines for UAS operations, the definition of airspace, launch and recovery, line of sight (LOS) operation, beyond line of sight (BLOS) operation, and personnel qualifications. All the topics mentioned above are crucial to any individual involved in operating a UAS, especially here in the United States. I would like to emphasize the topic of understanding the National Airspace System (NAS) and the rules surrounding the operation of an aircraft in each of its classes. In this lesson, you will be asked to express your opinion on the newly released FAA roadmap for integrating the UAS into the NAS.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- understand guidelines for operating a UAS;
- describe the different classes of airspace;
- understand the different modes of operating a UAS (LOS versus BLOS);
- describe the UAS personnel qualifications.
Lesson Readings
Course Textbooks
- Chapter 5 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Read sections 5.3, 5.6, and 5.7 of chapter 5 in the textbook: Barnhart, et al., Introduction to Unmanned Aircraft Systems.
Web Articles
- Review materials on the High Adventure website.
- Review section 3 of the materials on the Federation of American Scientists website.
Google Drive (Open Access)
- TRB2013 Paper presentation slides: "Addressing the Operational and Technical UAS Airspace Integration Challenges".
- Review PART 107 “Operation and Certification of Small Unmanned Aircraft Systems”.
- Review more details on the test site program.
- Review the FAA document about the test sites program and the FAA page on it.
- Review the TRB 2013 presentation "Unmanned Aircraft System Policy and Regulatory Environment".
- Read the article "What You Need to Know to Legally Operate Your Drone Under New FAA Regulation".
- Review section 16 of the document “Unmanned Aircraft Systems (UAS) Operational Approval”.
Lesson Activities
- Study lesson 6 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Complete your discussions for the assignment on "Human Elements of UAS"
- Complete quiz 6
- Start working on the "COA application Draft"
- Start your exercise 3 - Digital Image Classification
- Attend the weekly call on Thursday evening at 8:00pm ET
- Watch the hearing in the U.S. Senate Committee on Commerce, Science, and Transportation on "Unmanned Aircraft Systems: Innovation, Integration, Successes, and Challenges”.
Obstacles to UAS Operations
In the following sections, you will become familiar with the FAA regulations that restrict the operation of UASs in the national airspace, especially for commercial use. Whether we all agree with them or not, the FAA restrictions stem from one or more of the following:
- As mentioned in previous lessons, the NAS is already congested with manned aircraft, and adding unmanned traffic may compromise its safety. This is not a durable argument, however: UASs will eventually be integrated into the NAS, so rules and procedures need to be in place to keep the NAS safe. Sooner or later, the FAA needs to deal with integration issues.
- For a long time, most UAS use was military. During combat, the military bends the rules surrounding UAS operations; in other words, military use of UASs is not as strongly restricted by FAA regulations as civilian use. The FAA was under no pressure from the civil community to take the measures necessary for integrating UASs into the NAS. This was true until 2012, when Congress, through the FAA Modernization and Reform Act, mandated that the FAA address such integration by the year 2015. The FAA did not issue its first UAS integration rules until June 2016.
- Most civil applications of UASs use less expensive models that lack the sophistication built into military UASs. This gave the FAA reason to be concerned about the reliability and safety of these small UASs. Besides cost, the size and weight of the payload in most civilian UASs is very limited, prohibiting sophisticated onboard communications systems such as those needed for a successful detect-and-avoid mechanism. The FAA stated in its 2013 roadmap document, which you reviewed in the "Sensors Characteristics" section of Lesson 4, “To gain full access to the NAS, UAS need to be able to bridge the gap from existing systems requiring accommodations to future systems that are able to obtain a standard airworthiness certificate.” That statement reflects how the FAA views the current state of UAS technologies.
- Resistance to change by some FAA employees faced with UAS integration, perhaps the most disruptive technology in the history of aviation. Such resistance accompanies most new technologies introduced to users of conventional ones.
- Concerns over privacy. A UAS can be flown very low, and high-definition imagery from an imaging sensor may raise privacy concerns.
I would like to add that even though the FAA restricted the use and operation of UASs in U.S. airspace, there is a growing willingness among consumers who have found practical uses for UASs to break the FAA rules and fly without a COA or a special airworthiness certificate. Before the FAA picked up its pace in recent years, people were frustrated with how slowly the agency moved to integrate UASs into the NAS. To understand such "unlawful" use of the UAS, see the readings below.
To Read
- Chapter 8 of the textbook: Unmanned Vehicle Systems for Geomatics: Towards Robotic Mapping
- Chapter 5 of the textbook: Fundamentals of capturing and processing drone imagery and data
- Section 5.1 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
To Do
Review the TRB2013 Paper presentation slides "Addressing the Operational and Technical UAS Airspace Integration Challenges."
Guidelines to UAS Operations
Prior to August 29, 2016, when the latest FAA regulations for UAS operation went into effect, FAA document N 8900.227, “National Policy: The Unmanned Aircraft Systems (UAS) Operational Approval,” described the regulations surrounding UAS operation in the United States. The policy carefully explains all aspects of UAS operation, from the airworthiness of the aircraft to operator training and risk mitigation. Familiarity with these regulations was necessary for anyone planning to own or operate a UAS. The document was issued as a temporary measure until the future regulations proposed in the FAA roadmap replaced its mandates. It took the FAA a few years to amend its regulations to allow the legal operation of small unmanned aircraft systems in the National Airspace System. The new rules were published in the Federal Register (Vol. 81, No. 124, Part II) on June 28, 2016 and went into effect on August 29, 2016. They were added as a new Part 107 to Title 14 of the Code of Federal Regulations (14 CFR) to allow routine civil operation of small UASs in the NAS and to provide safety rules for those operations. The new rules, publicly known as PART 107, became the official policy governing the commercial operation of small UASs in the National Airspace System (NAS). The article "What You Need to Know to Legally Operate Your Drone Under New FAA Regulation" briefly describes the new rules and is a good read for anyone trying to understand PART 107.
Prior to issuing PART 107, the FAA achieved one of its most important milestones: the selection of six sites for the "UAS Test Site Program." The selection of these six sites represented the first serious step by the FAA toward integrating UASs into the NAS. From dozens of applicants, the FAA announced on December 30, 2013 the selection of the following six agencies to operate UAS test sites, as quoted below:
- University of Alaska. The University of Alaska proposal contained a diverse set of test site range locations in seven climatic zones, as well as geographic diversity with test site range locations in Hawaii and Oregon. The research plan includes the development of a set of standards for unmanned aircraft categories, state monitoring and navigation. Alaska also plans to work on safety standards for UAS operations.
- State of Nevada. Nevada’s project objectives concentrate on UAS standards and operations, as well as operator standards and certification requirements. The applicant’s research will also include a concentrated look at how air traffic control procedures will evolve with the introduction of UAS into the civil environment and how these aircraft will be integrated with NextGen. Nevada’s selection contributes to geographic and climatic diversity.
- New York’s Griffiss International Airport. Griffiss International plans to work on developing test and evaluation as well as verification and validation processes under FAA safety oversight. The applicant also plans to focus its research on sense and avoid capabilities for UAS, and its sites will aid in researching the complexities of integrating UAS into the congested, northeast airspace.
- North Dakota Department of Commerce. North Dakota plans to develop UAS airworthiness essential data and validate high reliability link technology. This applicant will also conduct human factors research. North Dakota’s application was the only one to offer a test range in the Temperate (continental) climate zone and included a variety of different airspace, which will benefit multiple users.
- Texas A&M University – Corpus Christi. Texas A&M plans to develop system safety requirements for UAS vehicles and operations, with a goal of protocols and procedures for airworthiness testing. The selection of Texas A&M contributes to geographic and climatic diversity.
- Virginia Polytechnic Institute and State University (Virginia Tech). Virginia Tech plans to conduct UAS failure mode testing and identify and evaluate operational and technical risks areas. This proposal includes test site range locations in both Virginia and New Jersey.
In totality, these six test applications achieve cross-country geographic and climatic diversity and help the FAA meet its UAS research goals of System Safety & Data Gathering, Aircraft Certification, Command & Control Link Issues, Control Station Layout & Certification, Ground & Airborne Sense & Avoid, and Environmental Impacts.
Each test site operator manages the use and scheduling of the test site in a way that it gives access to parties interested in using the site. The FAA’s role is to ensure that each operator sets up a safe testing environment and to provide oversight that ensures each site operates under strict safety standards.
To Read
- Chapter 5 of the textbook: Fundamentals of capturing and processing drone imagery and data
- Chapter 5 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Review PART 107 “Operation and Certification of Small Unmanned Aircraft Systems”
- Review more details on the test site program.
- Review the FAA factsheet announcement about the test sites program
- Review the TRB 2013 presentation "Unmanned Aircraft System Policy and Regulatory Environment".
- Read the article "What You Need to Know to Legally Operate Your Drone Under New FAA Regulation"
To Do
Watch the hearing in the U.S. Senate Committee on Commerce, Science, and Transportation on March 15, 2017 on "Unmanned Aircraft Systems: Innovation, Integration, Successes, and Challenges."
Definition of Airspace
In order to understand UAS operations within the United States, you will need to be familiar with the way the NAS is classified and managed. Figure 5.1 schematically illustrates the different classes of the NAS, while Table 5.1 provides details on each class. Each class has its own rules and restrictions. The Wikipedia website contains good details on the U.S. national airspace classes, and the materials given in the assignment will provide you with additional details about the NAS classes.
| Class | Description |
|---|---|
| Class A | Generally, airspace from 18,000 feet mean sea level (MSL) up to and including flight level (FL) 600, including the airspace overlying the waters within 12 nautical miles (NM) of the coast of the 48 contiguous states and Alaska. Unless otherwise authorized, all pilots must operate their aircraft under instrument flight rules (IFR). (Instructor added note: FL 600 or Flight Level 600, means a flying altitude of 60,000 ft. MSL, for more details, check out this website.) |
| Class B | Generally, airspace from the surface to 10,000 feet MSL surrounding the nation’s busiest airports in terms of airport operations or passenger enplanements. The configuration of each Class B airspace area is individually tailored, consists of a surface area and two or more layers (some Class B airspace areas resemble upside-down wedding cakes), and is designed to contain all published instrument procedures once an aircraft enters the airspace. An air traffic control (ATC) clearance is required for all aircraft to operate in the area, and all aircraft that are so cleared receive separation services within the airspace. |
| Class C | Generally, airspace from the surface to 4,000 feet above the airport elevation (charted in MSL) surrounding those airports that have an operational control tower, are serviced by a radar approach control, and have a certain number of IFR operations or passenger enplanements. Although the configuration of each Class C area is individually tailored, the airspace usually consists of a surface area with a 5 NM radius, an outer circle with a 10 NM radius that extends from 1,200 feet to 4,000 feet above the airport elevation and an outer area. Each aircraft must establish two-way radio communications with the ATC facility providing air traffic services prior to entering the airspace, and thereafter maintain those communications while within the airspace. |
| Class D | Generally, that airspace from the surface to 2,500 feet above the airport elevation (charted in MSL) surrounding those airports that have an operational control tower. The configuration of each Class D airspace area is individually tailored, and when instrument procedures are published, the airspace will normally be designed to contain the procedures. Arrival extensions for instrument approach procedures (IAPs) may be Class D or Class E airspace. Unless otherwise authorized, each aircraft must establish two-way radio communications with the ATC facility providing air traffic services prior to entering the airspace and thereafter maintain those communications while in the airspace. |
| Class E | Generally, if the airspace is not Class A, B, C, or D, and is controlled airspace, then it is Class E airspace. Class E airspace extends upward from either the surface or a designated altitude to the overlying or adjacent controlled airspace. When designated as a surface area, the airspace will be configured to contain all instrument procedures. Also in this class are federal airways, airspace beginning at either 700 or 1,200 feet above ground level (AGL) used to transition to and from the terminal or en route environment, and en route domestic and offshore airspace areas designated below 18,000 feet MSL. Unless designated at a lower altitude, Class E airspace begins at 14,500 feet MSL over the United States, including that airspace overlying the waters within 12 NM of the coast of the 48 contiguous states and Alaska, up to but not including 18,000 feet MSL, and the airspace above FL 600. |
| Class G | Airspace not designated as Class A, B, C, D, or E. Class G airspace is essentially uncontrolled by ATC except when associated with a temporary control tower. |
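The altitude-based defaults in Table 5.1 can be sketched as a small lookup. This is a deliberately simplified, hypothetical helper: real airspace class depends on location and published charts (Class B/C/D terminal areas are individually tailored), so the function below only captures the en route defaults from the table.

```python
def default_airspace_class(altitude_ft_msl):
    """Return the default (non-terminal) airspace class at a given MSL altitude.

    Toy illustration of Table 5.1 only: ignores terminal airspace (B/C/D),
    surface-based Class E, and geographic exceptions.
    """
    if 18_000 <= altitude_ft_msl <= 60_000:
        return "A"   # 18,000 ft MSL up to and including FL 600
    if altitude_ft_msl > 60_000:
        return "E"   # airspace above FL 600 is Class E
    if altitude_ft_msl >= 14_500:
        return "E"   # Class E default floor, unless designated lower
    return "G"       # uncontrolled unless otherwise designated
```

For example, `default_airspace_class(30_000)` returns `"A"`, reflecting why IFR is mandatory at airliner cruise altitudes, while a small UAS at a few hundred feet AGL in a rural area is typically in Class G.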

To Read
- Review materials on the High Adventure website, which provides more details on the NAS.
Flight Operations
In most cases, operating a UAS requires logistics similar to those needed for manned aircraft. Large UASs such as Northrop Grumman’s Global Hawk have operating requirements similar to those needed to fly a large Boeing aircraft. The Global Hawk, which is the size of a Boeing 737, requires runways for takeoff and landing; it can fly above 60,000 feet, cruises at 310 knots, and has an endurance of 36 hours. Small UASs, on the other hand, weigh only a few pounds and do not need airports or runways for takeoff and landing. Different UAS sizes and levels of sophistication also call for different personnel skills and requirements.
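The quoted Global Hawk figures allow a quick back-of-the-envelope range estimate: endurance multiplied by cruise speed. This is a sketch, not a performance specification; it ignores wind, climb, loiter, and fuel reserves, so it is an upper bound on still-air range.

```python
# Rough endurance-to-range estimate from the figures quoted above.
cruise_speed_kt = 310   # knots = nautical miles per hour
endurance_h = 36        # hours

max_still_air_range_nm = cruise_speed_kt * endurance_h
max_still_air_range_km = max_still_air_range_nm * 1.852  # 1 NM = 1.852 km

print(f"~{max_still_air_range_nm} NM (~{max_still_air_range_km:.0f} km)")
```

The result, on the order of 11,000 NM, illustrates why a platform in this class can cover intercontinental distances, in stark contrast to a small hand-launched UAS flown within line of sight.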
Launch and Recovery
There are many ways in which a UAV can be launched, some very complex and others as simple as a hand toss into the air. Some UASs, such as target drones, are air-launched from a fixed-wing aircraft. Large UASs are usually equipped with wheels for takeoff and landing and do not need special equipment, while smaller UASs need a variety of launch and recovery strategies depending on the complexity of the system. Many small and medium-size UAS launch systems must be mobile, that is, mounted on a truck or a trailer. Such mobile launchers fall into one of the following types:
- Rail Launchers
- Pneumatic Launchers
- Hydraulic/Pneumatic Launchers
For more details on these launchers, refer to chapter 17 of the supplemental textbook Introduction to UAV Systems, 4th edition.
Line of Sight (LOS) Operation
Line-of-sight (LOS) operation refers to operating the UAS through direct radio waves. The LOS link provides the command-and-control uplink and the product downlink while the UAS operates within a certain distance of the GCS. The link is used to launch and recover the aircraft and to perform data acquisition according to the payload mission of the system. In the United States, civilian operations are usually conducted on 915 MHz, 2.45 GHz, and 5.8 GHz.
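The choice among these bands involves a trade-off that can be illustrated with the standard free-space path loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44. This is a textbook radio-propagation formula, not something specified in the course materials, and it ignores antenna gains, obstructions, and multipath; it is only a sketch of why higher bands need more link margin for the same distance.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB for distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare the common U.S. civilian UAS bands at a fixed 1 km range.
for f_mhz in (915, 2450, 5800):
    print(f"{f_mhz} MHz at 1 km: {fspl_db(1, f_mhz):.1f} dB")
```

At 1 km, the 5.8 GHz link loses roughly 16 dB more than the 915 MHz link, which is one reason lower-frequency links are favored for longer-range LOS command and control, while higher bands carry high-bandwidth payload downlinks at shorter range.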
Beyond Line of Sight (BLOS) Operation
Beyond-line-of-sight (BLOS) operation refers to operating the UAS through satellite communications or through a relay vehicle such as another aircraft. Recent advancements in SwiftBroadband service and hardware, including smaller, lighter avionics that don’t compromise performance or data capacity, make near-global connectivity available to support and enhance UAV operations. SwiftBroadband service is provided over Inmarsat satellite broadband communications. BLOS is usually limited to military UAS operations; civilian UAS operations do not need BLOS systems for the time being, as their missions are conducted within line-of-sight range. Civilian operations do, however, have access to BLOS via the Iridium satellite system, which is owned and operated by Iridium LLC.
Through its "Partnership for Safety Plan (PSP)" program, the FAA continues its efforts to team with industry on UAS integration. The following organizations are among the entities the FAA has worked with to test BVLOS operations and many of the other UAS integration issues:
- Amazon Prime Air
- Burlington Northern Santa Fe (BNSF) Railway
- Drone Racing League (DRL)
- Florida Power and Light
- UPS Flight Forward Inc.
- Wing (an Alphabet company)
- Xcel Energy
For more information on the PSP, visit this FAA website.
In mid-June 2021, the FAA announced that it is forming a new Aviation Rulemaking Committee, or ARC, to provide recommendations to help the agency develop a regulatory path for routine Beyond Visual Line of Sight drone flights. The committee considers the safety, security, and environmental needs, as well as the societal benefits, of these operations.
Personnel Qualifications
Unmanned aerial system crews, including remote pilots, visual observers, mission planners, and other support staff, are responsible for the following:
- plan and analyze flight missions;
- perform preflight, in-flight and post-flight checks and procedures;
- conduct air reconnaissance, surveillance and acquisition missions;
- launch and recover the air frame from the runway or any other suitable sites or mechanisms;
- perform maintenance on communications equipment and power sources.
FAA PART 107 specifies the responsibilities of the following roles:
- the remote pilot in command (RPIC)
- the person manipulating the flight controls of the small unmanned aircraft system
- the visual observer
According to the FAA, the following operational restrictions apply to all UAS pilots:
- One RPIC must be designated at all times.
- The RPIC will be required to obtain a remote pilot certificate with a small UAS rating.
- The RPIC will have the final authority and responsibility for the operation and safety of a small UAS operation conducted under part 107.
- RPIC must not perform crew duties for more than one UAS at a time.
- Only one RPIC per aircraft is authorized, and the RPIC must be in a position to assume control of the aircraft.
- In case of an in-flight emergency, the RPIC will be permitted to deviate from any rule of part 107 to the extent necessary to meet that emergency.
- The RPIC (who is a certificated airman) can supervise another person’s manipulation of a small UAS’s flight controls. A person who receives this type of supervision from the remote pilot in command is not required to obtain a remote pilot certificate to manipulate the controls of a small UAS as long as the remote pilot in command possesses the ability to immediately take direct control of the small unmanned aircraft.
As for the visual observer job, the FAA requires:
- A visual observer is a person who assists the remote pilot in command and the person manipulating the flight controls of the small UAS (if that person is not the remote pilot in command) to see and avoid other air traffic or objects aloft or on the ground.
- The visual observer is an optional crew member who will not be required to obtain an airman certificate.
- No airman certification or training is required of the visual observer.
- If used, observers are considered crew members.
- If used, observers must not perform crew duties for more than one UAS at a time.
- Observers are not allowed to perform concurrent duties both as UAS pilot and observer.
- For more details on the qualifications of each of the above-mentioned jobs, refer to section 16 of the FAA UAS Operational Approval policy N 8900.227 document.
As for the crew in general:
- The remote pilot in command, the person manipulating the flight controls of the small UAS (if that person is not the remote pilot in command), and the visual observer are to maintain effective communication.
- The remote pilot in command determines how that communication will take place.
- Such communications can be accomplished at a distance through technological assistance.
- The remote pilot in command, the person manipulating the flight controls of the small UAS, and the visual observer must always have visual-line-of-sight capability even if they do not exercise it.
To support newcomers to the UAS business, several agencies have started providing training and issuing UAS operator certifications, such as those at the following links:
- UAS Training Vendor 1: RPAS Training and Solutions
- UAS Training Vendor 2: Unmanned Vehicle University
- UAS Training Vendor 3: Drone Pilot Ground School
To Read
- Chapter 5 of the textbook: Fundamentals of capturing and processing drone imagery and data
- Chapter 5 in the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Section 3 of the materials on the Federation of American Scientists website, which provides details on UAS operations.
- Section 16 of the document “Unmanned Aircraft Systems (UAS) Operational Approval,” which covers the qualifications of UAS operations personnel.
Summary and final tasks
Summary
Congratulations! You've finished Lesson 6, Fundamentals of Unmanned Aerial System Operations. By now, you should be comfortable with describing different UAS classes, listing UAS system elements, designing a concept of operations for a UAS, assessing risk surrounding a UAS operation, understanding FAA regulations, defining the operating criteria for different classes of the national airspace, and understanding the guidelines for UAS flight operations. If you are not comfortable with any of these subjects, review the lesson notes and/or contact me.
Final Tasks
| 1 | Study Lesson 6 materials and the textbook chapters assigned to the lesson |
|---|---|
| 2 | Complete your discussions for the assignment on "Human Elements of UAS" |
| 3 | Complete Lesson 6 Quiz |
| 4 | Start working on the "COA application Draft" |
| 5 | Start your exercise 3 - Digital Image Classification |
| 6 | Attend the weekly call on Thursday evening at 8:00pm ET |
Lesson 7: Aviation Regulatory and Certificate of Authorization (COA) Process
Lesson 7 Introduction
Welcome to Lesson 7! In this module, you will become familiar with the current FAA regulations that govern UAS operations and the ongoing efforts to integrate those operations into the National Airspace System (NAS), foremost among them the latest rules known as PART 107. You will also explore recreational versus public or commercial UAS operations; become familiar with the Certificate of Authorization (COA), Certificate of Waiver, and airworthiness certificate and how to apply for one; and examine privacy issues that concern both government and industry. You will be asked to choose your application materials and organize them for your COA or Part 107 waiver application. The COA/Part 107 waiver project includes a few graded components to be submitted in different sections of the lesson. During this lesson, you will engage in discussions with fellow students on several topics related to the lesson objectives. Participation in these discussions is mandatory wherever it is requested.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- recognize the differences between standards and regulations;
- describe the rules and regulations associated with operating a UAS in the United States of America;
- interpret the FAA restrictions on operating a UAS for commercial use; and
- prepare an application for Certificate of Authorization (COA) or PART107 Certificate of Waiver.
Lesson Readings
Course Textbooks
- Chapter 5 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Chapter 5 of the textbook: Fundamentals of capturing and processing drone imagery and data
Google Drive (Open Access)
- Sections 3.1 and 3.2 of Watts, et al., "Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use"
- Read the FAA "Literature Review on Detect, Sense, and Avoid Technology for Unmanned Aircraft Systems"
Review the following:
Web Articles
- Review several COAs issued by the FAA.
Google Drive (Open Access)
- Review the FAA "Integration of Civil Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS): Roadmap."
- Review the FAA "Model Aircraft Operating Standards."
- Review the FAA "Notice of proposed rulemaking (NPRM)."
- Review PART 107 “Operation and Certification of Small Unmanned Aircraft Systems.”
- Review the article "What You Need to Know to Legally Operate Your Drone Under New FAA Regulation."
- Review the FAA "National Policy on the Unmanned Aircraft Systems (UAS) Operational Approval."
- Review a sample PART 107 Certificate of Waiver.
- Review the information provided in this COA information template document before completing your COA application.
- Review the samples of issued COAs that are provided to you under "Example_COAs_from_FAA" in the Modules section.
Lesson Tasks
- Study lesson 7 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Complete quiz 7
- Start your first post for the discussion on "FAA Road map"
- Start your first post for the discussion on "Differences Between Rules and Regulations"
- Continue working on the COA Application and the Final Project Report
- Start UAS Data Processing Using Pix4D for exercise 4
- Submit your Pix4D processing materials for exercise 2
- Attend the weekly call and training on exercise 4 on Thursday evening at 8:00pm ET
The Need for Regulations
The Federal Aviation Administration (FAA) was created in 1958 in response to a series of fatal accidents and midair collisions involving commercial aircraft. The FAA was mandated to develop plans and policies for the use of navigable airspace to ensure the safety of aircraft and the efficient use of airspace. Prescribed air traffic regulations should cover the flight of aircraft (such as safe altitudes) for navigating, protecting, and identifying aircraft; protecting individuals and property on the ground; using the navigable airspace efficiently; and preventing collision between aircraft, between aircraft and land or water vehicles, and between aircraft and airborne objects.
Since the creation of the FAA, American airspace has become one of the most regulated fields in the United States. With the introduction of UASs, the FAA has had to examine and ensure that these pilotless aircraft can operate safely and meet all the above mentioned regulations. The NAS is already congested with piloted aircraft, and adding a swarm of UAVs requires thoughtful planning. The FAA's main mandate is to ensure that UASs do not endanger current users of the NAS (including manned or other unmanned aircraft) nor compromise the safety of the people and property on the ground.
When it comes to the safe operation and integration of UASs into the NAS, one of the FAA's main concerns is the lack of detect, sense, and avoid capability in current UAS technology. The FAA conducted a thorough literature review to establish what is and is not possible along these lines. The article listed in the reading assignment of this section details the FAA's quest for detect, sense, and avoid capabilities.
To Read
- Read sections 3.1 and 3.2 of the article “Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use," which briefly discusses regulations governing the use of the UAS.
- Read the FAA "Literature Review on Detect, Sense, and Avoid Technology for Unmanned Aircraft Systems" which details the FAA quest for technology to support detect, sense, and avoid capability of the UAS.
- Chapter 5 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Chapter 5 of the textbook: Fundamentals of capturing and processing drone imagery and data
Current Status of FAA Guidelines on UAS
In this section, you will explore the current regulations that govern UAS operations and the efforts underway to integrate their operations into the National Airspace System (NAS). The status of UAS regulations can be considered in relation to two eras: the first preceded the provisions of the FAA Modernization and Reform Act of 2012 (P.L. 112-95), and the second is what we are currently dealing with after the 2012 provisions. During both eras, FAA regulations on operating a UAS in the NAS were very strict and in fact prohibited civilians from flying UASs until Part 107 went into effect on August 29, 2016. In 2008, the Aviation Safety Unmanned Aircraft Program Office (UAPO) of the FAA issued Interim Operational Approval Guidance 08-01, “Interim Operational Approval Guidance, Unmanned Aircraft Systems Operations in the U.S. National Airspace System,” which provided guidance to help determine whether unmanned aircraft systems (UAS) should be allowed to conduct flight operations in the U.S. national airspace system (NAS). On July 30, 2013, the FAA issued a national policy (N 8900.227) for reviewing and evaluating the safety and interoperability of proposed UAS flight operations conducted within the U.S. NAS, under the subject “Unmanned Aircraft Systems (UAS) Operational Approval.” The new national policy defined in detail the methods of UAS operational approval through the issuance of either a COA for public aircraft operations or a Special Airworthiness Certificate for civil operations. All guidelines and regulations are jointly developed by the following entities within the FAA:
- the Unmanned Aircraft Program Office (UAPO), FAA Aircraft Certification Service (AIR-160)
- the Production and Airworthiness Division, FAA Aircraft Certification Service (AIR-200)
- the Flight Technologies and Procedures Division, FAA Flight Standards Service (AFS-400)
- the FAA Air Traffic Organization’s Office of System Operations and Safety, (AJR-3)
Originally, the Certificate of Authorization, or COA, was limited to public agencies, and no commercial entity was granted one. Even for public agencies, a COA could not be guaranteed; COAs could take different lengths of time to obtain or come with restrictions built in. According to FAA document N 8900.227, “because of the uniqueness of various UAS flight operations, each application must be evaluated on its own technical merits, including operational risk management (ORM) planning. Each application may require unique authorizations or limitations directly related to the specific needs or capabilities of the UAS and/or the proposed specific mission and operating location.” However, during 2015, the FAA began granting exemptions that allowed commercial entities to fly UAS under strict limitations. The FAA based these grants of exemption on section 333 of the FAA Modernization and Reform Act of 2012. A section 333 exemption allows a commercial company to fly a UAS for commercial use, after it applies for a COA. Even with the heavy restrictions surrounding these exemptions, the move was welcomed by companies planning to use UAS for various commercial tasks, and it was considered the first small step they had been waiting for.
This surprising move by the FAA was followed by three more unprecedented ones.
To Read
- Read sections 3.1 and 3.2 of the article “Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use," which briefly discusses regulations governing the use of the UAS.
- Read the FAA "Literature Review on Detect, Sense, and Avoid Technology for Unmanned Aircraft Systems" which details the FAA quest for technology to support detect, sense, and avoid capability of the UAS.
- Chapter 5 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Chapter 5 of the textbook: Fundamentals of capturing and processing drone imagery and data
More FAA Changes
Blanket COA
The FAA issued a new type of COA, called a "blanket COA," for companies that were granted exemptions under section 333. According to the new policy, "the FAA will grant a Certificate of Waiver or Authorization (COA) for flights at or below 200 feet to any UAS operator with a section 333 exemption for aircraft that weigh less than 55 pounds, operate during daytime Visual Flight Rules (VFR) conditions, operate within visual line of sight (VLOS) of the pilots, and stay certain distances away from airports or heliports:
- 5 nautical miles (NM) from an airport having an operational control tower; or
- 3 NM from an airport with a published instrument flight procedure, but not an operational tower; or
- 2 NM from an airport without a published instrument flight procedure or an operational tower; or
- 2 NM from a heliport with a published instrument flight procedure.
The “blanket” 200-foot COA allows flights anywhere in the country except restricted airspace and other areas, such as major cities, where the FAA prohibits UAS operations. Previously, an operator had to apply for and receive a COA for a particular block of airspace, a process that can take 60 days. The agency expects the new policy will allow companies and individuals who want to use UAS within these limitations to start flying much more quickly than before.
Section 333 exemption holders automatically received a “blanket” 200-foot COA. Anyone who wants to fly outside the blanket parameters must obtain a separate COA specific to the airspace required for that operation." To learn more about the history of this development, read FAA Announces Major Change to UAS Approval Process.
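As a quick illustration, the distance tiers in the blanket COA can be encoded as a simple lookup. This is a minimal sketch for study purposes only, not an operational tool; the function name and boolean inputs are invented for the example:

```python
def blanket_coa_min_distance_nm(has_control_tower: bool,
                                has_instrument_procedure: bool,
                                is_heliport: bool = False) -> float:
    """Return the minimum standoff distance (nautical miles) required
    under the 'blanket' 200-foot COA, following the tiers listed above."""
    if is_heliport:
        # 2 NM from a heliport with a published instrument flight procedure
        return 2.0
    if has_control_tower:
        # 5 NM from an airport having an operational control tower
        return 5.0
    if has_instrument_procedure:
        # 3 NM from an airport with a published instrument flight
        # procedure but no operational tower
        return 3.0
    # 2 NM from an airport with neither
    return 2.0
```

Note how the tiers track the presence of a control tower first, then a published instrument procedure; always confirm the current distances against the FAA's own documents.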
POLICIES UPDATE: Section 333 of the FAA Modernization and Reform Act of 2012 was later repealed by the FAA Reauthorization Act of 2018. Part 107 replaced the need for a section 333 exemption for UAS weighing less than 55 lbs. For UAS weighing 55 lbs or more, operators need to apply for an exemption under the Special Authority for Certain Unmanned Systems (49 U.S.C. §44807).
Proposed Rulemaking
On February 15, 2015, the FAA, working with the DOT, issued a proposal for future regulations on UAS operations in the NAS and opened the door for the public to comment on the proposed rules. Comments were accepted until April 24, 2015. Table 1 summarizes the proposed rules; the full version of the proposed rulemaking can be found here.
| Item | Descriptions/Instructions |
|---|---|
| Operational Limitations | |
| Operational Qualifications | Pilots of a small UAS would be considered “operators.” Operators would be required to: |
| Aircraft Requirements | |
| Model Aircraft | |
For civil agencies, Special Airworthiness Certificates are issued to applicants wishing to conduct UAS research and development (R&D), crew training, and market surveys.
The Small Unmanned Aircraft Regulations (Part 107)
The FAA finally released its regulation on the commercial operation of small UAS in the National Airspace System in June 2016, and it went into effect on August 29, 2016. The Part 107 document contains 626 pages of details that you may or may not be concerned with. Be aware that the FAA has added or changed many rules within Part 107 since it was first published, and it will continue doing so. However, you need to study and focus on all provisions related to the following topics:
- Where and when one can fly UAS for commercial (non-hobbyist) purpose.
- What are the specifications (weight/size) of the UAS that one can fly under these rules.
- What are the required credentials for people operating UAS under these rules.
- A few other topics that you may feel are relevant to your activities.
Drone Remote Identification
The FAA believes that drones are changing aviation, and it is committed to working toward fully integrating drones into the National Airspace System (NAS). Therefore, the FAA requires all drone pilots who are required to register, or have registered, their drone to operate in accordance with the newly published Remote ID requirements. Details of the new rules are published in the Code of Federal Regulations. Originally, the FAA set September 16, 2023 as the deadline for compliance, but it was extended until March 16, 2024. According to the FAA, "Remote ID is the ability of a drone in flight to provide identification and location information that can be received by other parties through a broadcast signal".
The FAA identified three ways drone pilots can meet the identification requirements of the Remote ID rule:
- "Operate a Standard Remote ID drone (PDF) that broadcasts identification and location information of the drone and control station. A Standard Remote ID drone is one that is produced with built-in Remote ID broadcast capabilities in accordance with the Remote ID rule's requirements.
- Operate a drone with a Remote ID broadcast module (PDF). A broadcast module is a device that broadcasts identification and location information about the drone and its take-off location in accordance with the Remote ID rule's requirements. The broadcast module can be added to a drone to retrofit it with Remote ID capabilities. Pilots operating a drone with a Remote ID broadcast module must be able to see their drone at all times during flight.
- Operate (without Remote ID equipment) (PDF) at FAA-recognized identification areas (FRIAs) sponsored by community-based organizations (CBOs) or educational institutions. FRIAs are the only locations where UAS (drones and radio-controlled airplanes) may operate without broadcasting Remote ID message elements."
All drones that are required to be registered or have been registered, including those flown for recreation, business, or public safety, must comply with the new rule on Remote ID. To find out whether your unmanned aircraft (serial number) complies with Part 107 Operations Over People (OOP) and/or Part 89 Remote ID (RID) regulations, visit the FAA UAS Declaration of Compliance site.
Part 107 (The Small Unmanned Aircraft Regulations)
The following paragraphs, some of which were taken from the FAA website, summarize the main rules introduced by Part 107.
Operating Requirements
- The small UAS operator manipulating the controls of a drone should always avoid manned aircraft and never operate in a careless or reckless manner.
- You must keep your drone within sight. Alternatively, if you use First Person View or similar technology, you must have a visual observer always keep your aircraft within unaided sight (for example, no binoculars). However, even if you use a visual observer, you must still keep your unmanned aircraft close enough to be able to see it if something unexpected happens.
- Neither you nor a visual observer can be responsible for more than one unmanned aircraft operation at a time.
- You can fly during daylight or in twilight (30 minutes before official sunrise to 30 minutes after official sunset, local time) with appropriate anti-collision lighting. Note: this rule was changed effective April 21, 2021, when operations over people and night operations were permitted without a waiver for Part 107 pilots. Here is the latest rule:
"No person may operate a small unmanned aircraft system during periods of civil twilight unless the small unmanned aircraft has lighted anti-collision lighting visible for at least 3 statute miles that has a flash rate sufficient to avoid a collision. The remote pilot in command may reduce the intensity of, but may not extinguish, the anti-collision lighting if he or she determines that, because of operating conditions, it would be in the interest of safety to do so.
- Minimum weather visibility is three miles from your control station.
- The maximum allowable altitude is 400 feet above the ground, and higher if your drone remains within 400 feet of a structure.
- The maximum speed is 100 mph (87 knots).
- You can’t fly a small UAS over anyone who is not directly participating in the operation, under a covered structure, or inside a covered stationary vehicle. Note: this rule was changed effective April 21, 2021, when operations over people and night operations were permitted without a waiver for Part 107 pilots. The latest rule, as of April 21, 2021:
"a remote pilot in command may conduct operations over human beings only in accordance with the following:
(a) That human being is directly participating in the operation of the small unmanned aircraft;
(b) That human being is located under a covered structure or inside a stationary vehicle that can provide reasonable protection from a falling small unmanned aircraft; or
(c) The operation meets the requirements of at least one of the operational categories § 107.110 for Category 1 operations; §§ 107.115 and 107.120 for Category 2 operations; §§ 107.125 and 107.130 for Category 3 operations; or § 107.140 for Category 4 operations."
A UAS must fall under one of the four categories (1, 2, 3, or 4) to take advantage of this rule.
- No operations from a moving vehicle are allowed unless you are flying over a sparsely populated area.
"No person may operate a small unmanned aircraft system:
(a) From a moving aircraft; or
(b) From a moving land or water-borne vehicle unless the small unmanned aircraft is flown over a sparsely populated area and is not transporting another person's property for compensation or hire"
- Operations in Class G airspace are allowed without air traffic control permission.
- Operations in Class B, C, D and E airspace need ATC approval. See Chapter 14 in the Pilot's Handbook (PDF).
- You can carry an external load if it is securely attached and does not adversely affect the flight characteristics or controllability of the aircraft. You also may transport property for compensation or hire within state boundaries, provided the drone – including its attached systems, payload, and cargo – weighs less than 55 pounds total and you obey the other flight rules. (Some exceptions apply to Hawaii and the District of Columbia; these are spelled out in Part 107.) Here, one needs to pay attention to the rule "No carriage of hazardous materials," which restricts what you can carry.
- You can request a waiver of most operational restrictions if you can show that your proposed operation can be conducted safely under a waiver. The FAA will make an online portal available to apply for such waivers. Users can apply for a waiver at the FAA dedicated web page.
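To make the numeric limits above concrete, here is a minimal, hypothetical preflight sanity check. The function name and its inputs are invented for the example, and the altitude test is deliberately simplified: the actual structure exception limits you to 400 feet above the structure's top, not to unlimited altitude.

```python
def part107_basic_limits(altitude_ft_agl: float,
                         ground_speed_mph: float,
                         visibility_sm: float,
                         within_400ft_of_structure: bool = False) -> dict:
    """Check a proposed flight against three numeric Part 107 limits.

    Simplified for illustration only; consult the full rule text for
    the real conditions (e.g., the structure exception's altitude cap).
    """
    return {
        # 400 ft AGL ceiling, with an exception near structures
        "altitude_ok": altitude_ft_agl <= 400 or within_400ft_of_structure,
        # 100 mph (87 knots) maximum groundspeed
        "speed_ok": ground_speed_mph <= 100,
        # 3 statute miles minimum visibility from the control station
        "visibility_ok": visibility_sm >= 3,
    }

checks = part107_basic_limits(350, 45, 10)
all_ok = all(checks.values())
```

Returning a dict of named checks, rather than a single boolean, makes it easy to report which specific limit a proposed flight would violate.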
In addition, Part 107 gives entities that already hold section 333 exemptions the option to continue operating under the terms of their exemptions or to move to Part 107. A 333 exemption is usually granted for a two-year period, which means most operators will eventually shift to operating under Part 107 after their exemptions expire. Most likely, no one will need a 333 exemption in the future, as they will be able to do everything they want under Part 107.
Pilot Certification
To operate the controls of a small UAS under Part 107, you need a remote pilot airman certificate with a small UAS rating, or be under the direct supervision of a person who holds such a certificate.
You must be at least 16 years old to qualify for a remote pilot certificate, and you can obtain it in one of two ways:
- you may pass an initial aeronautical knowledge test at an FAA-approved knowledge testing center;
- if you already have a Part 61 pilot certificate, other than a student pilot certificate, you must have completed a flight review in the previous 24 months and you must take a small UAS online training course provided by the FAA.
If you have a non-student Part 61 pilot certificate, you will immediately receive a temporary remote pilot certificate when you apply for a permanent certificate. Other applicants will obtain a temporary remote pilot certificate upon successful completion of a security background check. The FAA anticipates being able to issue temporary certificates within 10 business days after receiving a completed application.
UAS Certification
You are responsible for ensuring a drone is safe before flying, but the FAA does not require small UAS to comply with current agency airworthiness standards or obtain aircraft certification. Instead, the remote pilot will simply have to perform a preflight visual and operational check of the small UAS to ensure that safety-pertinent systems are functioning properly. This includes checking the communications link between the control station and the UAS. The UAS must also be registered.
Respecting Privacy
Although the new rule does not specifically deal with privacy issues in the use of drones, and the FAA does not regulate how UAS gather data on people or property, the FAA is acting to address privacy considerations in this area. The FAA strongly encourages all UAS pilots to check local and state laws before gathering information through remote sensing technology or photography.
As part of a privacy education campaign, the agency will provide all drone users with recommended privacy guidelines as part of the UAS registration process and through the FAA’s B4UFly mobile app. The FAA also will educate all commercial drone pilots on privacy during their pilot certification process; and will issue new guidance to local and state governments on drone privacy issues. The FAA’s effort builds on the privacy “best practices” (PDF) the National Telecommunications and Information Administration published last month as the result of a year-long outreach initiative with privacy advocates and industry.
Other Requirements
If you are acting as pilot in command, you have to comply with several other provisions of the rule:
- You must make your drone available to the FAA for inspection or testing on request, and you must provide any associated records required to be kept under the rule.
- You must report to the FAA within 10 days any operation that results in serious injury, loss of consciousness, or property damage (to property other than the UAS) of at least $500.
Table 2 summarizes the main provisions of the Part 107 rules; you may also consult the FAQ published by the FAA on the new rules:
| Item | Descriptions/Instructions |
|---|---|
| Operational Limitations | |
| Remote Pilot in Command Certification and Responsibilities | A remote pilot in command must: A remote pilot in command may deviate from the requirements of this rule in response to an in-flight emergency. |
| Airworthiness Certification | FAA airworthiness certification is not required. However, the remote pilot in command must conduct a preflight check of the small UAS to ensure that it is in a condition for safe operation. |
| Model Aircraft | |
Latest FAA Rules Changes
On January 15, 2021, the FAA changed its UAS rules to allow routine operations over people and routine operations at night under certain circumstances. The rule eliminated the need for typical operations to receive individual Part 107 certificates of waiver from the FAA. The rule was first published in the Federal Register on January 15, 2021. Corrections to the final rule were published in the Federal Register on March 10, 2021, delaying the effective date from March 16, 2021 to April 21, 2021. To learn more details about the new rule, visit the FAA website. The Part 107 rules keep evolving, and one needs to keep a close eye on them.
Tension between FAA Regulations and Local Jurisdictions
Local governments and jurisdictions have struggled with allowing drone operators to operate freely within their localities, citing public safety and privacy concerns, even when those operators follow the rules under Part 107. While local authorities recognize the FAA's authority over their local airspace, some have enacted restrictions on the ground for operating UAS (i.e., restrictions related to takeoff and landing sites). Even as many state governments pass UAS laws designed to promote the growth of the drone industry and the correct implementation of FAA rules, some cities within those same states are creating their own rules to stop UAS operators from flying within their limits. The following articles shed some light on the struggles of local authorities with the enacted FAA rules:
Can I Fly a Drone in a Public Park?
Three Tips to Get Your HOA’s Drone Rules Off the Ground
FAA Rules for Recreational Drone Pilots Flying Near Airports
Part 108 and Beyond Visual Line of Sight Operations
FAA Part 108 — Enabling BVLOS Drone Operations
FAA Part 108 is a proposed regulatory framework intended to govern Beyond Visual Line of Sight (BVLOS) operations of unmanned aircraft (drones or UAS) in the U.S. It would go beyond the current Part 107 rules (which require that remote pilots maintain visual line of sight) by creating a more scalable, performance-based regime for advanced drone operations.
The underlying goals of Part 108 include:
- Reducing or eliminating the need for individual waivers for BVLOS operations
- Allowing routine, more complex drone use cases (e.g. package delivery, infrastructure inspections, wide-area monitoring)
- Integrating drones more fully into the National Airspace System (NAS) with safety assurances
- Promoting innovation and scaling of commercial UAS capabilities
PART 108 Historical & Regulatory Context
- The FAA established the BVLOS Aviation Rulemaking Committee (ARC) in 2021 to draft recommendations for how to regulate BVLOS. The ARC produced a report with ~70 recommendations, which included a proposed CFR Part 108.
- The FAA and Congress have signaled urgency to make progress on BVLOS regulation. The FAA Reauthorization Act of 2024 includes mandates related to Part 108 / BVLOS rulemaking.
- Publication of a Notice of Proposed Rulemaking (NPRM) for Part 108 has been long anticipated; in August 2025, the FAA released a draft NPRM under the title “Normalizing Unmanned Aircraft Systems Beyond Visual Line-of-Sight Operations”.
Key Proposed Features & Structure of PART 108
Based on the NPRM and industry analyses, some of the main proposals and features of Part 108 include:
- New Regulatory Framework for BVLOS
- Creates a dedicated part of the CFR (Part 108) to govern Beyond Visual Line of Sight operations, rather than relying on waivers under Part 107.
- Permits vs. Certificates
- Permits: For lower-risk, smaller-scale operations (e.g., agriculture, training, surveys).
- Certificates: For higher-risk, larger-scale, or more complex BVLOS operations.
- Weight Classes / Aircraft Categories
- Establishes thresholds such as ≤ 55 lbs, ≤ 110 lbs, and up to ~1,320 lbs for different levels of approval and oversight.
- Operational Roles
- Defines new operational roles, such as Flight Coordinator, Operations Supervisor, and Remote Pilot in Command, clarifying responsibilities.
- Automated Data Service Providers (ADSPs)
- Introduces third-party service providers for strategic deconfliction, airspace awareness, and coordination between operators.
- Airspace & Right-of-Way Rules
- Defines shielded areas (close to structures or terrain where drone operations have some priority).
- Clarifies right-of-way responsibilities between drones, crewed aircraft, and other UAS.
- Safety & Technical Requirements
- Performance-based requirements for detect-and-avoid (DAA) capabilities.
- Reliability standards for communications, navigation, and control systems.
- Mandatory fail-safe behaviors in case of lost link or system failure.
- Population Density Categories
- Creates ground-population density classifications (sparse, moderate, dense) with corresponding operational limitations.
- Cybersecurity & Security Standards
- Requires cybersecurity measures, secure command-and-control links, and operational integrity protection.
- Transition & Interim Approvals
- Provides transition pathways from Part 107 waivers to Part 108 compliance (e.g., shielded operations waivers).
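The proposed weight thresholds above can be illustrated with a short classifier. The tier labels below are invented for this sketch; the NPRM's own category names and exact cutoffs may differ once the rule is final:

```python
def proposed_part108_tier(weight_lbs: float) -> str:
    """Map an aircraft weight to the thresholds proposed for Part 108
    (<= 55 lbs, <= 110 lbs, up to ~1,320 lbs). Tier names here are
    illustrative, not the NPRM's terminology."""
    if weight_lbs <= 55:
        return "tier 1 (<= 55 lbs)"
    if weight_lbs <= 110:
        return "tier 2 (<= 110 lbs)"
    if weight_lbs <= 1320:
        return "tier 3 (up to ~1,320 lbs)"
    return "outside proposed Part 108 scope"
```

The point of the thresholds is that heavier aircraft would face progressively more approval and oversight, which is why the classifier checks the lightest class first.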
Current Status & Timeline for PART 108
- The NPRM was published in August 2025 by the FAA under the title “Normalizing Unmanned Aircraft Systems Beyond Visual Line-of-Sight Operations.”
- The NPRM opens a public comment period. Stakeholders (industry, academia, government, users) are invited to submit feedback.
- The FAA is expected to review comments and revise the rule as needed before issuing a final rule.
- Some observers expect a final rule by 2025 or early 2026, though timing remains uncertain.
- There have been missed or delayed deadlines. For example, the FAA missed a September 16, 2024 deadline mandated by the FAA Reauthorization Act to publish the Part 108 NPRM.
Challenges, Risks & Stakeholder Concerns about PART 108 Implementation
- Regulatory complexity & burden: Ensuring that performance-based requirements are implementable, measurable, and enforceable could be challenging.
- Safety assurance: Proving reliability, redundancy, and safety in BVLOS operations remains a central hurdle.
- Spectrum, communications, and navigation constraints: Reliable connectivity, navigation accuracy, interference, and latency become more important when the pilot cannot see the aircraft directly.
- Liability & risk allocation: As operations scale, insurance, responsibility, and blame attribution in failure cases become more complex.
- Public acceptance & privacy: Operating drones over populated areas, even with safety guarantees, may face public resistance concerning noise, privacy, and intrusion risks.
- Interoperability & data sharing: The use of ADSPs, deconfliction among many operators, and real-time data exchange across systems will require robust standards and coordination.
- Transition / grandfathering: Many operators now rely on waivers and exemptions; migrating to a Part 108 system without disruption requires careful transition strategies.
- Regulatory lag: Technology evolves rapidly, so there is risk the final regulation will lag behind what is technologically feasible.
- Resource constraints & administrative capacity: The FAA must staff and allocate resources to process certifications, oversight, and compliance enforcement.
Implications & Opportunities with PART 108
If successfully implemented, Part 108 could unlock substantial growth in UAS applications, including:
- Package delivery / logistics: Enabling last-mile drone delivery over longer distances without requiring visual line of sight.
- Infrastructure & utility inspection: Enables continuous monitoring of pipelines, power lines, railroads, etc., over large spans.
- Agriculture & environmental monitoring: Wide-area surveillance of crops, forests, water resources, and wildlife in remote regions.
- Search & rescue / emergency response: Drones can extend reach and speed in disaster zones, beyond visual constraints.
- Scientific data collection: Atmospheric & environmental sensors, mapping, remote sensing missions in remote areas.
- Commercial innovation: More advanced business models, autonomous drone systems, remote operations, multi-vehicle coordination.
For a comprehensive understanding of the Part 108 NPRM, consult the following resources:
- The FAA published NPRM “Normalizing Unmanned Aircraft Systems Beyond Visual Line of Sight Operations”. (https://www.federalregister.gov/documents/2025/08/07/2025-14992/normalizing-unmanned-aircraft-systems-beyond-visual-line-of-sight-operations)
- Understanding the FAA's Approach to BVLOS Operations: Part 108 and Beyond (https://www.mtec.aero/post/understanding-the-faa-s-approach-to-bvlos-operations-part-108-and-beyond)
- Navigating Drone Regulations in 2025: Part 108(BVLOS) (https://www.vsiaerial.com/post/2025-drone-regulations-part-108)
- Will Executive Orders to Enable BVLOS Operations Reshape the Drone Industry? (https://www.commercialuavnews.com/regulations/will-executive-orders-to-enable-bvlos-operations-reshape-the-drone-industry)
- Part 108: A New Era for Aerial Survey and Client Solutions.
The NPRM, exceeding 700 pages in length, will be available for public comment for 60 days following its official publication in the Federal Register. During this period, all stakeholders and interested parties are invited to submit their feedback prior to the rule's finalization.
UAS Registration Program
After issuing Part 107, the Federal Aviation Administration (FAA) required all UAS owners to register each purchased UAS weighing between 0.55 lbs and 55 lbs. Anyone who meets the criteria to register an unmanned aircraft and does not register is subject to the civil and criminal penalties defined in U.S. Government UAS regulations.
The FAA requires operators to register their UAS according to the rules under which they fly:
UAS Flown under the Small UAS Rule (Part 107)
The FAA requires the owner/operator of a UAS weighing less than 55 pounds:
- to register the unmanned aircraft under Part 107;
- to label it with the registration number.
Registration costs $5 per aircraft and is valid for 3 years.
In order to register, one needs:
- email address,
- credit or debit card,
- physical address and mailing address (if different from physical address),
- make and model of the unmanned aircraft.
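The registration weight window described above can be expressed as a one-line check. This is a hypothetical helper, assuming the 0.55–55 lb window stated in this section; the FAA's registration site is authoritative for edge cases:

```python
def requires_part107_registration(weight_lbs: float) -> bool:
    """True when the UAS falls in the 0.55 lb to under-55 lb window
    that the FAA requires owners to register (per the section above);
    aircraft of 55 lbs or more are handled under a separate process."""
    return 0.55 <= weight_lbs < 55
```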
UAS flown for Recreational Use
The FAA Reauthorization Act of 2018, in Section 349, "Exception for limited recreational operations of unmanned aircraft (49 U.S.C. 44809)," states:
"a person may operate a small unmanned aircraft without specific certification or operating authority from the Federal Aviation Administration if the operation adheres to all of the following limitations:
(1) The aircraft is flown strictly for recreational purposes.
(2) The aircraft is operated in accordance with or within the programming of a community-based organization’s set of safety guidelines that are developed in coordination with the Federal Aviation Administration.
(3) The aircraft is flown within the visual line of sight of the person operating the aircraft or a visual observer co-located and in direct communication with the operator.
(4) The aircraft is operated in a manner that does not interfere with and gives way to any manned aircraft.
(5) In Class B, Class C, or Class D airspace or within the lateral boundaries of the surface area of Class E airspace designated for an airport, the operator obtains prior authorization from the Administrator or designee before operating and complies with all airspace restrictions and prohibitions.
(6) In Class G airspace, the aircraft is flown from the surface to not more than 400 feet above ground level and complies with all airspace restrictions and prohibitions.
(7) The operator has passed an aeronautical knowledge and safety test described in subsection (g) and maintains proof of test passage to be made available to the Administrator or law enforcement upon request.
(8) The aircraft is registered and marked in accordance with chapter 441 of this title and proof of registration is made available to the Administrator or a designee of the Administrator or law enforcement upon request."
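The eight statutory conditions can be treated as a checklist: a recreational flight qualifies for the 44809 exception only when every condition holds. A minimal sketch, where the dictionary keys are invented shorthand for the conditions above:

```python
def section_44809_failures(flight: dict) -> list:
    """Return the numbers of the statutory conditions (1-8) that a
    proposed recreational flight fails; an empty list means the flight
    qualifies for the exception. Keys are illustrative shorthand."""
    conditions = {
        1: flight["recreational_only"],
        2: flight["follows_cbo_guidelines"],
        3: flight["within_visual_line_of_sight"],
        4: flight["gives_way_to_manned_aircraft"],
        5: flight["authorized_if_controlled_airspace"],
        6: flight["at_or_below_400ft_in_class_g"],
        7: flight["passed_knowledge_test"],
        8: flight["registered_and_marked"],
    }
    return [n for n, ok in conditions.items() if not ok]
```

Modeling the exception this way makes the all-or-nothing character of the statute explicit: failing any single condition means the operation needs some other form of authority.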
To implement the above regulations, the FAA issues a Certificate of Waiver or Authorization to authorize the use of drones for recreational purposes under section 44809.
To register a model aircraft, one needs to be:
- 13 years of age or older (if the owner is less than 13 years of age, a person 13 years of age or older must register the model aircraft);
- a U.S. citizen or legal permanent resident.
In order to register a model aircraft, one needs:
- email address;
- credit or debit card;
- physical address and mailing address (if different from physical address).
In 2019, the FAA issued Advisory Circular AC 91-57B. This circular, which was developed after the Part 107 regulations were firmly in place, provides even more detail on what constitutes safe operations and offers numerous resources to help pilots familiarize themselves with the aeronautical knowledge required to conduct safe operations. Notably, the document also focuses primarily on “drones” rather than traditional remote-controlled model aircraft, signifying a shift in the FAA’s thinking on the subject.
FAA Reauthorization Act of 2024
The following was announced on the FAA website:
"The FAA Reauthorization Act of 2024 (Public Law 118-63) was signed into law on May 16, 2024. This authorization runs through Fiscal Year 2028 and communicates congressional priorities for how the agency carries out its mission to provide the safest, most efficient aerospace system in the world. This legislation is broad and speaks to everything from FAA’s organizational structure, ways to bolster many of the agency’s oversight processes, and where to invest resources to support safety and efficiency for both conventional users and new entrants. Much of this legislation aligns with the agency’s existing priorities and approaches but tells us where Congress is most interested in seeing adjustments to resources and timelines for various activities.
The FAA believes this Act supports the needs of the aviation ecosystem and will help advance aviation into the future. The FAA is committed to implementing the requirements in the Act as efficiently as possible."
A summary of the FAA Reauthorization Act of 2024 can be found in the section-by-section summary document published by the House Transportation Committee.
Data Exchange and LAANC
The material in this section is adapted from the FAA website's pages on Data Exchange and LAANC.
The FAA UAS Data Exchange is an innovative, collaborative approach between government and private industry facilitating the sharing of airspace data between the two parties. Under the FAA UAS Data Exchange umbrella, the agency will support multiple partnerships, the first of which is the Low Altitude Authorization and Notification Capability.
What is LAANC?
LAANC (Figure 6.1) is the Low Altitude Authorization and Notification Capability, a collaboration between the FAA and industry. It directly supports UAS integration into the airspace by providing access to controlled airspace near airports through near real-time processing of airspace authorization requests below approved altitudes.

Figure 6.1 LAANC illustrated
There is a graphic of a person holding a phone and a drone on the far left, labeled "Drone Users". In the middle, there is a phone labeled "UAS Service Suppliers", connected to two blue boxes by a dotted line. The top (larger) box is labeled "FAA Airspace Data" and includes the following: "TFRs", "NOTAMs", and "Facility Maps". Below that box, there is a smaller one that reads "FAA's UAS: Data Exchange". On the far right, there is a graphic of a plane and an air traffic control tower, labeled "FAA Air Traffic".
How does it work?
LAANC automates the application and approval process for airspace authorizations. Through automated applications developed by FAA-approved UAS Service Suppliers (USS), pilots apply for an airspace authorization. Requests are checked against multiple airspace data sources in the FAA UAS Data Exchange, such as temporary flight restrictions (TFRs), NOTAMs, and the UAS Facility Maps. If approved, pilots receive their authorization from the FAA in near real time.
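The checks described above can be sketched in code. The following is a simplified, hypothetical illustration of a LAANC-style decision: a request is approved only if it stays at or below the facility map ceiling for its grid cell and does not overlap an active TFR. The function name, ceiling values, and grid-cell identifiers are invented for illustration and do not reflect any actual FAA or USS API.

```python
# Hypothetical sketch of a LAANC-style authorization check.
# All data sources and values here are illustrative, not the FAA's actual API.

def check_authorization(request_alt_ft, grid_ceiling_ft, active_tfrs, flight_area):
    """Approve a request only if it stays at or below the UAS Facility Map
    ceiling for its grid cell and does not intersect any active TFR."""
    if request_alt_ft > grid_ceiling_ft:
        return "DENIED: requested altitude exceeds facility map ceiling"
    for tfr in active_tfrs:
        if tfr & flight_area:  # set intersection of affected grid cells
            return "DENIED: flight area intersects a temporary flight restriction"
    return "APPROVED"

# A 100 ft request in a cell with a 200 ft ceiling and no TFR overlap is approved;
# a request above the ceiling is denied automatically.
print(check_authorization(100, 200, [{"cell_17"}], {"cell_4", "cell_5"}))
print(check_authorization(300, 200, [], {"cell_4"}))
```

In the real system, of course, the USS performs these checks against live FAA data and the pilot simply receives an approval or denial in near real time.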
Where can I fly under LAANC?
LAANC is available at nearly 400 air traffic facilities covering approximately 600 airports. If you want to fly in controlled airspace near airports not offering LAANC, you can use the manual process to apply for an authorization.
The capability was in beta throughout 2018 to test it nationwide; the results informed future expansions of the capability.
After LAANC, FAA Permits UAS Flying Near Airports
In a surprise move in October 2018, the Federal Aviation Administration (FAA) granted certain UAS manufacturers permission to operate in controlled airspace. Such a move was possible after the FAA rolled out its Low Altitude Authorization and Notification Capability (LAANC) initiative. DJI is among the nine companies that the FAA approved as UAS Service Suppliers capable of offering LAANC services, allowing DJI to offer its customers near real-time authorization to fly in controlled airspace near airports. That approval came after the FAA performed rigorous testing and validation of DJI’s technical capability to support LAANC services.
The article titled "FAA clears DJI and other drone companies to fly near airports," published by Engadget, states: “The Federal Aviation Administration (FAA) has given nine companies permission to fly in controlled airspace, such as airports, as part of its Low Altitude Authorization and Notification Capability (LAANC) initiative. One of those nine companies is DJI, along with Aeronyde, Airbus, AiRXOS, Altitude Angel, Converge, KittyHawk, UASidekick and Unifly. It doesn't mean operators can fly those brands' drones over airports anytime they want, though -- it only means that professional drone pilots can now get authorization to enter controlled airspace in near-real time instead of waiting for months. A pilot that's going to use a drone to conduct an inspection, capture photos and videos or herd birds away from airports, for instance, can now send their applications to fly in controlled airspace to LAANC. The program then processes their applications in near-real time, designating the locations within that airspace they can use, along with the altitudes they can fly in. LAANC makes sure the drones won't be able to go anywhere near planes, in case the location is an airport, and will inform the FAA Air Traffic of the permissions it granted.”
The article continued to state that:
“Before LAANC, using drones for productive work near many airports required detailed applications and up to months of waiting, even when the benefits were clear and safety was prioritized," DJI Program Manager Brandon Montellato explained. "Now, LAANC allows easy drone use in more than 2,000 square miles near airports, including many populated areas that can benefit tremendously from drone operations."
UAS Traffic Management
UAS Traffic Management (UTM)
Another major effort by the FAA to understand and manage the integration of UAS into the NAS is the UAS Traffic Management (UTM) initiative led by NASA. UTM is a research platform to manage large numbers of drones flying at low altitude alongside other airspace users. The following few sections about UTM are adapted from the NASA website.
What is Unmanned Aircraft Systems Traffic Management?
Ever wonder what the skies will look like in the next five to 10 years? Can you imagine stepping onto your balcony on a sunny day, seeing drones buzzing around? They could be delivering food and goods to doorsteps, hovering around backyards for family fun or over highways for traffic monitoring. As of 2021, more than 873,000 unmanned aircraft systems (UAS), commonly referred to as drones, were registered to fly in the United States – and their numbers are increasing quickly. Many have questions about how such a big change to the airspace will affect our lives and safety.
NASA’s Ames Research Center in California’s Silicon Valley set out to create a research platform that will help manage large numbers of drones flying at low altitude along with other airspace users. Known as UAS Traffic Management, or UTM, the goal is to create a system that can integrate drones safely and efficiently into air traffic that is already flying in low-altitude airspace. That way, package delivery and fun flights won’t interfere with helicopters, airplanes, nearby airports or even safety drones being flown by first responders helping to save lives.
The system is a bit different than the air traffic control system used by the Federal Aviation Administration for today’s commercial airplanes. UTM is based on digital sharing of each user's planned flight details. Each user will have the same situational awareness of the airspace, unlike what happens in today’s air traffic control. The multi-year UTM project continued NASA’s long-standing relationship with the FAA. Throughout the collaboration, Ames has provided research, development and testing to the agency, which is being put to use in the real world. NASA led the UTM project along with more than 100 partners across various industries, academia and government agencies committed to researching and developing this platform.
How did the research work?

UTM research was broken down into four phases called TCLs, technical capability levels, each increasing in complexity and with specific technical goals that helped demonstrate the system as the research progressed.
TCL1: Completed in August 2015 and serving as the starting point of the platform, researchers conducted field tests addressing how drones can be used in agriculture, firefighting and infrastructure monitoring. The researchers also worked to incorporate different technologies to help with flying the drones safely such as scheduling and geofencing, which restricts the flight to an assigned area.
TCL2: Completed in October 2016 and focused on monitoring drones that are flown in sparsely populated areas where an operator can't actually see the drones they're flying. Researchers tested technologies for on-the-fly adjustment of areas that drones can be flown in and clearing airspace due to search-and-rescue or for loss of communications with a small aircraft.
TCL3: Conducted during spring 2018, this level focused on creating and testing technologies that will help keep drones safely spaced out and flying in their designated zones. The technology allows the UAS to detect and avoid other drones over moderately populated areas.
TCL4: From May through August 2019, the final level demonstrated how the UTM system can integrate drones into urban areas. Along with a larger population, city landscapes present their own challenges: more obstacles to avoid, specific weather and wind conditions, reduced lines of sight, reduced ability to communicate by radio and fewer safe landing locations. TCL4 tested new ways to address these hurdles using the UTM system and technologies onboard the drones and on the ground. These included incorporating more localized weather predictions into flight planning, using cell phone networks to enhance drone traffic communications and relying on cameras, radar and other ways of “seeing” to ensure drones can maneuver around buildings and land when needed – all while communicating with other drones and users of the UTM system.
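The geofencing concept that entered the research in TCL1, which restricts a flight to an assigned area, can be sketched with a simple distance check. The following is an illustrative example only (the coordinates, fence radius, and function names are invented); real geofences may use polygons, altitude limits, and buffer zones rather than a single circle.

```python
import math

# Illustrative geofence check in the spirit of the TCL1 concept:
# restrict flight to an assigned circular area around a home point.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(drone_lat, drone_lon, home_lat, home_lon, radius_m):
    """True if the drone is within radius_m of the home point."""
    return haversine_m(drone_lat, drone_lon, home_lat, home_lon) <= radius_m

# A drone ~111 m north of home stays inside a 500 m fence;
# a drone ~1.1 km north has breached it and should trigger a return-to-home.
print(inside_geofence(40.001, -77.0, 40.0, -77.0, 500))
print(inside_geofence(40.01, -77.0, 40.0, -77.0, 500))
```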
The UTM team invented a totally new way to handle the airspace: a style of air traffic management where multiple parties, from government to commercial industry, work together to provide services. UTM’s research results were transferred incrementally to the FAA, which continues testing and, with industry partners, is implementing the system. By the time the project officially wrapped up in May 2021, several efforts had emerged to push this line of research into other realms, including managing traffic for the flying taxis envisioned for our cities and flights of jets and even balloons at very high altitudes not currently covered by traditional air traffic management.
This partnership between research and regulation agencies, along with the input of thousands of experts and users will set the stage for the future of a well-connected sky. Drones will offer many benefits by performing jobs too dangerous, dirty or dull for humans to do, and NASA is helping navigate toward that future.
International Standards on UAS Operations
The First Global UAS Standards Developed by ISO
In November 2018, the International Organization for Standardization (ISO) released the first draft set of global standards for unmanned aircraft system (UAS) use. The draft standards, which are expected to be welcomed by the Federal Aviation Administration (FAA), suggest no-fly zones around airports and other restricted areas, along with geofencing measures to keep drones away from sensitive locations. The standards also call for drone operators to respect others' privacy and for a human-intervention fail-safe on all flights. They suggest that training, flight logging, and maintenance requirements should be in place, along with data protection rules. The draft standard comes in three parts and can be purchased from the ISO store.
Certificate of Waiver, Airworthiness Certificates, and Certificate of Authorization (COA)
As you may have noticed from the materials you reviewed in the previous section, no one is allowed to fly a UAS without prior approval from the FAA. Any UAS operation in the United States has to occur in one of two ways. Either the UAS belongs to a public agency (i.e., governmental), which then requires a COA or operates under Part 107 rules, or it belongs to a civilian entity and therefore requires adherence to Part 107 rules and perhaps a special airworthiness certificate or a waiver. For manned aircraft, the FAA requires several basic steps to obtain an airworthiness certificate in either the Standard or Special class. The FAA may issue an applicant an airworthiness certificate when:
- the registered owner or operator/agent registers the aircraft,
- the applicant submits an application through the dedicated portal, and
- the FAA determines the aircraft is eligible and in a condition for safe operation.
The process for a UAS is different for the time being, as it is approached through either a COA or a special airworthiness certificate, as discussed above. For a UAS, the FAA may consider an airworthiness letter like the following:
"To Whom It May Concern:
The eBee small Unmanned Aircraft System has been inspected and reviewed on behalf of XY organization by qualified individuals and a determination has been made based on testing data and evaluation data provided by the manufacturer that the aircraft is serviceable and airworthy for the intended use as advertised by the manufacturer, subject to the warrantees and representations offered by said manufacturer.
Sincerely,
John Doe, System Engineer, XY organization"
Just to reiterate, the process of requesting a UAS operation within the territorial airspace of the United States (the airspace above the contiguous United States, Alaska, Hawaii, U.S. territories, and U.S. territorial waters) differs depending on whether the applicant is a public agency or a civilian entity. The method of operational approval is the issuance of a COA for public aircraft operations; civilian operators either operate under Part 107 for UAS that weigh less than 55 lbs or apply for an exemption under the Special Authority for Certain Unmanned Systems (49 U.S.C. §44807). A Special Airworthiness Certificate is needed for civil operations under certain conditions. The FAA website allows civil users to apply for a COA through a dedicated portal, although a COA is no longer required for most civil operations. This form shows the actual web interface for the COA application and the required materials; all applicants must provide the required submissions through the portal. To apply for a COA, go to the FAA UAS Civil COA Portal. You will need to create an account on the FAA website before you proceed with your application. If you are planning to apply for a COA, be prepared to provide the following materials and information through the portal and/or later if the FAA asks for them:
Sample Certificate of Authorization Application
This link provides a sample COA application published by the FAA: Sample COA application form from the FAA website.
Certificate of Authorization Application Components
Make sure that your COA application provides the FAA with the following components:
- Applicant Contact Information:
- 333 Exemption number if any
- name, address, phone, and email
- Purpose of the operation and the requested exemption, if any
- Aircraft System
- description
- picture
- airframe
- dimensions
- power source
- electric, internal combustion, etc.
- weight (gross takeoff weight)
- avionics
- performance
- endurance
- range
- speed
- operating altitude
- turning radius
- climb/descent rates
- Airworthiness statement and documentation
- Communications
- pilot/operator to aircraft
- pilot/operator to observers
- communication within the airspace
- air traffic control procedures and frequencies
- local airspace frequencies
- video/data
- FCC approved frequencies
- ranges
- operational communications range
- backup communication
- radios
- cell phones/landlines
- Ground Control Station
- description
- picture
- capabilities
- setup and operations
- frequency management
- FCC approved equipment
- range
- ConOps, Emergency Procedures, and Risk Mitigation
- identify, control, and document potential hazards (examples)
- human factors
- machine
- media
- management
- mission
- assess the risks (examples)
- weather-induced flight cancellation or termination
- lost communication
- loss of payload
- aircraft fire
- analyze risk control measures (examples)
- mishap notification
- access and contact procedures to fire and rescue
- programmed procedure for lost links
- make control decisions
- implementation of risk controls
- review and improvement process
- Flight Operations Area and Time
- COA Location
- boundary points (or center point if circular) recorded in coordinates
- Time of Day
- Launch and Recovery Points
- launch and recovery points recorded in coordinates
- Lost Link/Rally Points
- lost link/rally points recorded in coordinates
- Description of airspace class(es) (A, C, D, E, or G) for proposed operations and surrounding area
- Map and/or aeronautical chart depicting the flight operations in relation to ground references and airspace
- VFR Chart
- Aerial image (i.e. Google Earth/Maps)
- Show COA area with boundaries clearly marked.
- Show planned launch, recovery, and rally points.
- FAA coordination and concept of operations plan (Flight Standards District Office, Air Traffic Control Tower, etc.)
- Planned nominal flight operations in proposed airspace
- Altitude (minimum and maximum)
- Security for crew
- Launch and Recovery Procedures
- Primary/planned launch and recovery location(s)
- Launch and recovery checklists and procedures
- Takeoff and Landing or Fixed Wing launch and recovery methods
- Launch (examples)
- Hand-held
- Rail
- Catapult
- Support vehicles/equipment
- Weather limits
- Recovery (examples)
- Net
- Grass
- Weather limits
- Landing speed
- Lost Communications Procedure
- Return to Base (RTB) procedures
- Lost communication between pilot and Air Traffic Controllers
- Lost communication between pilot and observers
- Lost Link Mission Procedures
- Internal navigation systems failure
- Control systems failure
- Low/lost battery
- Length of time to identify lost link
- Procedure to re-establish link
- Procedure if link is not re-established
- Platform actions if link is lost
- Notification procedures in the event of lost link
- Operator (Pilot) and Visual Observers
- Crew qualifications
- Pilot certifications
- Aircraft currency
- Required currency of medicals
- Platform training and currency
- Crew resource management (CRM) approach
- Communications and coordination for operations
- Provision for UAS operations (single platform at a time)
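Several of the components above (COA location boundary points, launch/recovery points, and lost-link/rally points recorded in coordinates) lend themselves to an automated sanity check before submission: every operational point should fall inside the declared boundary. The sketch below, with made-up coordinates and a standard ray-casting point-in-polygon test, shows one way such a check might look; it is illustrative only and not part of any FAA process.

```python
# Sketch: verify that launch, recovery, and rally points fall inside the
# declared COA boundary polygon. Coordinates are invented for illustration.

def point_in_polygon(pt, polygon):
    """Ray-casting test: toggle 'inside' each time a horizontal ray from pt
    crosses a polygon edge."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's latitude
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# COA boundary as (lon, lat) corner points, here a simple rectangle
boundary = [(-77.90, 40.85), (-77.80, 40.85), (-77.80, 40.90), (-77.90, 40.90)]

launch = (-77.85, 40.87)  # inside the boundary
rally = (-77.95, 40.87)   # outside the boundary: should be flagged
print(point_in_polygon(launch, boundary))
print(point_in_polygon(rally, boundary))
```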
Part 107 Certificate of Waiver
Public agencies or private individuals or businesses that want an exemption to fly a UAS under certain conditions can apply for a Certificate of Authorization (COA). The introduction of Part 107 removed many hurdles to operating a civilian UAS under many conditions. However, for conditions that are not listed or described directly under the Part 107 regulations, a civilian operator can apply for a waiver. The FAA states: "A waiver is an official document issued by the FAA which approves certain operations of aircraft outside the limitations of a regulation. You may request to fly specific drone operations not allowed under part 107 by requesting an operational waiver. These waivers allow drone pilots to deviate from certain rules under part 107 by demonstrating they can still fly safely using alternative methods." The following table illustrates the conditions under which one needs to apply for a waiver to operate under Part 107.
List of operations that require a waiver under Part 107 (source FAA)
How To Apply For a Waiver?
One can apply for a Part 107 waiver through the FAA website. The FAA details the guidelines for the waiver application and the required information. Pay close attention to the "Waiver Safety Explanation Guidelines for Part 107 Waiver Applications" that you may encounter in the DroneZone operational waiver application. For the waiver application, the FAA requires extensive details on:
- Describing the proposed operation
- Describing possible operational risks and methods to mitigate those risks
The following items are required for the "Waiver Safety" part of the application as adopted from the FAA website:
Describe Your Proposed Operation(s)
Operational Details
- Where do you plan to operate?
- Consider providing latitude/longitude and a detailed map of your planned flight area.
- How high will you fly your aircraft (maximum altitude above ground level)?
- Do you want to fly in controlled airspace (Class B, C, D, surface E)?
- If yes, please see 14 CFR §107.41 and our Flying Drones Near Airports (Controlled Airspace) – Part 107 page
- Are there any other kinds of airspace within 5 miles of any planned flight area?
- What kind of area(s) will you fly over?
- For example: rural, sparsely populated, congested, populated, a neighborhood, within city limits, large outdoor gathering of people, a restricted access site, etc.
Small UAS Details
- What kind of UAS will you use to fly the operations requested in this application?
- For example: multi-rotor, fixed wing, hybrid (both multi-rotor and fixed wing), single rotor, lighter than air, etc.
- What is your UAS's power or energy source in flight?
- What is your UAS's maximum flight time (in minutes), range (in feet), and speed (in miles per hour)?
- How big is the aircraft (length/width/height in inches)?
- How do you ensure the aircraft only flies where it is directed (i.e. ensure containment)?
- For example: geo-fencing, tether, etc.
- What kind of termination system, if any, does the UAS have?
- For example immediate flight termination switch
- How much will the aircraft and its payload weigh when flying?
- If the aircraft carries any external or internal load (or object), how is the load secured?
- What, if any, external or internal load (or object) could be dropped from the aircraft when flying, and how will you assure the safety of people, or other people's property, if it is dropped or detached when flying?
Pilot/Personnel Details
- What minimum level of experience will the Remote Pilot in Command (Remote PIC) have to fly under this waiver?
- How many personnel (including the Remote PIC) will you use for operations under this waiver (minimum needed)?
- What kind of training, if any, will personnel (e.g. visual observer(s)) have prior to flying under the waiver?
- How will the personnel be trained?
- How will the Responsible Person know the other personnel are competent and have operational knowledge to safely fly the UAS under the waiver conditions?
- If personnel will be tested, what kind of testing will be performed, and how will evaluations be conducted and documented?
- How will personnel maintain the knowledge/skill to fly under this waiver? Will recurrent training or testing be required?
Describe Operational Risks and Mitigation
Provide, to the greatest extent possible, how you propose to address or lessen the possible risks of your proposed operation. This could include using operating limitations, technology, additional training, equipment, personnel, restricted access areas, etc. When reviewing the questions for each section below, the FAA's primary concerns are:
- How you will ensure your operation(s) remains safe at all times, even in unusual circumstances.
- What kinds of circumstances could arise, and how you plan to handle each.
The following questions are associated with each waivable section of part 107. Only answer the questions for the regulatory section applicable to the application you will submit:
- 107.25 Operations from a moving vehicle or aircraft
- 107.29 Daylight operation
- 107.31 Visual line of sight aircraft operation
- 107.33 Visual observer
- 107.35 Operation of multiple small unmanned aircraft
- 107.37 Operation near aircraft
- 107.39 Operation over people
- 107.51(a) Operating limitations: ground speed
- 107.51(b) Operating limitations: altitude
- 107.51(c) Operating limitations: minimum visibility
- 107.51(d) Operating limitations: minimum distance from clouds
NOTE: The list of questions may not be all-inclusive. You may need to provide additional information based on your specific operation.
To Do:
- Read chapter 4 of the textbook "Introduction to Unmanned Aircraft Systems."
- Review the FAA national policy on the Unmanned Aircraft Systems (UAS) operational approval.
- Review the sample COA applications, which you may benefit from when preparing your own COA application.
- Review the samples of valid operational Part 107 waivers.
Remote Identification of Unmanned Aircraft Systems
The first few sections below were adapted from the FAA website with minimal modifications. The last section reflects the latest updates on UAS remote identification as presented in the FAA/AUVSI webinar on September 16, 2021. The original proposed rules were published in the Federal Register on December 31, 2019. On January 16, 2020, the FAA provided a briefing to the House of Representatives and Senate Aviation Subcommittee regarding the Remote Identification of Unmanned Aircraft Systems notice of proposed rulemaking (84 FR 72438). It is a useful overview of the entire matter of UAS identification and is worth watching.
UAS Remote Identification
Drones or unmanned aircraft systems (UAS) are fundamentally changing aviation, and the FAA is committed to working to fully integrate drones or UAS into the National Airspace System (NAS). Safety and security are top priorities for the FAA, and Remote Identification (Remote ID) of UAS is crucial to our integration efforts.
What is Remote ID?
Remote ID is the ability of a UAS in flight to provide identification information that can be received by other parties. According to the proposed rulemaking, there are three ways drone pilots will be able to meet the identification requirements of the remote ID rule (see Figure 6.2):
- By flying a standard remote ID drone (transmit to remote ID USS and broadcast).
- By flying a limited remote ID drone (transmit to remote ID USS) within the visual line of sight.
- By flying a drone without remote ID capability within visual line of sight at an FAA-Recognized Identification Area (FRIA). Drones not equipped with remote ID do not need to broadcast or transmit to a remote ID USS while within a FRIA. Only community-based safety organizations (CBOs) can apply to establish a FRIA.
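To make the idea of "identification information received by other parties" concrete, the sketch below models the kinds of fields the rule calls for (identity, aircraft location and altitude, and control station location). The field names and values are invented for illustration; this is NOT the actual FAA or ASTM message format that compliant drones broadcast.

```python
from dataclasses import dataclass, asdict

# Illustrative Remote ID broadcast payload. Field names loosely follow the
# kinds of data the rule requires; this is not the real message format.

@dataclass
class RemoteIDMessage:
    uas_id: str        # serial number or session ID identifying the aircraft
    lat: float         # aircraft latitude (degrees)
    lon: float         # aircraft longitude (degrees)
    alt_ft: float      # aircraft geometric altitude (feet)
    ctrl_lat: float    # control station latitude (degrees)
    ctrl_lon: float    # control station longitude (degrees)
    timestamp_s: int   # UNIX time the message was broadcast

# A receiver (e.g., law enforcement) could decode and inspect such a message:
msg = RemoteIDMessage("SN-0001", 40.79, -77.86, 180.0, 40.79, -77.87, 1694876400)
print(asdict(msg))
```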

Why Do We Need Remote ID?
Remote ID would assist the FAA, law enforcement, and Federal security agencies when a UAS appears to be flying in an unsafe manner or where the drone is not allowed to fly.
The development of Remote ID builds on the framework established by the small UAS registration rule and the LAANC capability to lay the foundation of an Unmanned Aircraft System Traffic Management System (UTM) that is scalable to the national airspace.
Notice of Proposed Rule Making:
The Remote Identification proposed rule provides a framework for the remote identification of all UAS operating in the airspace of the United States. The rule would facilitate the collection and storage of certain data, such as identity, location, and altitude, regarding an unmanned aircraft and its control station.
Once published, the FAA will solicit comments from the public to better inform its rulemaking process. The FAA posts these comments, without edit, including any personal information the commenter provides, to the Regulations.gov website. The docket number is FAA-2019-1100.
Remote ID Cohort:
The goal of the FAA Remote ID Cohort is to develop the technology requirements applicable to FAA-qualified remote ID UAS service suppliers.
What's next?
Remote ID is the next step to enable safe, routine drone operations across our nation. This capability will enhance safety and security by allowing the FAA, law enforcement, and Federal security agencies to identify drones flying in their jurisdiction.
What has the FAA done?
In December 2018, the FAA issued a Request for Information (RFI) to establish an industry cohort to explore potential technological solutions for Remote ID.
The UAS Identification and Tracking Aviation Rulemaking Committee (ARC), chartered by the FAA in June 2017, submitted its report and recommendations (PDF) to the agency on technologies available to identify and track drones in flight and other associated issues.
Latest Update on Remote ID
During Drone Safety Awareness Week, the FAA and AUVSI jointly offered the webinar "National Drone Safety Awareness Week: Remote ID Compliance Strategy" on September 16, 2021. The webinar discussed how the new rules regarding Remote ID will impact the industry and create safer skies for all. The webinar also discussed how the industry is reacting to the coming new rules to ensure compliance in the short and long term. You can watch the recording of the session on YouTube. You can also review the presentation slides posted on CANVAS.
FAA Policy on Remote ID Enforcement: Drone pilots were originally expected to comply with the September 16, 2023, compliance date for Remote ID. However, the FAA understands that some drone pilots may not be able to comply because of the limited availability of broadcast modules and the lack of approved FAA-Recognized Identification Areas. In those instances, the FAA will consider all factors in determining whether to take enforcement action through March 16, 2024. Access the Federal Register to read the full policy.
Application for the Certificate of Authorization (COA) or Part 107 Waiver
In this section, you are expected to develop and submit the required materials for the COA or Part 107 waiver application for the platform you selected in the activities of Lesson 1. It is helpful to review previously submitted COA or Part 107 waiver applications available on the FAA website before preparing your own documentation, so you can become familiar with the format, required materials, and depth of information. The following is a brief list of the materials you may need in order to complete the COA or Part 107 waiver application for your platform:
- Aircraft system
- Communications
- Ground Control station
- Emergency procedures
- Flight operations area
- Launch and recovery
- Lost communications
- Lost link mission
- Operator and visual observers
Make sure to incorporate risk mitigation strategies and address the integration of automation and autonomy in your system in the various sections as appropriate. More details on the information required for a COA or Part 107 waiver application can be found in the template provided. The link to the FAA site provided above also provides plenty of examples on COA and Part 107 waiver applications.
To Read
- Visit the Federal Aviation Administration website to review several COAs issued by the FAA. Make sure that you review several applications from various organizations and note differences in their platforms and procedures.
To Do
- Review the information provided in this COA information template document before completing your COA application.
- Review the samples of issued COAs that are provided to you in the module entitled "Example_COAs_from_FAA".
- Visit the FAA website and review currently issued Part 107 waivers.
Summary and final tasks
Summary
Congratulations! You've finished Lesson 7, Aviation Regulatory and Certificate of Authorization Process (COA). I hope you digested the materials well, as they are essential to understanding the circumstances of operating any UAS in the U.S. The exercise of developing your own COA or Part 107 waiver application will enable you to manage a UAS operation, as it has provided you with crucial knowledge about the logistics and safety concerns of UAS operations. The exercise not only familiarized you with FAA rules and regulations, but also gave you the necessary technical knowledge about the different sub-systems of a UAS.
Final Tasks
| # | Task |
|---|---|
| 1 | Study Lesson 7 materials and the textbook chapters assigned to the lesson |
| 2 | Complete Lesson 7 Quiz |
| 3 | Start your first post for the discussion on "FAA Road map" |
| 4 | Continue working on the COA Application and the Final Project Report |
| 5 | Start UAS Data Processing Using Pix4D for exercise 4 |
| 6 | Submit your Pix4D processing materials for exercise 2 |
| 7 | Attend the weekly call on Thursday evening at 8:00pm ET |
Lesson 8: Geospatial Data Quality, Accuracy, and Mapping Standards
Lesson 8 Introduction
Introduction
Evaluating the quality and accuracy of geospatial data is one of the most important topics among geospatial data users. Geospatial data are used for diverse applications, including engineering and infrastructure positioning. Knowing how accurate the measurements derived from geospatial data are can be a matter of life or death in some applications, such as when an excavation team inaccurately locates a gas pipeline. In this lesson, you will be introduced to various statistical concepts related to determining geospatial data accuracy. You will also learn about the latest map accuracy standards for digital geospatial data published by the American Society for Photogrammetry and Remote Sensing (ASPRS).
Learning Objectives
At the successful completion of this lesson, you should be able to:
- understand basic statistical terms used to express product accuracy.
- understand errors in geospatial data.
- understand different types of accuracy.
- differentiate between different errors in geospatial data.
- describe factors affecting geospatial product accuracy.
- practice accuracy computations.
- understand the ASPRS positional accuracy standards.
Lesson Readings
Google Drive (Open Access)
- ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2, version 2 (2024)
- ASPRS Highlight Article “Best Practices in Evaluating Geospatial Mapping Accuracy according to the New ASPRS Accuracy Standards”
- ASPRS Highlight Article “Overview of the ASPRS Positional Accuracy Standards for Digital Geospatial Data EDITION 2, VERSION 2 (2024)”
Lesson Activities
- Study lesson 8 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Complete quiz 8
- Submit your COA Application
- Complete your discussions for the assignment on "FAA Roadmap."
- Complete your discussions for the assignment on "Differences Between Rules and Regulations."
- Attend the weekly call on Thursday evening at 8:00 pm ET
- Practice computing product accuracy for each of the three data processing exercises.
Geospatial Data Accuracy and Quality and Mapping Standards
Introduction:
Evaluating the quality and accuracy of geospatial data is one of the most important topics among geospatial data users. Geospatial data are used for diverse applications, including engineering and positioning applications. Knowing how accurate the measurements derived from geospatial data are can be a matter of life or death in some applications, such as when locating gas pipelines. In this section, you will be introduced to various statistical concepts that are related to determining geospatial data accuracy. You will also learn about the latest map accuracy standards designed for digital geospatial data published by the American Society of Photogrammetry and Remote Sensing (ASPRS).
Metrics in Geospatial Production Process:
For any geospatial data product, collecting metrics about a dataset revolves around the following questions:
- How well does the map fit a national or a global coordinates system and datum?
- How well does the geometric and radiometric quality meet or depart from the client’s expectations or specifications?
- How well do these metrics fit a “standard” or what is considered standard within the geospatial industry?
Why are we concerned about accuracy?
Errors exist in any product we produce, no matter how accurate the instrument or the process we utilize, because no measuring instrument is perfect, not even laser instruments. Figure 1 illustrates common instruments used in surveying and mapping practice that we may assume are perfect measurement devices.

Errors in Measurements
There are two types of errors that concern us most in geospatial data generation: random error and systematic error. A third type, which we call a blunder, is not strictly an error, but we need to understand it and deal with it appropriately.
Random Error (or accidental error) is the type of error that occurs randomly in nature due to our, or the instrument's, inability to realize the true value. The true value in any measurement process is elusive and beyond our reach; in a measuring process, we are only estimating it. Random error can be reduced by training, experience, and improved quality, but it cannot be eliminated.
Systematic Error: The error that has a repeated, constant value and follows a mathematical logic. It can be reduced through calibration.
Blunders: A blunder is not an error; it is a mistake resulting from carelessness or negligence that resembles error. Common causes of blunders in surveying and mapping are:
- Measurement taken incorrectly
- Values misread from the measuring device (i.e. screen)
- Numbers transposed as they are recorded (696 vs. 969)
- Miscounting grid ticks
- Handwriting that is hard to read
- Values entered incorrectly into the computer
- Using the wrong datum and/or coordinate system
- Using the wrong units (meter versus US or international foot)
- Rounding numbers in recording the data
Facts on Error and Normal Distribution:
- Errors are unavoidable, but controllable;
- Any mapping process will have some variation of errors built in;
- No combination of machine and human can produce a product that is exactly the same each time;
- Biases should be removed prior to analysis;
- Small errors are more common than large errors;
- Errors are just as likely to be positive as negative;
- Large errors seldom occur and can only be so big. Blunders can be large.
Accuracy Defined
Accuracy: The closeness of results of observations, computations, or estimates of graphic map features to their true value or position on the ground.
Precision (Repeatability): The closeness with which measurements agree with each other.
Facts about Accuracy:
- True value is the theoretically correct or exact value of a quantity. The true value is elusive, and it cannot be reached given our human limitations; it is a matter related to metaphysics.
- Accuracy is part of the map metrics that need to be included in the metadata of any geospatial dataset.
To illustrate the concepts of accuracy and precision in a practical fashion, let us consider evaluating the results of the four shooting sessions of Figure 2 that the sharp dart shooter completed at different times. In session A, the shooter’s shots seem to be scattered around the bullseye. He/she managed to get the shots around the targeted spot, or the bullseye, but failed to land them close to each other, i.e. they are scattered apart. To evaluate such a session, we say the shooter was accurate as he/she stayed close to the bullseye, but not precise, as the shots were not close to each other. In session B, we would say the shooter managed to cluster all shots in one spot, so he/she was precise but far away from the bullseye, so he/she was not accurate. Accordingly, in session C, he/she was accurate and precise, while in session D the shooter was neither accurate nor precise. To illustrate the concept of biases in measurements, let us analyze sessions B and C. Assuming the two sessions were shot by the same shooter, it is obvious that the shooter performed perfect shots in both sessions but that his/her shots in session B were biased due to mechanical misalignment of the bow or the gun, if a gun was used. Such misalignment of the bow, the gun barrel, or the sight scope caused the shots to be systematically directed to the wrong position instead of the bullseye, causing a bias in the shots. Once proper calibration is made to these mechanical defects, the bias is then removed and all the shots will perfectly fall around the bullseye, like in session C.

To evaluate the shooter results using probability and density distribution terms, the results of session B are equivalent to the random distribution 3 of Figure 3, precise but not accurate, assuming the most probable value of the bullseye is represented by p on the x-axis. The results of session A, however, resemble the distribution 2 of Figure 3, accurate but not precise. For more information on the subject, please watch this NGS video.

The Ever-confusing Statistical Terms
To illustrate the different statistical terms we usually run into when discussing data accuracy, let us consider five error values (3 in., 2 in., 1 in., 5 in., and 4 in.) that were calculated on a population of data.
- Mean (average) = (3 + 2 + 1 + 5 + 4) / 5 = 3 in.
- Range = the distance between the largest error and the smallest error, i.e., 5 − 1 = 4 in.
- Variance = a measure of spread or dispersion around the mean; it is the mean of the squared deviations from the mean, here 2 in.²

Here, in.² is a meaningless unit, and a better statistical term to use is the standard deviation.

- Standard deviation, also called one-sigma (σ), is the square root of the variance: √2 ≈ 1.41 in.
- Root Mean Square Error (RMSE) is the square root of the mean of the squared errors.
- RMSE is not the standard deviation or sigma; they are different.

Root Mean Square Error (RMSE) is computed as follows:

RMSE = √( Σ (Z − Zi)² / n )

Where,
Z = measured value from the data
Zi = control value (field surveyed)
n = number of measurements
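To make these definitions concrete, here is a minimal Python sketch (purely illustrative) that computes each term for the five example error values above:

```python
import math

errors = [3.0, 2.0, 1.0, 5.0, 4.0]  # the five example error values, in inches
n = len(errors)

mean = sum(errors) / n                                   # 3.0 in.
value_range = max(errors) - min(errors)                  # 4.0 in.
variance = sum((e - mean) ** 2 for e in errors) / n      # 2.0 in^2
std_dev = math.sqrt(variance)                            # one-sigma, ~1.41 in.
rmse = math.sqrt(sum(e ** 2 for e in errors) / n)        # ~3.32 in.

# RMSE and standard deviation differ whenever the mean error (bias) is nonzero:
# RMSE^2 = variance + mean^2
print(mean, value_range, std_dev, rmse)
```

Note that the RMSE (≈3.32 in.) is much larger than the standard deviation (≈1.41 in.) here, because these errors carry a bias of 3 in.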
Relationship Between Standard Deviation and Root Mean Square Error (RMSE)
Facts about RMSE:
- Includes random and systematic errors
- More useful to use as it reveals biases (systematic error)
- It tells us how accurate the data is
Facts about Standard Deviation:
- Includes only random error
- Reflects only how precise the data is
- It does not tell us how accurate the data is in the presence of biases. It only tells us how precise the data is.
Table 1 illustrates the difference between the standard deviation and the RMSE in revealing the presence of biases in measurements. The table represents a vertical accuracy evaluation for a point cloud derived from UAS imagery, performed by comparing it to a higher-accuracy elevation model derived from a mobile lidar mapping system. The UAS-derived elevation model needed to meet 5 cm (0.164 ft) accuracy. If we used the standard deviation alone, the data would meet the specifications with a value of 0.076 ft. However, the high value of the mean, 0.246 ft (7.5 cm), makes it obvious that this data set contains a bias, and the only way to catch it is by either evaluating the value of the mean or using the RMSE as the accuracy measure. The high value of the RMSE, 0.257 ft (7.83 cm), will flag the data as not meeting specifications. The far-right column contains the error values after removing the bias of 0.246 ft (7.5 cm) from the measurements. Once we remove the bias, the values of the RMSE and the standard deviation are equal, and both meet the project accuracy specifications. Removing a bias from elevation data can be as simple as shifting the entire dataset up or down by the magnitude of the bias itself; such a practice is called a z-bump.
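The effect of removing a bias can be sketched in a few lines of Python; the elevation error values below are illustrative, not the actual Table 1 numbers:

```python
import math

# Illustrative elevation errors (ft) that contain a constant bias
errors = [0.30, 0.20, 0.25, 0.28, 0.22]
n = len(errors)

bias = sum(errors) / n                                         # the mean error
rmse = math.sqrt(sum(e ** 2 for e in errors) / n)              # flags the bias
std_dev = math.sqrt(sum((e - bias) ** 2 for e in errors) / n)  # blind to the bias

# Shift the entire dataset by the magnitude of the bias (the "z-bump")
debiased = [e - bias for e in errors]
rmse_after = math.sqrt(sum(e ** 2 for e in debiased) / n)

print(round(rmse, 3), round(std_dev, 3), round(rmse_after, 3))
```

Before the shift, the RMSE is far larger than the standard deviation; afterward the two are equal, mirroring the behavior described for Table 1.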

Table 1 Vertical Accuracy Tabulation of Geospatial Product
Normal Distribution Curve
In randomly distributed repeated measurements, measurement values vary around the mean, or average, with most values being closer to the average. Deviation from such behavior indicates the presence of bias(es), or perhaps blunders, in the measurements. Figure 4 shows a true random distribution of a set of measurements that contain no biases. For the measurement distribution in Figure 4, notice that 68.2% of the measured values fall within +/- 1 RMSE, or +/- 1 sigma, of the mean value, that is, 34.1% on each side of the mean. Notice also that 95% of the measurements fall within +/- 2 RMSE, or +/- 2 sigma, of the mean. Understanding this distribution is essential to understanding the map accuracy standards discussed in the following sections.
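A quick simulation, sketched here with Python's standard library, confirms the 68.2% / 95% proportions for bias-free, normally distributed errors:

```python
import random

random.seed(42)                      # reproducible draw
sigma = 1.0
samples = [random.gauss(0.0, sigma) for _ in range(100_000)]

within_1sigma = sum(abs(x) <= 1.0 * sigma for x in samples) / len(samples)
within_2sigma = sum(abs(x) <= 2.0 * sigma for x in samples) / len(samples)

print(within_1sigma, within_2sigma)  # approximately 0.682 and 0.954
```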
Common Error Estimation Terms
Table 2 lists the most common terms used to estimate errors in surveying and mapping. Probable error describes the confidence level within which 50% of the errors fall, while the 95% error represents the confidence level under which 95% of the measured error values fall.
| Error | % Error | Constant wrt σ |
|---|---|---|
| Probable Error | 50 | 0.6745 σ |
| Standard Error | 68.27 | 1.000 σ |
| 90% Error | 90 | 1.6449 σ |
| 95% Error | 95 | 1.9599 σ |
| 3σ Error | 99.73 | 3.0000 σ |
The different confidence levels (50% to 99.73% or 3 sigma) listed in Table 2 can be used to express the same accuracy level. For example, accuracy expressed via RMSE and at the 95% confidence level essentially reflects the same accuracy, differing only in their statistical confidence assignments.
To clarify these distinctions, consider the following example: In Figure 5, colored balls symbolize errors identified during an accuracy assessment using independent check points. Ball diameters indicate varying error magnitudes for each check point, while the funnel's spout diameter corresponds to the maximum allowable error for each statistical metric: the 50%, 90%, 95%, and 99.73% confidence levels. For instance, Funnel D's larger spout accommodates the greatest error, representing the 99.73% confidence level.
If users unfamiliar with these statistical terms are presented with various accuracy figures, they will likely select the smallest value, in this case the 6.74 cm associated with the 50% confidence level, as it suggests tighter accuracy. Conversely, producers might prefer the larger value of 30 cm at the 99.73% confidence level, anticipating greater flexibility. However, both selections rest on a misunderstanding: both values reflect the same underlying accuracy, differentiated solely by the proportion of checkpoints meeting the threshold. Specifically, for the 6.74 cm figure at the 50% confidence level, only half of the check points must meet the criterion, whereas at 30 cm and the 99.73% confidence level, nearly all must comply.
This nuanced distinction often leads to confusion among end users, prompting the decision to remove the 95% confidence level and rely exclusively on RMSE in the latest version of the accuracy standards of the American Society of Photogrammetry and Remote Sensing (ASPRS), which provides a clearer and more consistent metric for accuracy.
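The Table 2 multipliers make it easy to restate a single accuracy at different confidence levels; here is a sketch using σ = 10 cm, the value implied by the 6.74 cm and 30 cm figures in the funnel example:

```python
# Multipliers with respect to sigma, from Table 2
MULTIPLIERS = {
    "Probable Error (50%)": 0.6745,
    "Standard Error (68.27%)": 1.0000,
    "90% Error": 1.6449,
    "95% Error": 1.9599,
    "3-sigma Error (99.73%)": 3.0000,
}

sigma_cm = 10.0
for name, k in MULTIPLIERS.items():
    print(f"{name}: {k * sigma_cm:.2f} cm")
# The 6.74 cm (50%) and 30 cm (3-sigma) figures describe the same accuracy.
```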

Positional Errors and Accuracy
According to the ASPRS Positional Accuracy Standards for Digital Geospatial Data, the terms positional error and absolute and relative accuracy are defined as follows:
- Positional error – The difference between data set coordinate values and coordinate values from an independent source of higher accuracy for identical points.
- Positional accuracy – The accuracy of the position of features, including horizontal and vertical positions, with respect to a horizontal and vertical datum.
- Relative accuracy – A measure of variation in point-to-point accuracy in a data set; it characterizes the internal geometric quality of an elevation data set without regard to surveyed ground control.
The New ASPRS Positional Accuracy Standards for Digital Geospatial Data
In November 2014, the American Society of Photogrammetry and Remote Sensing (ASPRS) published Edition 1 of the first mapping accuracy standards designed solely for today's digital geospatial data. Edition 2, version 1 was published on August 23, 2023, followed by version 2 on June 24, 2024, which corrected some measures to suit today's technologies and processes and added six addenda on best practices and guidelines. As of today, the official version of the ASPRS accuracy standards is Edition 2, version 2 (2024).
Motivation Behind the New Standard:
- Legacy map accuracy standards, such as the ASPRS 1990 standard and the National Map Accuracy Standards (NMAS) of 1947, are outdated (over 30 years since ASPRS 1990 was written).
- Many of the data acquisition and mapping technologies that these standards were based on are no longer used.
- More recent advances in mapping technologies can now produce better quality and higher accuracy geospatial products and maps.
- Legacy map accuracy standards were designed to deal with plotted or drawn maps as the only medium to represent geospatial data.
- Within the past two decades (during the transition period between the hardcopy and softcopy mapping environments), most standard measures for relating GSD and map scale to the final mapping accuracy were inherited from photogrammetric practices using scanned film.
- New mapping processes and methodologies have become much more sophisticated with advances in technology and advances in our knowledge of mapping processes and mathematical modeling.
- Mapping accuracy can no longer be associated with camera geometry and flying altitude alone (focal length, xp, yp, B/H ratio, etc.).
- New map accuracy is influenced by many factors such as:
- the quality of camera calibration parameters;
- quality and size of a Charged Coupled Device (CCD) used in the digital camera CCD array;
- amount of imagery overlaps;
- quality of parallax determination or photo measurements;
- quality of the GPS signal;
- quality and density of ground controls;
- quality of the aerial triangulation solution;
- capability of the processing software to handle GPS drift and shift;
- capability of the processing software to handle camera self-calibration,
- the digital terrain model used for the production of orthoimagery.
These factors can vary widely from project to project, depending on the sensor used and the specific methodology. For these reasons, existing accuracy measures based on map scale, film scale, GSD, c-factor and scanning resolution no longer apply to current geospatial mapping practices.
- Elevation products from the new technologies and active sensors such as lidar, UAS, and IFSAR are not considered by the legacy mapping standards. New accuracy standards are needed to address elevation products derived from these technologies.
The New Standard Highlights
- Sensor agnostic, data driven: Positional Accuracy Thresholds which are independent of published GSD, map scale or contour interval
- It is All Metric!
- Unlimited Horizontal and vertical Accuracy Classes:
- Added additional Accuracy Measures
- Aerial triangulation accuracy,
- Ground controls accuracy,
- Orthoimagery seam lines accuracy,
- Lidar relative swath-to-swath accuracy,
- Recommended minimum Nominal Pulse Density (NPD)
- Horizontal accuracy of elevation data,
- Delineation of low confidence areas for elevation data
- Required number and spatial distribution of QA/QC check points based on project area
- Introduced a new accuracy type, the three-dimensional accuracy or 3D accuracy.
- Eliminated the use of 95% confidence level as an accuracy measure. RMSE is the only accuracy measure the new standards recognize and use.
- Factoring in the accuracy of the ground control and checkpoints survey when computing products accuracy.
- Added six addenda on best practices and guidelines for:
- General Best Practices and Guidelines
- Field Surveying of Ground Control and Checkpoints
- Mapping with Photogrammetry
- Mapping with Lidar
- Mapping with UAS
- Mapping with Oblique Imagery
Advantage of Specifying the New ASPRS Positional Accuracy Standards for Digital Geospatial Data for a Project
Users of the new standards do not have to specify accuracy details for the intermediate processes in product generation. They need only specify the accuracy of the final deliverable product, and the new standards set the accuracy specifications for the intermediate processes involved in producing it, such as the ground survey and aerial triangulation. Figure 6 illustrates this concept.

Horizontal Accuracy Standards for Geospatial Data
Horizontal Accuracy Standards for Geospatial Data AnonymousSome of the highlights of the new ASPRS Horizontal Accuracy Standards for Geospatial Data are the following:
Unlimited horizontal accuracy classes:
The new standard was designed to fit any horizontal accuracy requirement, no matter what technology, current or future, is used. Table 3 presents the new ASPRS horizontal accuracy classes.

Table 3 The new ASPRS horizontal accuracy classes

| Horizontal Accuracy Class | Absolute Accuracy RMSEH (cm) | Orthoimagery Mosaic Seamline Mismatch (cm) |
|---|---|---|
| #-cm | ≤ # | ≤ 2 × # |

where RMSEH = √(RMSEx² + RMSEy²), i.e., RMSEH = radial RMSE = circular RMSE = the two-dimensional RMSE of X and Y.
- Aerial triangulation results should be twice as accurate as the generated products:
  - For ortho and planimetric maps ONLY: RMSEH(AT) = ½ × RMSEH(Map) and RMSEV(AT) = RMSEH(Map)
  - For ortho, planimetric, and elevation maps: RMSEH(AT) = ½ × RMSEH(Map) and RMSEV(AT) = ½ × RMSEV(DEM)
- Control points for aerial triangulation should be twice as accurate as the generated product:
  - For ortho and planimetric maps ONLY: RMSEH(GCP) = ½ × RMSEH(Map) and RMSEV(GCP) = RMSEH(Map)
  - For ortho/planimetric and elevation maps: RMSEH(GCP) = ½ × RMSEH(Map) and RMSEV(GCP) = ½ × RMSEV(DEM)
Table 4 lists common horizontal accuracy classes for geospatial mapping products.
| Horizontal Accuracy Class RMSEx and RMSEy (cm) | RMSEr (cm) | Orthoimage Mosaic Seamline Maximum Mismatch (cm) |
|---|---|---|
| 0.63 | 0.9 | 1.3 |
| 1.25 | 1.8 | 2.5 |
| 2.50 | 3.5 | 5.0 |
| 5.00 | 7.1 | 10.0 |
| 7.50 | 10.6 | 15.0 |
| 10.00 | 14.1 | 20.0 |
| 12.50 | 17.7 | 25.0 |
| 15.00 | 21.2 | 30.0 |
| 17.50 | 24.7 | 35.0 |
| 20.00 | 28.3 | 40.0 |
| 22.50 | 31.8 | 45.0 |
| 25.00 | 35.4 | 50.0 |
| 27.50 | 38.9 | 55.0 |
| 30.00 | 42.4 | 60.0 |
| 45.00 | 63.6 | 90.0 |
| 60.00 | 84.9 | 120.0 |
| 75.00 | 106.1 | 150.0 |
| 100.00 | 141.4 | 200.0 |
| 150.00 | 212.1 | 300.0 |
| 200.00 | 282.8 | 400.00 |
| 250.00 | 353.6 | 500.0 |
| 300.00 | 424.3 | 600.0 |
| 500.00 | 707.1 | 1000.0 |
| 1000.00 | 1414.2 | 2000.0 |
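Every row of Table 4 follows the same two rules: RMSEr = √2 × RMSEx (with RMSEx = RMSEy) and a seamline mismatch limit of 2 × RMSEx. A sketch that reproduces a few rows:

```python
import math

def horizontal_class_row(class_cm: float) -> tuple[float, float]:
    """Return (RMSEr, max seamline mismatch) for an accuracy class where
    RMSEx = RMSEy = class_cm, following the Table 4 pattern."""
    rmse_r = math.sqrt(2.0) * class_cm     # radial (circular) RMSE
    seamline_max = 2.0 * class_cm          # orthoimage mosaic seamline limit
    return rmse_r, seamline_max

for c in (5.0, 10.0, 30.0):
    r, s = horizontal_class_row(c)
    print(f"class {c:.2f} cm -> RMSEr {r:.1f} cm, seamline <= {s:.1f} cm")
```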
Vertical Accuracy Standards for Geospatial Data
Vertical Accuracy Standards for Geospatial Data AnonymousSome of the highlights of the new ASPRS Vertical Accuracy Standards for Geospatial Data are the following:
5. Unlimited vertical accuracy classes:
The new standard was designed to fit any vertical accuracy requirement, no matter what technology, current or future, is used. Table 5 represents the new ASPRS vertical accuracy classes.
| Vertical Accuracy Class | Absolute Accuracy | Data Internal Precision (where applicable) | |||
|---|---|---|---|---|---|
| NVA RMSEv (cm) | VVA RMSEv (cm) | Within-Swath Smooth Surface Precision Max Diff (cm) | Swath-to-Swath Non-Vegetated RMSDz (cm) | Swath-to-Swath Non-Vegetated Max Diff (cm) | |
| #-cm | ≤ # | As found | ≤ 0.60*# | ≤ 0.80*# | ≤ 1.60*# |
- Non-vegetated Vertical Accuracy (NVA) for any part of the project that is not covered by vegetation.
- Vegetated Vertical Accuracy (VVA) for the part of the project that is partly or fully covered by vegetation.
6. The standards introduced relative accuracy for elevation data, in addition to absolute accuracy.
Table 6 addresses this new accuracy term, relative accuracy, which mainly applies to lidar-derived elevation data. The table also provides vertical accuracy examples and other quality criteria for ten common vertical accuracy classes.
| Vertical Accuracy Class | Absolute Accuracy | Data Internal Precision (where applicable) | |||
|---|---|---|---|---|---|
| NVA RMSEv (cm) | VVA RMSEv (cm) | Within-Swath Smooth Surface Precision Max Diff (cm) | Swath-to-Swath Non-Vegetated RMSDz (cm) | Swath-to-Swath Non-Vegetated Max Diff (cm) | |
| 1-cm | ≤ 1.0 | As found | ≤ 0.6 | ≤ 0.8 | ≤ 1.6 |
| 2.5-cm | ≤ 2.5 | As found | ≤ 1.5 | ≤ 2.0 | ≤ 4.0 |
| 5-cm | ≤ 5.0 | As found | ≤ 3.0 | ≤ 4.0 | ≤ 8.0 |
| 10-cm | ≤ 10.0 | As found | ≤ 6.0 | ≤ 8.0 | ≤ 16.0 |
| 15-cm | ≤ 15.0 | As found | ≤ 9.0 | ≤ 12.0 | ≤ 24.0 |
| 20-cm | ≤ 20.0 | As found | ≤ 12.0 | ≤ 16.0 | ≤ 32.0 |
| 33.3-cm | ≤ 33.3 | As found | ≤ 20.0 | ≤ 26.7 | ≤ 53.3 |
| 66.7-cm | ≤ 66.7 | As found | ≤ 40.0 | ≤ 53.3 | ≤ 106.7 |
| 100-cm | ≤ 100.0 | As found | ≤ 60.0 | ≤ 80.0 | ≤ 160.0 |
| 333.3-cm | ≤ 333.3 | As found | ≤ 200.0 | ≤ 266.7 | ≤ 533.3 |
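The Table 6 rows all derive from the class value # through fixed factors (0.60, 0.80, and 1.60), as the header row of Table 5 states; a sketch:

```python
def vertical_class_limits(class_cm: float) -> dict[str, float]:
    """Limits for an '#-cm' vertical accuracy class, per the pattern in
    Tables 5 and 6 (VVA is reported 'as found')."""
    return {
        "nva_rmse_v_max": class_cm,               # NVA RMSEv <= #
        "within_swath_max_diff": 0.60 * class_cm, # smooth-surface precision
        "swath_to_swath_rmsd_z": 0.80 * class_cm, # non-vegetated RMSDz
        "swath_to_swath_max_diff": 1.60 * class_cm,
    }

print(vertical_class_limits(10.0))
# matches the 10-cm row: <= 10.0, <= 6.0, <= 8.0, <= 16.0
```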
7. The standards introduced horizontal accuracy estimation for elevation data
- For Photogrammetric elevation data, the horizontal accuracy equates to the horizontal accuracy class that would apply to planimetric data or digital orthoimagery produced from the same source imagery, using the same aerial triangulation/INS solution.
- For Lidar elevation data: use the following formula:
Table 7 lists some horizontal accuracy values for lidar data based on the previous formula (the GNSS horizontal accuracy is assumed to be 0.10 m, and the IMU error is assumed to be 10.0 arc-seconds for roll and pitch and 15.0 arc-seconds for heading).
| Flying Height (m) | GNSS Error (cm) | IMU Roll/Pitch Error (arc-sec) | IMU Heading Error (arc-sec) | RMSEH (cm) |
|---|---|---|---|---|
| 500 | 10 | 10 | 15 | 10.7 |
| 1,000 | 10 | 10 | 15 | 12.9 |
| 1,500 | 10 | 10 | 15 | 15.8 |
| 2,000 | 10 | 10 | 15 | 19.2 |
| 2,500 | 10 | 10 | 15 | 22.8 |
| 3,000 | 10 | 10 | 15 | 26.5 |
| 3,500 | 10 | 10 | 15 | 30.4 |
| 4,000 | 10 | 10 | 15 | 34.3 |
| 4,500 | 10 | 10 | 15 | 38.2 |
| 5,000 | 10 | 10 | 15 | 42.0 |
8. The Standards Introduced a Formal Accuracy Testing Statement:
For the first time, the new standards provide users with formal data evaluation statements to be used by the data users and data producers. The following statements are examples of the accuracy statement of an elevation dataset:
8.1 Accuracy Reporting by Data User or Consultant
This type of reporting should only be based on a set of independent checkpoints. The positional accuracy of digital orthoimagery, planimetric data, and elevation data products shall be reported in the metadata in one of the manners listed below. For projects with NVA and VVA requirements, two three-dimensional positional accuracy values should be reported based on the use of NVA and VVA, respectively.
8.1.1 Accuracy Testing Meets ASPRS Standard Requirements
If testing is performed using a minimum of thirty (30) checkpoints, accuracy assessment results should be reported in the form of the following statements:
Reporting Horizontal Positional Accuracy
“This data set was tested to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEH horizontal positional accuracy class. The tested horizontal positional accuracy was found to be RMSEH = __(cm)”.
Reporting Vertical Positional Accuracy
“This data set was tested to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEV vertical accuracy class. NVA accuracy was found to be RMSEV = __(cm). VVA accuracy was found to be RMSEV = __(cm).”
Reporting Three-Dimensional Positional Accuracy
“This data set was tested to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a ___ (cm) RMSE3D three-dimensional positional accuracy class. The tested three-dimensional accuracy was found to be RMSE3D = ___(cm).”
8.1.2 Accuracy Testing Does Not Meet ASPRS Standard Requirements
If testing is performed using fewer than thirty (30) checkpoints, accuracy assessment results should be reported in the form of the following statements:
Reporting Horizontal Positional Accuracy
“This data set was tested as required by ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023). Although the Standards call for a minimum of thirty (30) checkpoints, this test was performed using ONLY __ checkpoints. This data set was produced to meet a ___(cm) RMSEH horizontal positional accuracy class. The tested horizontal positional accuracy was found to be RMSEH = ___(cm) using the reduced number of checkpoints.”
Reporting Vertical Positional Accuracy
“This data set was tested as required by ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023). Although the Standards call for a minimum of thirty (30) checkpoints, this test was performed using ONLY __ checkpoints. This data set was produced to meet a ___(cm) RMSEV vertical positional accuracy class. The tested vertical positional accuracy was found to be RMSEV = ___(cm) using the reduced number of checkpoints.”
Reporting Three-Dimensional Positional Accuracy
“This data set was tested as required by ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023). Although the Standards call for a minimum of thirty (30) checkpoints, this test was performed using ONLY __ checkpoints. This data set was produced to meet a ___(cm) RMSE3D three-dimensional positional accuracy class. The tested three-dimensional positional accuracy was found to be RMSE3D = ___(cm) using the reduced number of checkpoints.”
8.2 Accuracy Reporting by Data Producer
In most cases, data producers do not have access to independent checkpoints to assess product accuracy. If rigorous testing is not performed by the data producer due to the absence of independent checkpoints, accuracy statements should specify that the data was “produced to meet” a stated accuracy. This “produced to meet’’ statement is equivalent to the “compiled to meet” statement used by prior Standards when referring to cartographic maps. The “produced to meet’’ statement is appropriate for data producers who employ mature technologies and who follow best practices and guidelines through established and documented procedures during project design, data processing, and quality control. However, if enough independent checkpoints are available to the data producer to assess product accuracy, it will do no harm to report the accuracy using the statement provided in section 4.1 above.
If not enough checkpoints are available, but the data producer has demonstrated that they are able to produce repeatable, reliable results and thus able to guarantee the produced-to-meet accuracy, they may report product accuracy in the form of the following statements:
Reporting Horizontal Positional Accuracy
“This data set was produced to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEH horizontal positional accuracy class.”
Reporting Vertical Positional Accuracy
“This data set was produced to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEV vertical accuracy class.”
Reporting Three-Dimensional Positional Accuracy
“This data set was produced to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a ___ (cm) RMSE3D three-dimensional positional accuracy class.”
9. The Standards introduced a new accuracy term, the Three-Dimensional Positional Accuracy:
The three-dimensional accuracy standard for any three-dimensional digital data combines horizontal and vertical radial error. RMSE3D is derived from the horizontal and vertical components of error according to the following formula:

RMSE3D = √( RMSEH² + RMSEV² )
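The combination of the horizontal and vertical components, RMSE3D = √(RMSEH² + RMSEV²), is a simple root-sum-of-squares; a sketch (the function name is mine):

```python
import math

def rmse_3d(rmse_h: float, rmse_v: float) -> float:
    """Three-dimensional positional accuracy from the horizontal and
    vertical RMSE components (same units for both)."""
    return math.hypot(rmse_h, rmse_v)   # sqrt(RMSEH^2 + RMSEV^2)

# e.g., RMSEH = 7.1 cm and RMSEV = 5.0 cm
print(round(rmse_3d(7.1, 5.0), 2))
```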
10. The Standards introduced a new approach for assessing product accuracy by factoring in the accuracy of the surveyed checkpoints when computing product accuracy:
As we produce more and more accurate products, the errors in the surveying techniques used for the checkpoints that assess product accuracy, although small, can no longer be neglected, and they should be represented in the computed product accuracy. Currently, we quantify product accuracy while ignoring the errors in the surveyed checkpoints. In such practice, our surveying techniques approximate the datum, i.e., they produce a pseudo datum, and we are therefore evaluating the closeness of the data to the pseudo datum and not to the true datum. The following figure illustrates the current practice and the new one proposed in Edition 2 of the ASPRS standards.

Currently, we model the error as follows:

RMSE(product) = RMSE computed from the product-to-checkpoint differences alone

The proposed method factors in the accuracy of the checkpoint survey:

RMSE(product) = √( RMSE(product vs. checkpoints)² + RMSE(checkpoint survey)² )
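A sketch of this adjustment with illustrative numbers; combining the two terms in quadrature assumes the two error sources are independent:

```python
import math

rmse_vs_checkpoints_cm = 5.0     # RMSE of product measured against checkpoints
rmse_checkpoint_survey_cm = 2.0  # accuracy of the checkpoint survey itself

# Fold the checkpoint survey error into the reported product accuracy
rmse_product_cm = math.hypot(rmse_vs_checkpoints_cm, rmse_checkpoint_survey_cm)

print(round(rmse_product_cm, 2))  # ~5.39 cm, slightly larger than 5.0 cm
```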
Best Practices in Determining Product Accuracy*
- Check data should not be used in calibrating the tested products:
- Totally independent checkpoints
- Check data must be more accurate than the tested data:
- Two times more accurate
- Check data must be well distributed around the project area:
- Check data must be a valid statistical sample:
- Minimum of 30 checkpoints for orthos
- Minimum of 30 checkpoints for elevation data
* according to the ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2, v2 (2024)
The new ASPRS Standards and number of checkpoints
The new standards provide Table 8 with the recommended number of checkpoints for validating product accuracy. A minimum of 30 checkpoints should be used to assess the vertical or horizontal accuracy of a product. For project areas larger than 10,000 square kilometers, 120 checkpoints are sufficient.
| Project Area (Square Kilometers) | Total Number of Checkpoints for NVA |
|---|---|
| ≤1000 | 30 |
| 1001-2000 | 40 |
| 2001-3000 | 50 |
| 3001-4000 | 60 |
| 4001-5000 | 70 |
| 5001-6000 | 80 |
| 6001-7000 | 90 |
| 7001-8000 | 100 |
| 8001-9000 | 110 |
| 9001-10000 | 120 |
| >10000 | 120 |
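The table's pattern (30 checkpoints up to 1,000 km², plus 10 per additional 1,000 km², capped at 120) can be sketched as a small helper; this function is illustrative, not part of the Standards:

```python
import math

def nva_checkpoints(area_km2: float) -> int:
    """Recommended NVA checkpoint count following the Table 8 pattern:
    30 points up to 1,000 km2, +10 per additional 1,000 km2, capped at 120."""
    if area_km2 <= 1000:
        return 30
    return min(30 + 10 * math.ceil((area_km2 - 1000) / 1000), 120)
```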
Elevation Data Quality Versus Positional Accuracy
When modeling terrain with lidar, it is important to be aware of the difference between elevation data quality and positional accuracy. In many instances, users of lidar data focus solely on point cloud accuracy as specified by sensor manufacturers, but an accurate lidar point cloud does not necessarily result in accurate modeling of the terrain, nor will it create accurate volumetric calculations: elevation data must also faithfully represent the terrain detail. Therefore, users should also consider point density as it relates to terrain roughness or smoothness, as this is an equally important aspect of accurate terrain modeling.
Terrain modeling methodologies (e.g., polygon-based Regular Triangulated Networks (RTNs) versus Triangulated Irregular Networks (TINs) versus Voxel-Based Networks) also affect the terrain model quality. Terrain analysis is sensitive to whether the software represents the point cloud as a TIN, a gridded surface, or an RTN. Methods that involve gridding the data are sensitive to grid cell size (post spacing). Note that lidar point density is an important factor when choosing grid cell size.
The Figure below illustrates the relationship between terrain roughness and point density. While the point cloud in this example may have a vertical accuracy of RMSEV = 10-cm, TIN interpolation based on surrounding areas of low point density places the vertical position of point A at point A’, resulting in a vertical error of 2 meters in this example. The remedy is to obtain the point cloud at a higher density so that it more accurately represents the terrain detail. Attempting to use a low-density point cloud to represent terrain with high frequencies of undulation will result in inaccurate volume estimations, regardless of what software or modeling algorithms are used. Smoother terrain may be adequately represented with a lower density point cloud. Very smooth or flat terrain can be accurately modeled using a point cloud with nominal post spacing (NPS) of a few meters or coarser.
The Nyquist-Shannon sampling theorem, which is well-known and widely used in signal processing, may be used to determine the point density required to accurately represent the project terrain. According to the Nyquist-Shannon sampling theorem, if a signal x(t) contains no frequencies higher than B Hz, then a sampling rate of greater than 2B samples per second (or 2B Hz) will be needed in order to reconstruct the original signal without aliasing.
For example, let us assume that the undulation rate of the terrain represents the highest frequency of the signal to be modeled, and the nominal point spacing represents the sampling rate needed to model the terrain without aliasing. If we want to accurately model rocky terrain where the spikes caused by these rocks appear every 30 cm on average, the nominal point spacing of the lidar data used to model this terrain should be less than 15 cm.
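The worked example above can be sketched as a one-line rule (the function name is illustrative):

```python
def max_nominal_point_spacing(feature_wavelength_m: float) -> float:
    """Nyquist-style bound: to capture terrain detail that repeats every
    feature_wavelength_m, the nominal point spacing must be finer than
    half that wavelength."""
    return feature_wavelength_m / 2.0

# Rocks spiking every 30 cm -> nominal point spacing must be finer than 15 cm
```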
Summary and Final Tasks
Summary
Congratulations! You have just completed Lesson 8. You may have noticed from the different sections of the lessons that the UAS market is growing rapidly. There are quite a few manufacturers for the civilian UAS, as well as software and sensor producers. User requirements will drive the selection process for the UAS and the processing software that is right for the job. Required UAS endurance, range and payload capacity will be different from one application to another. However, most applications will prefer more endurance, longer range, and heavier payload if the price is right.
In this lesson, you also learned about the value of evaluating data quality and accuracy and how to use the new ASPRS standards to report such quality and accuracy factors.
By now, you should be finishing generating the orthophoto and digital elevation model products using Pix4D and the sample imagery. Samples of the products need to be submitted with your project report and presented next week during your presentation.
Final Tasks
| 1 | Study Lesson 8 materials and the textbook chapters assigned to the lesson |
|---|---|
| 2 | Complete Lesson 8 Quiz |
| 3 | Submit your COA Application |
| 4 | Complete your discussions for the assignment on "FAA Road map" |
| 5 | Complete your discussions for the assignment on "Differences Between Rules and Regulations" |
| 6 | Attend the weekly call on Thursday evening at 8:00pm ET |
Lesson 9: Civilian and Commercial Applications of the Unmanned Aerial System
Digital Image Classification for Land Use Land Cover Assessment (LULC)
Digital Image Classification is an information extraction process (machine or automated interpretation) that involves the application of pattern recognition theory to multispectral imagery. It analyzes the spectral properties of various surface features (e.g., crops) in a multiband image and sorts the spectral data into spectrally related categories by the use of predefined numerical decision rules.
The process involves:
- Categorizing images into different surface materials or conditions
- Collection of spectral signatures for specific surface materials
- Based upon spectral response
- Assumes unique spectral signatures exist for land covers
- Trains the computer to recognize those spectral signatures
- Statistical operation
- No direct site or situation information
- Does not rely on visual interpretation
- Not necessarily more accurate or objective than visual interpretation
- Someone must decide the classes and whether signatures are accurate or not
- Signature extraction
- Unsupervised classification using statistics or clustering algorithm
- Supervised classification using training sites
- Poor signatures lead to poor results
- Possibly stratify broad classes first
- Classify the imagery
- Spatial filtering for GIS compatibility
- Accuracy assessment
The process utilizes one or more of the following recognition types:
- Spectral pattern recognition: When decision rules are based on spectral radiance characteristics of the scene.
- Spatial pattern recognition: When decision rules are based on geometric characteristics of the scene (e.g., shape, size, patterns)
- Temporal pattern recognition: uses time as an aid in feature identification
- Object-oriented classification: involves the combined use of both spectral and spatial recognition
Among the difficulties usually encountered with this technique are the following:
- Signature is not unique for given sensor characteristics
- Signature too unique
- Same cover type but with distinct differences
- Soil moisture, surface material, atmospheric conditions
- Mixed pixels
- Signature extension issue (over space and time)
There are two types of image classification algorithms:
- Unsupervised Classification: uses an automatic clustering algorithm that analyzes the “unknown” pixels in the database and divides them into several spectrally distinct classes based upon their natural groupings or clusters.
In the unsupervised process, a user directs a computer software package to automatically identify and categorize pixels in an image. This is done on a purely statistical basis, though the user has control over the number of statistical classes, or clusters, to be created. With different classification algorithms, the user will also have control over statistical parameters, such as how much variation is permitted in a single class.
While there are no set rules on how many classes should be defined, a general rule of thumb is that the classes should total three times the number of final land cover categories sought. This allows for the possibility of different spectral signatures pertaining to the same land cover type (for example, if forest is sought as a class, deciduous and coniferous forests may require more than one spectral signature to accurately categorize them as forest). A number of these classes will likely represent meaningless categories or mixed pixels that may then be thrown out at a later point in the process.
Once a set of signatures has been defined, they may then be used to classify the entire image. Pixels with statistical characteristics similar to those in the signature set will be assigned the appropriate class. In the resulting thematic layer, every pixel is assigned a value representing the signature that best represents it (Figure 8.1). This data set is then evaluated by the user to determine what land cover type each class represents.
Processing Steps:
a - clustering or grouping
b – coloring
c – identification
Figure 8.1: Clustering process in unsupervised classification
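The natural-grouping step behind unsupervised classification can be sketched with a minimal Lloyd's k-means on a toy two-band "image" using numpy; this illustrates the clustering idea, not the exact algorithm any particular package implements, and the pixel values are hypothetical:

```python
import numpy as np

def kmeans(pixels: np.ndarray, centers: np.ndarray, iters: int = 10):
    """Minimal Lloyd's k-means: alternate assigning each pixel to its nearest
    spectral center, then re-averaging each cluster. centers is the initial guess."""
    for _ in range(iters):
        # distance of every pixel to every center, then nearest-center labels
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([pixels[labels == k].mean(axis=0)
                            for k in range(len(centers))])
    return labels, centers

# Toy 2-band image: two spectrally distinct groups of pixels
pixels = np.array([[0.10, 0.20], [0.12, 0.22], [0.11, 0.19],
                   [0.80, 0.90], [0.82, 0.88], [0.79, 0.91]])
labels, centers = kmeans(pixels, centers=pixels[[0, 3]])
```

The user would then inspect each resulting cluster and decide which land cover type it represents, as described above.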
Advantages and Disadvantages of Unsupervised Classification
Pros:
+ no extensive prior knowledge of the region required
+ opportunities for human error minimized
+ unique classes are recognized as distinct units
+ logistically less cumbersome
Cons:
- natural groupings do not necessarily correspond nicely with desired information classes
- no control over the menu of classes and their specific id
- spectral properties of informational classes vary over time; relationships between information and spectral classes change, making it difficult to compare unsupervised classes from one image/date to another
- Supervised Classification: The process of using samples of known informational classes (training sets) to classify pixels of unknown identity. Identification and delineation of training areas is key to successful implementation. The basic strategy in supervised classification is to sample areas of known cover types to determine representative spectral values of each cover type. Such samples are called training areas or training fields. Training fields, or spectral signatures, are established from homogeneous cover type areas through:
- Map digitizing - transfer photo information to base map (use table digitizer)
- On-screen digitizing, Figure 8.2
- Seed-pixel approach
The main steps in supervised classification are the following:
- Training Class Selection
- Generation of statistical parameters to train the classification algorithm, such as:
- class means
- standard deviations
- covariance matrices
- correlation matrices
- Data Classification: Assigning each pixel of the data to one of the training classes
- Evaluation and Refinement
- Documentation: Maps and tabular summaries
Additional Remarks:
- With supervised classification, the process relies on user input to identify areas of specific cover types and to apply a classification algorithm that then utilizes that information to find the same cover type in other regions of the image. This process often involves having specific information on ground conditions collected through fieldwork or through high-resolution imagery/aerial photography. The sites of interest are referred to as training or calibration sites: essentially, the portions of the image that will be used to identify other portions of the image with similar characteristics. A second set of “known” sites may be reserved for use in accuracy assessment. These are often referred to as validation or truth sites.
- Statistics for training sites are used to define a spectral signature for specific cover types of interest. A range of supervised classification processes exist but the Maximum Likelihood classifier is one of the most common.
- Each pixel in an image is compared to the statistics compiled for the different signatures. The classification algorithm determines on a probability basis the likelihood that any given pixel should be assigned to any given class. The class providing the highest likelihood is the one to which the pixel is assigned.
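The Maximum Likelihood assignment described above can be sketched as follows with numpy; the training pixels and class names are hypothetical, and a production classifier would differ in detail:

```python
import numpy as np

def train_signatures(training):
    """Per-class mean and covariance (the spectral signature) from training-site pixels."""
    sigs = {}
    for name, pix in training.items():
        pix = np.asarray(pix, dtype=float)
        mean = pix.mean(axis=0)
        # small regularization keeps the covariance invertible
        cov = np.cov(pix, rowvar=False) + 1e-6 * np.eye(pix.shape[1])
        sigs[name] = (mean, np.linalg.inv(cov), np.log(np.linalg.det(cov)))
    return sigs

def classify(pixel, sigs):
    """Assign the class with the highest Gaussian log-likelihood for this pixel."""
    best, best_ll = None, -np.inf
    for name, (mean, cov_inv, log_det) in sigs.items():
        d = np.asarray(pixel, dtype=float) - mean
        ll = -0.5 * (d @ cov_inv @ d) - 0.5 * log_det
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Hypothetical 2-band training pixels for two cover types
training = {
    "water":  [[0.05, 0.02], [0.06, 0.03], [0.04, 0.02], [0.05, 0.04]],
    "forest": [[0.10, 0.40], [0.12, 0.42], [0.11, 0.38], [0.09, 0.41]],
}
sigs = train_signatures(training)
```

Running `classify` over every pixel of an image would produce the thematic layer discussed above.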
Good Strategy for Supervised Classification
- Number of pixels - to statistically characterize the spectral properties of an informational class (i.e., forest, crop, water), you should have >= 100 pixels total per informational class
- Location - geographically dispersed, with boundaries away from edge/mixed pixels
- Number of areas - depends on the number of information categories; 10 at a minimum, enough for accuracy assessment and incorporation of spectral subclasses
- Uniformity - unimodal distributions; use training areas to characterize means, variances, and covariances - sometimes not easy due to the spectral variation present
Read more on Digital Image Classification.
To Read
To Do
- Submit materials for Digital Image Classification
Lesson 9 Introduction
Welcome to Lesson 9! In this lesson, you will become familiar with the different applications the UAS is utilized for. The list of commercial and civilian applications grows by the day, and it is difficult, if not impossible, to nail down a complete list. The low cost and easy deployment of the UAS have encouraged many people to replace manned aircraft with unmanned aircraft for their activities. Users are discovering new applications every day; however, in this lesson we will cover only the most obvious ones. We will not cover military applications, but we will, for the purpose of this lesson, consider the security and surveillance use of UAS as a civilian/commercial application, since some such services are offered commercially. Much of the commercial and scientific use of UAS that concerns us is in the field of geospatial data acquisition for remote sensing activities. The term “geospatial data” refers to any dataset that is referenced spatially (i.e., geolocated or geo-referenced) with a known coordinate system and datum. In this lesson, I expect you to read Chapter 6 of the textbook Introduction to Unmanned Aircraft Systems and several external readings I will point out in the lesson notes.
Lesson Objectives
At the successful completion of this lesson, you should be able to:
- recognize different applications of the UAS for civilian use;
- understand how the UAS data is used for different applications;
- compose a list of additional applications that can be served by UAS.
Lesson Readings
Course Textbooks
- Chapter 4 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Chapter 20 of the textbook: Elements of Photogrammetry with Applications in GIS, 4th edition
- Chapters 8 to 19 of the textbook: Fundamentals of capturing and processing drone imagery and data
Web Articles
- Gahran, A. “Fighting fire with data, spacecraft, drones"
Google Drive (Open Access)
- Chao, H., et al., "AggieAir: Towards Low-cost Cooperative Multispectral Remote Sensing Using Small Unmanned Aircraft Systems"
- Read the dronenodes.com article Commercial Drone Applications On The Rise | Enterprise UAV Solutions | 2018
- Read the lecture slides on Digital Image Classification
Lesson Activities
- Study lesson 9 materials on CANVAS/Drupal and the text books chapters assigned to the lesson
- Submit your Final Project Report and Presentation Slides
- Start your first post for the discussion on "The UAS and Ethics"
- Submit materials for exercise 3 - Digital Image Classification
- Attend the weekly call on Thursday evening at 8:00pm ET
- Watch the webinar: " Applying Drones to Surveying and Engineering Projects Today."
The Different Applications of the UAS
In this section, you will become familiar with the different civilian and commercial applications of the UAS as they stand today. The UAS applications that concern us most are remote sensing applications, where the UAS replaces manned aircraft as an acquisition platform. Remote sensors such as cameras and LiDAR systems have shrunk in size and weight to make them more suitable for lightweight small UAS, as mentioned in the Payload section of Lesson 2. Remote sensing applications derived from sensors onboard a UAS are more or less similar to those one can expect from a manned system. Manned aircraft can carry a larger and heavier payload, which opens the door for additional applications that require large sensors such as IFSAR. Reported applications for the UAS include the following:
- Remote Sensing Applications
- Precision Agriculture: Precision agriculture is the most widely used civilian application of the UAS. Farmers and the agricultural community are very optimistic about the prospect of using UAS for their daily activities. The following articles should provide you with a fairly decent idea of the topic:
- Range Land Management
- Landslides Research: Engineers are using UAS for land monitoring and management. This field is also witnessing a promising future with the use of the UAS for their daily repetitive monitoring activities.
- Ocean and coastal Research
- Contaminant Spills and Pollution
- Landfill Mapping and monitoring
- Engineering and Surveying
- Corridor Mapping
- Mining site mapping
- Crop and aquaculture farm monitoring
- Mineral exploration
- Spectral and thermal analysis
- Critical infrastructure monitoring, including power facilities, ports, and pipelines
- Commercial photography, aerial mapping and charting, and advertising
- Disaster response, including search and support to rescuers, in situations such as:
- fires,
- floods and hurricanes,
- landslides
- Medical Supplies Delivery
- Traffic monitoring, and
- Other environmental control and monitoring.
- General Applications and Services
- media resources
- security awareness
- communications and broadcast, including news/sporting event coverage
- cargo transport
Details on some of these applications are given in chapter 6 of the textbook and the assigned readings listed below. Try to visit the UAV Applications section of this site, as it has interesting information about different aspects of the UAS and its applications. Another way to explore potential applications of UAS-derived products is to look into the different applications of Geographic Information Systems (GIS), as they are closely related. In this regard, Esri has published a good educational overview on its website highlighting the different applications of GIS.
To Read
- Chapter 4 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
- Chapter 20 of the textbook: Elements of Photogrammetry with Applications in GIS, 4th edition
- Chapters 8 to 19 of the textbook: Fundamentals of capturing and processing drone imagery and data
To Do
Watch the webinar: "Applying Drones to Surveying and Engineering Projects Today"
UAS for Disaster Response
In this section, you will become familiar with a widely used application of the UAS: the UAS for disaster response.
One of the most widely utilized applications of the UAS is in disaster response situations. The UAS is particularly useful for tasks that involve one or all of the 3 Ds -- dirty, dangerous, and dull:
Dirty: open to interpretation and to the operational environment, but best described by flying over oil, nuclear, or gas installation sites where accidents have occurred, such as the Japanese Fukushima Daiichi nuclear plant, to take air samples or imagery.
Dangerous: refers clearly to situations where a pilot in a similar mission could become a casualty due to dangerous operations.
Dull: is when repetitive tasks are required over and over again. An example of the dull mission is border surveillance and maritime patrols that need eyes in the sky for hours at a time.
For the UAS to suitably serve disaster response, it needs more capabilities besides its adaptation to the 3 Ds. Such capabilities are defined by survivability, durability, and adaptability.
Survivability: Survivability of a UAS in a disaster response scenario relies on an efficient system of communications. A UAS search and rescue mission should maintain three forms of communications:
- communication between the UAS operator and the UAS;
- communication between the operator and the victims on the ground;
- communication between other rescue ground machines and their teams.
Durability: The system's ability to survive a harsh or unpredictable operating environment, with hazards such as falling debris, a changing environment, and loss of signal. Designers of UAS operations in such environments usually rely on multi-level UASs. An example is the use of a High Altitude Long Endurance (HALE) UAS to carry equipment, provide a backup communication link, and provide a high-altitude overview of the site to plan emergency exit routes.
Adaptability: The ability of a mini-UAS, with its small size, to overcome fallen debris and navigate unpredictably narrow spaces while maintaining its ability to sense changes in an unpredictable and uncertain environment.
As examples of the use of UAS for disaster response, we will single out the UAS use for forest fire disasters.
UAS for Forest Fires:
Remote sensing techniques have proven to be very effective in mapping and monitoring fires and in giving feedback to first responders. Satellite remote sensing has limited capabilities in supporting fire response, because most available satellites have limited spatial resolution (limited detail) and only occasionally orbit over the fire site, while fire monitoring needs continuous (24/7) coverage. Satellite imagery can still be useful for monitoring fires on a regional or national level, but not on a fire-front micro level. Thermal imagery from the MODIS sensors on board the Terra and Aqua satellites, with a resolution of 1 km, was used by the U.S. Department of Agriculture Forest Service Active Fire Mapping Program to monitor regional fires across the U.S. Besides the coarse resolution of its imagery, MODIS passes over any given location only twice daily, which is too infrequent to track the evolution of a fire and to support firefighters in real time.
As an alternative to satellite imagery, aerial imagery from manned and unmanned aircraft is frequently used to provide the needed frequent aerial observations of a fire. Two approaches have been utilized in using the UAS for fire monitoring. The first uses a High Altitude Long Endurance (HALE) UAS, which can fly high and provide imagery with better resolution and better frequency than satellites. However, a HALE UAS is expensive to procure and to maintain.
The second approach uses fleets of small UAS working cooperatively to provide more detailed information on the fire and its perimeter. In some cases, both approaches are utilized together with the HALE providing an overview image of the fire while small UASs are used to transmit high definition imagery in real time for the perimeter areas of the fire.
Here in the U.S., several wildfire monitoring programs have been adopted over the years. An example of such programs is the joint cooperation between NASA, General Atomics Aeronautical Systems, Inc., and various government agencies involved in fire research. The project used the General Atomics ALTUS II UAS, which is the civilian version of the Predator. Among the sensors in the ALTUS II payload was a thermal multispectral scanner. Imagery was transmitted to the ground station through INMARSAT geostationary satellites. Once the imagery is received at the ground station, it goes through geo-referencing and ortho-rectification processes, which convert it to a geo-referenced map before it goes into the hands of the field team. NASA published images (Figure 8.3) of the Grass Valley/Slide fire near Lake Arrowhead/Running Springs in the San Bernardino Mountains of Southern California, acquired by the thermal-infrared imaging sensors on board NASA's Ikhana unmanned research aircraft. For more information on past NASA collaborative efforts in the field of different applications for UAS, visit UAS Integration in the NAS.

To Read
UAS Challenges in Certain Applications
In this section, we will discuss operational challenges in using the UAS for certain applications.
So far, we have read and discussed materials about the successful utilization of unmanned aircraft for a variety of applications. However, some of these applications have proven challenging for several reasons, including the following:
- The FAA hesitates to allow UASs to fly during natural disaster situations such as floods and hurricanes, mainly because a UAS operating during a storm lacks alternative communications capabilities. During storms, air traffic control capabilities in the affected area are usually limited, risking the safety of the UAS, which usually operates without sense-and-avoid instruments.
- The UAS offers many advantages over conventional methods of traffic monitoring and transportation planning for police, emergency responders, and DOTs. A UAS can move from one location to another faster and is not restricted to the specific routes used by ground vehicles. In addition, a UAS can fly through hazardous or inclement weather conditions. However, UAS traffic monitoring is challenged in urban canyon areas, where visibility of the traffic on the ground is obscured by high-rise buildings.
- Small UASs cannot maintain their flight routes during stormy conditions; the light weight of the UAS makes it vulnerable to gusty winds.
- Here in the U.S., it is difficult to obtain proper FAA approval to fly civilian projects whenever there are people in the project area, even after the issuance of Part 107. Such restrictions are expected to diminish in the future as the FAA continues its efforts to integrate the UAS into the NAS.
To Read
- Chapter 6 of the textbook: Introduction to Unmanned Aircraft Systems, 2nd edition
Summary and final tasks
Summary
Congratulations! You have just finished Lesson 9, Civilian and Commercial Applications of the Unmanned Aerial System. You may notice that the use of UAS for civilian applications extends to almost any application offered by manned aircraft. In fact, the UAS provides more opportunities than manned aircraft: its small maneuvering size and low-cost operation make it more useful and more affordable, particularly for small projects and projects that may involve hazardous operational conditions. UAS applications are expanding, and we hear about new applications every day. Amazon, for example, recently unveiled plans for a UAV package delivery service. What do you think is the coolest application that the UAS should be used for, and that no one has thought about until now? Post your opinion in the discussion forum.
Final Tasks
| 1 | Study lesson 9 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson |
|---|---|
| 2 | Complete quiz 9 |
| 3 | Submit your Final Project Report and Presentation Slides |
| 4 | Start your first post for the discussion on "The UAS and Ethics" |
| 5 | Submit materials for exercise 3 - Digital Image Classification |
| 6 | Attend the weekly call on Thursday evening at 8:00pm ET |
Lesson 10: UAS Safety and Privacy Concerns
Privacy Concerns
Some American citizens, including lawmakers, believe that before Unmanned Aerial Vehicles (UAVs) start routinely observing Americans from above, the Federal Aviation Administration (FAA) needs to address two key concerns: safety and privacy. The issue of privacy is perceived differently depending on who is evaluating the UAS's impact. Many of us believe that flying a UAS with a camera is no different from flying a helicopter during a mapping mission. However, others see it as a clear invasion of their personal privacy, and they want the FAA to curb the use of the UAS in populated areas. During 2013, the Senate Judiciary Committee held a hearing on the UAV/UAS issue, where it was very clear that senators of both parties are worried about the threat to Americans’ privacy posed by the increasing use of unmanned aerial systems (UASs). The article "Lawmakers voice concerns on drone privacy questions," published by NBC, details the outcome of the discussions that occurred during that hearing.
Safety and Security Concerns
Integrating small UAS into the NAS raises concerns, as unmanned airplanes pose a risk of mid-air collision with other aircraft or of causing property damage or loss of life on the ground. Many people and lawmakers also have security concerns, as they believe that drones can be hacked and used for terrorist acts. The report "Unmanned Aerial Vehicles: Examining the Safety, Security, Privacy and Regulatory Issues of Integration into U.S. Airspace" provides fairly good detail on the issues of UAS privacy, security, and safety.
PART 107 The Good News
The burden of concerns over privacy was lessened with the issuance of Part 107 by the FAA, as it contained no clause regulating the use of UAS as it relates to privacy. It was the right move by the FAA, as the topic is controversial and there is no easy solution for it. However, things may change in the future, as the "FAA Reauthorization Act of 2018," authorized by Congress during its 115th session on April 13, 2018, brought privacy issues back to the table by mandating that the FAA carry out a review to identify any potential reduction of privacy specifically caused by the integration of unmanned aircraft systems into the national airspace system.
Activities
The UAS is capable of collecting very high definition/resolution imagery of people's backyards and perhaps through windows. The public in the United States has expressed two main opinions about allowing UAS to fly over populated areas, especially when used for surveillance and search and rescue missions.
Discussion Forum
- Review the article "Concerns Over Drone Aircraft Used in the United States; Safety and Privacy Still Concerns, GAO Reports" that discusses the privacy and safety issues of using UAS.
- Read the report "Unmanned Aerial Vehicles: Examining the Safety, Security, Privacy and Regulatory Issues of Integration into U.S. Airspace"
- Read the article "Implications of Drones on American Privacy and Freedom" by Mike Tully.
Lesson Readings
Course Textbooks
- Section 1.8 of chapter 1 of the textbook: Introduction to UAV Systems (Aerospace Series), 5th edition
- Chapter 16 of the textbook: Introduction to the Unmanned Aircraft Systems, 2nd edition
Lesson Activities:
- Study lesson 10 materials on CANVAS/Drupal and the textbook chapters assigned to the lesson
- Complete your discussions for the assignment on "The UAS and Ethics"
- Submit your materials for exercise 4
- Complete the final comprehensive quiz, which opens on... (check schedule on CANVAS)
- Present your final project to class on... (check schedule on CANVAS)






