Reinforcement Learning for Autonomous Drone Navigation
DOI: https://doi.org/10.36676/irt.2023-v9i5-002
Keywords: valuable in security and law, capability, perspectives, autonomously
Abstract
Drone navigation is the process of controlling the movement and flight path of unmanned aerial vehicles (UAVs). It encompasses the hardware and software systems that enable drones to navigate and maneuver autonomously or under the guidance of a human operator. The utility of drone navigation is vast and varied, making it a critical component in numerous industries and applications. First, drone navigation plays a crucial role in aerial surveillance and reconnaissance. Drones equipped with advanced navigation systems can efficiently patrol large areas, monitor activities, and gather real-time data from multiple perspectives. This capability is particularly valuable in security and law enforcement operations, disaster response, and environmental monitoring, where access and visibility might otherwise be limited.
Drone navigation is equally important in aerial mapping and surveying. Drones outfitted with GPS and other positioning technologies can capture imagery and collect data with pinpoint accuracy, enabling the creation of detailed 3D maps, topographic models, and land surveys. This allows more precise, faster, and more cost-effective data collection for urban planning, agriculture, infrastructure inspection, and construction projects. Furthermore, drone navigation is critical to the transportation and logistics industries. Autonomous drones can fly predetermined paths to deliver goods, medical supplies, and other payloads quickly and efficiently to remote or hard-to-reach regions. This technology has the potential to transform last-mile delivery, especially in rural areas or during emergencies when regular transportation routes are disrupted. A toy illustration of how reinforcement learning can produce such a navigation policy is sketched below.
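The following sketch is only an illustration of the general idea behind reinforcement learning for navigation, not the method described in this article: a tabular Q-learning agent learns to reach a goal cell on a small grid, standing in for a drone flying toward a delivery waypoint. The grid size, reward values, and hyperparameters are illustrative assumptions.

import random

# Minimal sketch (illustrative assumptions): a "drone" navigates a 5x5 grid
# from the start cell (0, 0) to the goal cell (4, 4) using tabular Q-learning.
GRID = 5
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]   # right, left, down, up
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1          # learning rate, discount, exploration
GOAL = (GRID - 1, GRID - 1)

# Q-table: state (row, col) -> one value per action
Q = {(r, c): [0.0] * len(ACTIONS) for r in range(GRID) for c in range(GRID)}

def step(state, action_idx):
    """Apply an action, clip to the grid, and return (next_state, reward, done)."""
    dr, dc = ACTIONS[action_idx]
    nr = min(max(state[0] + dr, 0), GRID - 1)
    nc = min(max(state[1] + dc, 0), GRID - 1)
    next_state = (nr, nc)
    if next_state == GOAL:
        return next_state, 10.0, True    # reached the waypoint
    return next_state, -1.0, False       # small per-move cost encourages short paths

for episode in range(500):
    state, done = (0, 0), False
    while not done:
        # Epsilon-greedy action selection
        if random.random() < EPSILON:
            a = random.randrange(len(ACTIONS))
        else:
            a = Q[state].index(max(Q[state]))
        next_state, reward, done = step(state, a)
        # Q-learning update toward the bootstrapped target
        Q[state][a] += ALPHA * (reward + GAMMA * max(Q[next_state]) - Q[state][a])
        state = next_state

# Greedy rollout of the learned policy (capped to avoid looping forever)
state, path = (0, 0), [(0, 0)]
while state != GOAL and len(path) < 2 * GRID * GRID:
    a = Q[state].index(max(Q[state]))
    state, _, _ = step(state, a)
    path.append(state)
print("Learned path to the goal:", path)

In a real system, the discrete grid would be replaced by the drone's continuous state (position, velocity, sensor readings) and the Q-table by a function approximator such as a neural network, as in deep reinforcement learning approaches to drone navigation.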
License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.