Aerostack2: A Software Framework for Developing Multi-robot Aerial Systems

Miguel Fernandez-Cortizas1*, Martin Molina1,2, Pedro Arias-Perez1,
Rafael Perez-Segui1, David Perez-Saura1, Pascual Campoy1

1 Computer Vision and Aerial Robotics Group (CVAR), Centre for Automation and Robotics (C.A.R.),
Universidad Politécnica de Madrid (UPM-CSIC), 28006 Madrid, Spain
2 Department of Artificial Intelligence,
Universidad Politécnica de Madrid (UPM), 28040 Madrid, Spain
* To whom correspondence should be addressed; E-mail: miguel.fernandez.cortizas@upm.es
Abstract— In recent years, the robotics community has witnessed the development of several software stacks for ground and articulated robots, such as Navigation2 and MoveIt. However, the same level of collaboration and standardization is yet to be achieved in the field of aerial robotics, where each research group has developed its own framework. This work presents Aerostack2, a framework for the development of autonomous aerial robotics systems that aims to address the lack of standardization and the fragmentation of efforts in the field. Built on the ROS 2 middleware and featuring an efficient modular software architecture and multi-robot orientation, Aerostack2 is a versatile and platform-independent environment that covers a wide range of robot capabilities for autonomous operation. Its major contributions include providing a logical level for specifying missions, reusing components and sub-systems for aerial robotics, and enabling the development of complete control architectures. All major contributions have been tested in simulation and in real flights with multiple heterogeneous swarms. Aerostack2 is open source and community oriented, democratizing access to this technology for autonomous drone system developers.
Source code: https://github.com/aerostack2/aerostack2
Documentation: https://aerostack2.github.io/

I. INTRODUCTION

In recent years, the robotics community has witnessed the development of several software stacks focused on the control and guidance of ground robots and articulated robots. Navigation2 [1] and MoveIt [2] are two examples that have gained widespread adoption. However, the same level of collaboration and standardization has not been observed in the field of aerial robotics. Research groups have tended to develop their own frameworks, resulting in isolated efforts that are difficult to integrate.

Furthermore, even when frameworks have been developed, they often have a narrow focus, such as low-level control, which limits their usefulness in more comprehensive applications. This fragmentation of efforts can make it challenging to take advantage of the strengths of each framework in a common application.

To address these challenges, this paper proposes a collaborative framework for aerial robotics that brings together users and developers to work towards a common goal. Building on the work of others, we aim to enhance research in aerial robotics and accelerate its application in industrial fields. This paper presents an overview of our proposed framework and discusses its potential impact in the field.

The presented software framework, Aerostack2, is an evolution of a former one called Aerostack [3], which was developed and successfully used in our research lab for more than six years, not only for research purposes but also for industrial projects and international robotics competitions such as MBZIRC 2020 or IMAV 2017.

Our framework incorporates important changes and improvements to enable more efficient and effective robotic systems. Specifically, the framework operates using ROS 2 (Robot Operating System 2 [4]), a widely used middleware for robotics that provides valuable tools to create modular and distributed robotic systems. Additionally, the framework is built on a more efficient modular software architecture and is oriented towards multi-robot systems. Moreover, the framework aims to foster community creation within the ROS 2 ecosystem, as seen in other widely used software frameworks in the robotics community.

Contributions

Compared to other existing solutions in the drone sector, Aerostack2 is an original tool mainly because of the following characteristics:
• Platform Independence. Aerostack2 is a general environment, not dedicated to specific aerial platforms, so it can be used with different types of drones. For example, Aerostack2 has been used on drones with Pixhawk controllers or with commercial platforms (e.g., DJI).
• Versatility. Aerostack2 covers a wide range of robot capabilities for autonomous operation, with different types of mechanisms related to flight control, spatial localization, aerial navigation planning methods, forms of communication between drones, etc.
• Easy mission specification. Aerostack2 provides a logical level that helps developers formulate the tasks to be done by autonomous aerial robots. Compared to the level provided by ROS 2 programming, this logical level abstracts details (e.g., using the notion of robot behaviors) and simplifies the specification of missions. In addition, an existing robotic system developed using Aerostack2 may be used by operators to formulate aerial missions with the help of mission specification tools (e.g., user interfaces, behavior trees, etc.).
• Simplified engineering process in aerial robot systems. Aerostack2 provides pre-programmed components that provide autonomous operation capabilities, encapsulating specialized algorithms (computer vision, sensor data fusion, automatic planning, motion controllers, etc.). These components are designed in a general and adaptable way to be used in the construction of multiple aerial robots. Individual Aerostack2 components (e.g., a flight motion controller) can be reused to build specific aerial robotic applications. In this case, the developer does not use the complete framework, but only separate components for specific functionalities. Component reuse may be done at different levels of granularity (e.g., simple algorithms, complex robot behaviors, etc.). For example, a developer who designs a new algorithm for a certain aerial robot capability (e.g., a flight motion controller) may test such an algorithm by reusing a partial architecture provided by Aerostack2.
• Open source. The Aerostack2 environment is open source and free of charge, which facilitates universal access to this technology by autonomous drone developers. Aerostack2 is offered under a BSD-3-Clause license that allows free distribution and modification of the software.

II. RELATED WORK

There are numerous flight systems of different nature that allow UAVs to fly. A common division in the literature is between low-level control systems, typically the flight controller, and high-level control systems, also called flight or aerial stacks.

Low-level control systems range from open source hardware (OSH) and open source software (OSS) to proprietary commercial controllers. In 2018, Ebeid et al. presented a survey of open-source hardware and software comparing their main features [5]. Table I shows a comparison of relevant flight controllers.

TABLE I: Comparison of relevant low-level control systems.

Flight Controller   Open Source   Simulation   Rate Input
Pixhawk/PX4         OSH, OSS      SITL, HITL   ✓
Ardupilot           OSS           SITL, HITL   ✓
Paparazzi           OSH, OSS      SITL, HITL   ✓
Crazyflie           OSH, OSS      SITL, HITL   ✓
DJI Matrice         Proprietary   HITL         ✓
Parrot              Proprietary   SITL         ✗
Skydio              Proprietary   -            ✗

OSH: Open Source Hardware, OSS: Open Source Software, SITL: Software In The Loop, HITL: Hardware In The Loop.

Several high-level control systems have been published in recent years. Table II compares existing aerial stacks with the solution proposed in this publication.

Aerostack [3] is a software framework that helps developers design and build the complete control architecture of aerial robotic systems, integrating multiple heterogeneous computational solutions (e.g., computer vision algorithms, motion controllers, self-localization and mapping methods, motion planning algorithms, etc.). Aerostack was developed in our research laboratory using ROS. The experience in its use and the development of successive versions has been an important antecedent for the creation of the new framework presented in this paper.

AerialCore [6] is an aerial system built using ROS Noetic and meant to be executed entirely onboard. It can be deployed on any multi-rotor vehicle, given that it is equipped with a PX4-compatible flight controller, both indoors and outdoors. It supports multi-robot experiments using Nimbro network communication and provides both agile flight and robust control.

Agilicious [7] is a co-designed hardware and software framework tailored to autonomous and agile quadrotor flight, which has been developed and used since 2016 at the Robotics and Perception Group (RPG) of the University of Zurich. It is completely open-source and open-hardware and supports both model-based and neural-network-based controllers. It also provides high thrust-to-weight and torque-to-inertia ratios for agility, onboard vision sensors, GPU-accelerated compute hardware for real-time perception and neural-network inference, a real-time flight controller, and a versatile software stack.

The KumarRobotics flight stack [8] allows a quadrotor to navigate autonomously in cluttered and GPS-denied environments. It consists of a set of modules that work together to allow fast autonomous navigation of an aerial robot through an unknown environment. The system has been designed so that all sensing and computation occur onboard the robot. Once the robot has been launched, no human interaction is necessary for the robot to navigate to the goal.

CrazyChoir [9] is a ROS 2 toolbox that allows users to run simulations and experiments on swarms of Crazyflie nano-quadrotors. CrazyChoir implements several tools to model swarms of Crazyflie nano-quadrotors and to run distributed, complex tasks both in simulation and in experiments.

The UAV Abstraction Layer (UAL) [10] is a software layer that abstracts users of unmanned aerial vehicles from the specific hardware of the platform and the autopilot interfaces. Its main objective is to simplify the development and testing of higher-level algorithms in aerial robotics by trying to standardize and simplify the interfaces with unmanned aerial vehicles. UAL supports operation with PX4 and DJI autopilots (among others),
which are currently leading manufacturers. Besides, UAL can work seamlessly with simulated or real platforms and provides calls to issue standard commands such as taking off and landing, or pose and velocity control.

XTDrone [11] is a UAV simulation platform based on PX4, ROS, and Gazebo. XTDrone supports multirotors (including quadrotors and hexarotors), fixed wings, VTOLs (including quadplanes, tailsitters, and tiltrotors), and other unmanned systems (such as UGVs, USVs, and robotic arms). It is convenient to deploy an algorithm to real UAVs after testing and debugging it on the simulation platform.

RotorS [12] is a modular Micro Aerial Vehicle (MAV) simulation framework, which allows a quick start to perform research on MAVs. The simulator was designed in a modular way so that different controllers and state estimators can be used interchangeably, while incorporating new MAVs is reduced to a few steps. The provided controllers can be adapted to a custom vehicle by simply changing a parameter file. Different controllers and state estimators can be compared with the provided evaluation framework. All components were designed to be analogous to their real-world counterparts. This allows the usage of the same controllers and state estimators, including their parameters, in the simulation and on the real MAV.

The Generalized Autonomy Aviation System (GAAS) [13] is an open-source program designed for fully autonomous VTOLs and drones. GAAS provides a fully autonomous flight platform based on lidar, HD-map relocalization, path planning, and other modules for aircraft. In contrast to the autopilot technology previously available only for consumer-grade drones, GAAS aims at robust, fully autonomous flight for human-carrying vehicles and can be easily combined with national air traffic control. The whole framework is loosely coupled, so users can customize their own modules and easily add them to GAAS.

TABLE II: Comparison of relevant open-source high-level control systems.

Flight Stack        Open Source  Modular  Tested in   Middleware  Soft. last update  Multi-frame  Rate output  Multi-agent  Multi-platform  Plugin oriented
Aerostack [3]       ✓            ✓        S, RL, RO   ROS         10/2021            ✗            ✓            ✓            ✓               ✗
AerialCore [6]      ✓            ✓        S, RL, RO   ROS         03/2023            ✓            ✓            ✓            ✗               ✓
Agilicious [7]      ✓            ✓        S, RL       ROS         03/2023            ✗            ✓            ✗            ✗               ✗
KumarRobotics [8]   ✓            ✗        S, RL, RO   ROS         12/2022            ✗            ✓            ✗            ✓               ✗
CrazyChoir [9]      ✓            ✗        S, RL       ROS 2       02/2023            ✗            ✓            ✓            ✗               ✗
UAL [10]            ✓            ✗        S, RL, RO   ROS         12/2022            ✓            ✗            ✗            ✓               ✗
XTDrone [11]        ✓            ✓        S           ROS         03/2023            ✗            ✓            ✗            ✗               ✗
RotorS [12]         ✓            ✓        S           ROS         07/2021            ✗            ✓            ✗            ✗               ✗
GAAS [13]           ✓            ✓        S           ROS         10/2021            ✗            ✗            ✗            ✗               ✗
Aerostack2 (Ours)   ✓            ✓        S, RL, RO   ROS 2       03/2023            ✓            ✓            ✓            ✓               ✓

S: Simulation, RL: real experiments in the lab, RO: real experiments outside the lab.

Of the nine high-level control systems analyzed, it is observed that (1) all of them are open source; (2) six have a modular structure; (3) four systems have been tested outside the laboratory, two others only on a real robot in the laboratory, and three only in simulation; (4) only one of the systems, CrazyChoir, uses ROS 2 as middleware, compared to ROS used by the rest; (5) six systems have undergone an update in the last six months; (6) only two, AerialCore and UAL, support references in different frames; (7) seven of the systems support acro control of the aircraft; (8) three systems have a multi-agent approach; (9) three other systems support more than one flight platform; and (10) only AerialCore has an architecture oriented towards the use of plugins.

III. A STACK OF SOFTWARE COMPONENTS FOR AERIAL ROBOTICS

The Aerostack2 framework is organized in the form of a software stack with components distributed in different hierarchical layers, as can be seen in Fig. 1. Components are organized hierarchically in several layers in such a way that components at one layer may use components of the lower layers (corresponding to less complex functionalities), but they do not use components of the higher layers. The layers are the following (from bottom to top):
• Middleware. At the lowest level is the software that supports the Aerostack2 environment, which consists of the Linux operating system and ROS (Robot Operating System), as well as general software libraries (e.g., OpenCV).
• Inter-process communication. This layer includes components to facilitate communication between processes that operate concurrently. These are message types for information exchange that define common data structures (specific to aerial robotics) to facilitate process interoperability.
• Interfaces with platforms and sensors. These are components that serve as interfaces with multiple kinds of aerial platforms and sensors. Aerostack2 has several interfaces that allow operating with both physical platforms (e.g., with Pixhawk or with DJI platforms) and simulated platforms (e.g., using simulated drones with the Gazebo environment), besides different types of sensors (e.g., USB cameras, RealSense depth cameras).
• Basic robotics functions. Aerostack2 includes a set of software components that implement specialized algorithms corresponding to essential aerial robotics functions for autonomous operation, such as state estimation,
motion control, and other basic functions (e.g., emergency handling, etc.).
• Behaviors. This level includes a set of components corresponding to different robot behaviors provided by Aerostack2 for autonomous operation. Each component encapsulates the algorithms used to implement a particular behavior (e.g., take off, hover, generate trajectory, etc.), together with mechanisms for execution monitoring to facilitate the specification of mission plans.
• Mission control. This level includes components that facilitate the specification of missions for autonomous drone operation. For example, behavior trees can be used to indicate and visualize with a hierarchical graphical structure the tasks performed by the drone. On the other hand, Aerostack2 also provides an API (application programming interface) that allows one to specify missions in a flexible way using the Python language. Additionally, Aerostack2 provides tools for the user to monitor and manually control the mission execution.
• Applications. The top level corresponds to the specific applications built with the components of the lower levels. Aerostack2 has examples of applications that can serve developers as a reference and guide on how to build drones that operate autonomously. In addition to applications with real drones, Aerostack2 has multiple applications in simulated environments with varying degrees of complexity to facilitate the learning of this technology.

Fig. 1: Overview of the software components provided by the Aerostack2 environment.

It is important to note that this modular organization of components is open to be used at any layer. This means that developers who use Aerostack2 may directly use the top-level components, but they may also use separately any intermediate layer or other individual component (e.g., the state estimator) for building a particular application. The following sections describe each layer of the software stack in more detail.

A. Interprocess communication

In order to organize each robot control architecture with multiple interacting processes operating concurrently, Aerostack2 has a standard data channel that is shared among the processes. To facilitate process interoperability, the content of the data channel follows conventions defined by standards that are applicable to aerial robotics. These standards are defined with respect to data structures and common names for communication mechanisms between processes (e.g., ROS 2 topics, services, and actions). Our framework follows some standards that have been defined by the ROS 2 community for aerial robots, and others (e.g., more specific message types or names of ROS 2 topics, services, and actions) have been defined specifically in Aerostack2. Table III illustrates some of the ROS topics used by Aerostack2, together with standard names and standard message types.

The content of the standard data channel is distributed in the following main groups:
• Sensor measurements. Sensor measurements are values corresponding to direct measurements recorded by sensors. This includes, for example, images from cameras, data from the IMU, data from GPS, etc.
• Actuator commands. These messages correspond to commands that are directly understandable by aerial platforms, such as thrust or, depending on the platform, values about the desired localization.
• Self localization. These values correspond to the robot localization in the environment, together with kinematic
values (e.g., speed), as they are believed by the robot. These values may be obtained, for example, by fusing sensor measurements with the help of extended Kalman filters (EKF).
• Motion reference. These messages are motion values that should be considered as goals by motion controllers. This includes, for example, desired values for pose, speed, trajectory, etc.
• Others: multi-robot communication messages, alert messages corresponding to emergency situations, the operation mode of the aerial platform, etc.

B. Platforms and sensors interfaces

The common data channel presented in the previous section includes sensor and actuator data represented in a generic way that is independent of the physical platform used. This makes it possible to keep part of the control architecture independent of the different platforms to be used, which facilitates the reuse of its components. The way to connect the architecture with each platform is through the construction of interfaces between the platform and the communication channel.

Aerostack2 incorporates a Platform abstraction class responsible for managing the capabilities associated with the direct integration of various aerial platforms into the framework. This abstraction facilitates the integration of new platforms into the framework by providing guidance on the integration steps, overloading functions of the Platform class, and ensuring compatibility with the entire framework.

Fig. 2: Aerial platform scheme.

The proposed framework's interaction with the Platform interface facilitates the integration of both physical and simulated interfaces, without requiring the rest of the framework to distinguish between the two. This feature is a fundamental pillar that strengthens the sim2real capabilities of the framework.

The responsibility of this interface is to gather sensor measurements from the aircraft and transmit them to the communication layer. Additionally, it is tasked with receiving actuator commands and other requests from the various layers of the Aerostack2 framework and relaying them to the aircraft in a platform-specific manner.

Moreover, there may be instances where additional sensors or actuators are required for a specific application, such as controlling a mounted arm or manipulating a different sensor. To address this, Aerostack2 incorporates the Sensor abstraction class, which simplifies the management of external sensors. It should be noted that the Aerostack2 framework is fully compatible with all ROS 2 drivers, enabling easy integration with previous community efforts.
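To make the role of this abstraction more concrete, the following minimal Python sketch shows the kind of interface a platform bridge exposes. The class and method names are illustrative assumptions made for this explanation only; the actual Aerostack2 Platform class, its methods, and its integration steps are defined by the framework itself.

from abc import ABC, abstractmethod


class PlatformInterface(ABC):
    """Illustrative bridge between one concrete autopilot or simulator and
    the common data channel (hypothetical names, not the real Aerostack2 API)."""

    @abstractmethod
    def read_sensors(self) -> dict:
        """Collect raw measurements from the vehicle and return them in the
        platform-independent representation used by the data channel."""

    @abstractmethod
    def send_command(self, command: dict) -> None:
        """Translate a generic actuator command into vendor-specific calls
        (e.g., MAVLink messages for a Pixhawk, SDK calls for a DJI platform)."""


class SimulatedPlatform(PlatformInterface):
    """A simulated backend; the rest of the framework cannot tell the difference."""

    def read_sensors(self) -> dict:
        # A real implementation would query the simulator here.
        return {"imu": None, "gps": None}

    def send_command(self, command: dict) -> None:
        print(f"forwarding to simulator: {command}")


platform = SimulatedPlatform()
platform.send_command({"type": "pose", "z": 1.0})

Overloading these functions for a new autopilot is, in essence, what integrating a new platform amounts to, while everything above the interface remains unchanged.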

C. Basic Robotic Functions

The Aerostack2 platform encompasses a collection of software components that perform fundamental robotic functions in aerial robotics. These components correspond to functions such as motion control, state estimation, and other essential functions (e.g., emergency handling, etc.) that support the autonomous operation of various types of aerial robots.

Fig. 3: General scheme of how a basic robotic function is implemented in Aerostack2.

In general, Aerostack2 components are designed to be general and reusable, with alternative algorithms implemented in the form of plug-ins and selected based on each particular situation and aerial robotic application. This way of structuring the components makes full sense for the basic robotic functions, due to the importance of these modules within the functioning of the entire system. In this architecture, each basic function is managed by a function manager, which is responsible for loading the plugins with each concrete algorithm (e.g., a PID controller) and for managing how they interact with the rest of the framework. The plugin selector can also provide meta-control features, such as plugin replacement and plugin bypass (which occurs when the motion reference can be directly handled by the aerial platform), while the input and output adapters adjust the input or output of the function plugin to the rest of the Aerostack2 framework. A schema of the architecture of a basic function is shown in Fig. 3.
ROS topic name                Message type                  Description
motion_reference/pose         geometry_msgs/PoseStamped     Actuator command for the multirotor specifying x, y, and z (m: meters) and roll, pitch, and yaw (rad: radians).
motion_reference/twist        geometry_msgs/TwistStamped    Actuator command for the multirotor specifying vx, vy, and vz (m/s: meters per second) and roll rate, pitch rate, and yaw rate (rad/s: radians per second).
motion_reference/trajectory   as2_msgs/TrajectoryPoint      Reference trajectory point specifying x, y, and z (m: meters), vx, vy, and vz (m/s: meters per second), ax, ay, and az (m/s²: meters per second squared), and yaw angle (rad: radians).
sensor_measurement/camera     sensor_msgs/Image             Raw image data received from a camera (a general camera).
sensor_measurement/imu        sensor_msgs/Imu               Inertial Measurement Unit data.
sensor_measurement/gps        sensor_msgs/NavSatFix         GPS (Global Positioning System) coordinates.

TABLE III: Example ROS topics and messages used by Aerostack2 for interprocess communication.
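As a minimal illustration of how these conventions can be used from ROS 2, the following sketch publishes a pose motion reference and listens to GPS measurements using the topic names of Table III. It assumes the node runs inside the corresponding robot namespace, and the frame name 'earth' is an assumption made for the example rather than a value prescribed by the table.

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from sensor_msgs.msg import NavSatFix


class ChannelDemo(Node):
    """Talks to the standard data channel using the topics of Table III."""

    def __init__(self):
        super().__init__('channel_demo')
        # Motion reference published towards the motion controller.
        self.pose_pub = self.create_publisher(
            PoseStamped, 'motion_reference/pose', 10)
        # GPS fixes published by the platform interface.
        self.gps_sub = self.create_subscription(
            NavSatFix, 'sensor_measurement/gps', self.on_gps, 10)
        self.timer = self.create_timer(0.1, self.send_reference)

    def on_gps(self, msg: NavSatFix) -> None:
        self.get_logger().info(
            f'GPS fix: {msg.latitude:.6f}, {msg.longitude:.6f}')

    def send_reference(self) -> None:
        ref = PoseStamped()
        ref.header.stamp = self.get_clock().now().to_msg()
        ref.header.frame_id = 'earth'   # assumed frame name for the example
        ref.pose.position.z = 2.0       # hover reference 2 m above the origin
        self.pose_pub.publish(ref)


def main():
    rclpy.init()
    rclpy.spin(ChannelDemo())


if __name__ == '__main__':
    main()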

Specifically, the two core functions of the framework are the following; their connection is shown in Fig. 4:
• Motion Control: This module listens to the motion reference commands generated by the top-level layers and converts them into actuator command signals that will be followed by the concrete aerial platform. Inside this module, the controllers process the references to generate control signals. The Controller Manager is responsible for ensuring a suitable combination of motion references, which have to be preprocessed according to the preferences of the current working plugin (e.g., the motion references shall be expressed in the reference frame that the controller expects), but also for adapting the actuation command signals and post-processing them into the format desired by the aerial platform. Moreover, the plugin selector has the capability to load multiple controllers in the form of plugins, being aware of the operation of each one and making the appropriate selection.
• State Estimator: This module combines the information received from different sensors to estimate the state of the aircraft, where the state refers to the position and speed of each aircraft over time. This module can load multiple state estimation algorithms in the form of plugins. The State Estimator Manager is responsible for generating the transformation trees (TF trees) that will be used by the rest of the framework (see the query sketched after this list). In addition, it is aware of the available plugins and is able to make the selection based on the environment.
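Since the state estimator exposes its output through TF, any component can query the estimated pose with the standard tf2 tools. The short sketch below shows such a query from Python; the frame names 'earth' and 'drone0/base_link' are assumptions for illustration, as the actual tree depends on the loaded state-estimation plugin.

import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener


class PoseQuery(Node):
    """Reads the latest estimated pose from the TF tree published by the
    state estimator (frame names are illustrative assumptions)."""

    def __init__(self):
        super().__init__('pose_query')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.timer = self.create_timer(1.0, self.query)

    def query(self) -> None:
        try:
            t = self.tf_buffer.lookup_transform(
                'earth', 'drone0/base_link', Time())
            p = t.transform.translation
            self.get_logger().info(
                f'drone0 at ({p.x:.2f}, {p.y:.2f}, {p.z:.2f})')
        except Exception as err:
            self.get_logger().warn(f'transform not available yet: {err}')


def main():
    rclpy.init()
    rclpy.spin(PoseQuery())


if __name__ == '__main__':
    main()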
Fig. 4: General scheme of basic robotics functions connection in Aerostack2.

In this category of components, there are also other components, such as emergency handlers, that implement emergency procedures to react quickly in the presence of unexpected situations, such as battery discharge or approaching forbidden areas. There are multiple agents that can trigger these emergencies and, depending on the severity of the emergency, the module corresponding to a low-level layer will handle it. For example, if the severity is low, the controller may be able to handle it, but if it is severe, the aerial platform itself should act.
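The plugin-based structure of these managers can be summarized with the following self-contained sketch. It is a conceptual illustration only: the real Aerostack2 managers and their plugins are ROS 2 components, and the class names, the simplified proportional law, and the single-axis state used here are assumptions made for brevity.

from dataclasses import dataclass


@dataclass
class Reference:
    z: float            # desired altitude (m)


@dataclass
class State:
    z: float            # estimated altitude (m)


class ControllerPlugin:
    """Interface that every interchangeable controller plugin implements."""

    def compute(self, ref: Reference, state: State) -> float:
        raise NotImplementedError


class AltitudeP(ControllerPlugin):
    """A toy proportional controller standing in for a full PID plugin."""

    def __init__(self, kp: float = 1.0):
        self.kp = kp

    def compute(self, ref: Reference, state: State) -> float:
        return self.kp * (ref.z - state.z)


class ControllerManager:
    """Loads one plugin and adapts its inputs/outputs for the platform."""

    def __init__(self, plugin: ControllerPlugin):
        self.plugin = plugin

    def update(self, ref: Reference, state: State) -> float:
        command = self.plugin.compute(ref, state)
        # An output adapter would convert this value to the command format
        # expected by the selected aerial platform.
        return command


manager = ControllerManager(AltitudeP(kp=0.8))
print(manager.update(Reference(z=2.0), State(z=1.5)))   # -> 0.4

Swapping AltitudeP for another ControllerPlugin subclass is the plain-Python analogue of replacing one controller plugin with another in the manager.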
D. Behaviors

Aerostack2 uses a specialized type of component, called behavior, which provides a logical layer to formulate mission plans in a uniform and more simplified way (compared to the direct use of state estimators and actuator controllers). Each behavior corresponds to a specific robot skill related, for example, to flight motion, such as taking off, landing, hovering, and following a path, or to other abilities (e.g., video recording, communication with other agents, etc.).

Using behaviors, a mission plan is expressed as a controlled sequence of activations (or deactivations) of multiple behaviors that may operate concurrently. Each behavior activation initiates the execution of a particular task described with certain parameters (e.g., following a particular path described with a list of waypoints). The result of each behavior execution is described in terms of success or failure, which is useful to determine the next step to be done during the mission.

We distinguish between two types of robot behaviors, according to their execution goals¹: (1) goal-based behaviors, which are defined to reach a final state or attain a goal (for example, in an aerial robot, the behavior taking off), and (2) recurrent behaviors, which perform an activity recurrently or maintain a desired state (for example, a behavior for visual marker recognition).

¹ This division is also mentioned in the robotics literature. For example, these two categories have been distinguished using the terms servo, for recurrent behaviors, and ballistic, for goal-based behaviors [14].

The notion of behavior in robotics has been used in the behavior-based paradigm [15] [16] (which mainly corresponds to reactive behaviors) and in behavior-based systems [17] (with both reactive and deliberative behaviors). In particular, a behavior in Aerostack2 may correspond not only to reactive behaviors (e.g., hovering) but also to
skills similar to mental abilities that use complex internal representations (e.g., generating a trajectory to reach a destination).

Fig. 5: Characteristics of the robot behavior used in Aerostack2 to implement robot skills following a distributed and modular approach.

The software components that implement behaviors in the Aerostack2 framework are specialized modules for robust plan execution, following a distributed organization that facilitates maintainability and the flexibility to add new behaviors in the future. Each behavior provides the following two main types of functions (Fig. 5):
• Task refinement. Robot behaviors execute tasks that cannot be executed directly by actuators because they are formulated in the abstract terminology used to specify mission plans by the user. Robot behaviors refine these tasks into more detailed operations (e.g., references to controllers, etc.). This is done in an adaptive way, in continuous interaction with the environment (e.g., waiting a certain amount of time to complete each step or selecting the most appropriate method according to the current situation of the environment).
• Execution monitoring. Each robot behavior observes and assesses how each task is executed (e.g., by contrasting observations with expectations), which is very important to facilitate robust operation. In contrast to other centralized approaches, Aerostack2 distributes execution monitoring into the behaviors. For example, each behavior checks locally whether the current situation of the environment satisfies the assumptions it needs to operate correctly (e.g., a behavior for visual marker recognition checks that the lighting assumptions in the environment are satisfied). Execution monitoring also detects when the execution has reached a prefixed goal or when there is a failure in the execution. For example, in the case of a task to follow a certain path, a failure can be detected if the robot does not arrive at the destination point within a maximum expected time or if the robot moves in the direction opposite to the one expected to lead towards the destination point.

Each software component that implements a behavior encapsulates the details of the algorithms used to execute the task, providing a uniform interface that is common to all behaviors. This interface is used to control the execution of the behavior with the following basic services: start, pause, resume, and stop. In addition, a service called modify is used to change the parameters of a behavior without the need to stop its execution. The behavior interface is also used to inform about the execution of the behavior with two separate outputs: the execution state of the behavior (e.g., idle, running, or paused) and periodic feedback during the execution of the behavior.

The manner in which the behaviors have been implemented is fully consistent with standard ROS 2 actions. Each behavior is linked to its own ROS 2 action, with added functionalities that enable actions to be paused, resumed, or even have their goals modified during execution.
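The following plain-Python sketch summarizes that uniform interface (start, pause, resume, stop, and modify, plus state and feedback). It is conceptual only: in Aerostack2 these operations are exposed through ROS 2 actions and services rather than direct method calls, and the names used here are illustrative.

from enum import Enum


class BehaviorState(Enum):
    IDLE = 0
    RUNNING = 1
    PAUSED = 2


class Behavior:
    """Conceptual version of the uniform behavior interface."""

    def __init__(self, name: str):
        self.name = name
        self.state = BehaviorState.IDLE
        self.goal = None

    def start(self, goal) -> bool:
        self.goal, self.state = goal, BehaviorState.RUNNING
        return True

    def pause(self) -> bool:
        self.state = BehaviorState.PAUSED
        return True

    def resume(self) -> bool:
        self.state = BehaviorState.RUNNING
        return True

    def stop(self) -> bool:
        self.state = BehaviorState.IDLE
        return True

    def modify(self, goal) -> bool:
        # Change the task parameters without stopping the execution.
        self.goal = goal
        return True

    def feedback(self) -> dict:
        # Periodic progress report used for execution monitoring.
        return {'behavior': self.name,
                'state': self.state.name,
                'goal': self.goal}


follow_path = Behavior('FollowPath')
follow_path.start([(0, 0, 3), (5, 0, 3)])
follow_path.modify([(0, 0, 3), (5, 5, 3)])   # re-target without stopping
print(follow_path.feedback())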
E. Mission control and supervision

Aerostack2 provides several mechanisms that help developers specify a mission plan and supervise its execution. The developer can write a mission plan to specify the set of tasks that a robot must perform in a particular mission. The specification of mission plans can be formulated in the form of robot behaviors. In this way, the user can specify the mission more easily by indicating complex tasks that the robot executes autonomously (e.g., following a certain path).

One of the solutions provided by Aerostack2 to specify missions is a Python API (Application Programming Interface). This is a convenient method for users familiar with computer programming languages and provides high flexibility for formulating plans with complex control regimes. Aerostack2 provides an API with a set of functions to activate or deactivate behaviors, or to directly send commands through motion reference messages. The next piece of code shows a simple example of a mission written in Python using the API provided by Aerostack2.
import rclpy
from as2_python_api.drone_interface import DroneInterface

rclpy.init()

# Initialize drone interface
drone_interface = DroneInterface("drone_sim_0")

# Arm and change into offboard mode
drone_interface.arm()
drone_interface.offboard()

# Take Off
drone_interface.takeoff(height=3.0, speed=1.0)

# Go To
drone_interface.go_to(0, 5, 3, speed=1.0)

# Follow Path
drone_interface.follow_path(
    [5, 0, 3],
    [5, 5, 3],
    [0, 0, 3],
    speed=1.0)

# Land
drone_interface.land(speed=0.5)

# Exit
drone_interface.shutdown()
rclpy.shutdown()
exit(0)

Listing 1: Python Mission Example

As an alternative to the Python API, the developer may use other mechanisms to specify mission plans. For example, mission plans may be formulated using behavior trees [18]. The modularity and hierarchical structure of behavior trees are useful during mission plan design, but also during mission execution thanks to graphical monitoring. Aerostack2 incorporates custom behavior tree nodes that can activate and deactivate robot behaviors. Fig. 6 shows a mission sketched in a behavior tree.

Fig. 6: Mission plan defined via a behavior tree.
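To make the idea concrete, the next sketch composes behavior activations into a tiny hand-rolled tree. It only illustrates the concept of Fig. 6: the behavior tree support actually shipped with Aerostack2 relies on its own custom tree nodes, and the glue to the Python API shown in the comments is a hypothetical example, not the real node implementation.

class Leaf:
    """Wraps one behavior activation; returns True on success."""

    def __init__(self, name, action):
        self.name, self.action = name, action

    def tick(self) -> bool:
        print(f'ticking {self.name}')
        return bool(self.action())


class Sequence:
    """Runs its children in order and fails as soon as one child fails."""

    def __init__(self, children):
        self.children = children

    def tick(self) -> bool:
        return all(child.tick() for child in self.children)


# Hypothetical glue to the Python API of Listing 1:
# mission = Sequence([
#     Leaf('arm', drone_interface.arm),
#     Leaf('takeoff', lambda: drone_interface.takeoff(height=3.0, speed=1.0)),
#     Leaf('go_to', lambda: drone_interface.go_to(0, 5, 3, speed=1.0)),
#     Leaf('land', lambda: drone_interface.land(speed=0.5)),
# ])
# mission.tick()

demo = Sequence([Leaf('take off', lambda: True), Leaf('land', lambda: True)])
print('mission succeeded:', demo.tick())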

It is important to note that, using these previous methods to specify mission plans, the developer has to take into account additional operational aspects to coordinate the execution, considering the relationships between concurrent behaviors (e.g., incompatibility or dependency). Such coordination mechanisms may be difficult to design and implement in complex autonomous robots. In these situations, additional solutions can be used based on the automatic coordination of behaviors [19].

In addition to the tools mentioned for the specification of the mission plan, Aerostack2 also provides other tools to monitor execution. These include two user interface tools that use information related to basic aerial robotic functions: (1) an alphanumeric viewer to monitor the state of specific variables (e.g., sensor measurements, values corresponding to state estimation, references for controllers) and (2) an interface for keyboard teleoperation, which can be used by the user to operate an aerial robot manually with the help of the keyboard.

The alphanumeric viewer is a powerful tool to get an overview of the system's operation, being very useful for the research and development of framework modules. The keyboard teleoperation, in turn, is a useful tool to manipulate the drone in a simple way, sending position and speed commands, which allows one to check the system behavior or take control when the autonomous logic fails.

In this category of components, Aerostack2 also provides a graphical user interface for using the software framework through a web-based application. This type of tool is very convenient to facilitate the rapid planning of a mission for one or several drones using graphical resources (e.g., a geographic map and graphical references), which can be done online or offline, allowing repeatability of missions, as shown in Fig. 8. This user interface is also useful to show graphically the details of the mission execution, allowing its monitoring and modification in real time.

Fig. 7: Aerostack2 Web-GUI system overview.

The schematic of this interface is shown in Fig. 7, where many users can connect through different devices and communicate with the agents through the network. This architecture provides the ability to plan and supervise missions from different devices, making the system robust and powerful to use. In addition, since it is a web interface, it can be used from any device, increasing its accessibility.
Fig. 8: Aerostack2 Web-GUI Mission Planning view.

IV. EXPERIMENTS & DISCUSSION

This section presents some experiments that show the capabilities of Aerostack2 performing drone swarming missions.

A. Sim2Real

This experiment studies the ease of moving from an experiment performed in a simulation environment to the real world. In this scenario, two drones have to cross two gates in a coordinated way.

The simulation was performed using Gazebo Ignition. The mission was planned using the Python API of Aerostack2. For this mission, just the basic behaviors are used: platform behaviors for offboarding and arming, and motion behaviors for taking off, flying, and landing. The localization of the drones and the gates is provided by the ground truth of the simulator, and for control, a PID controller with trajectory references is used.
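A possible two-drone version of such a mission script, written in the style of Listing 1, is sketched below. The second namespace "drone_sim_1", the waypoints, and the use of threads for concurrency are assumptions for illustration, not the exact script used in the experiment.

from threading import Thread

import rclpy
from as2_python_api.drone_interface import DroneInterface


def fly(drone: DroneInterface, path) -> None:
    # Basic platform and motion behaviors, as in Listing 1.
    drone.arm()
    drone.offboard()
    drone.takeoff(height=2.0, speed=1.0)
    drone.follow_path(*path, speed=1.0)   # cross the gates
    drone.land(speed=0.5)


rclpy.init()

drones = [DroneInterface("drone_sim_0"), DroneInterface("drone_sim_1")]
paths = [[[0.0, 2.0, 2.0], [4.0, 2.0, 2.0]],
         [[0.0, -2.0, 2.0], [4.0, -2.0, 2.0]]]

# Run both missions concurrently so the drones cross the gates in a
# coordinated way.
threads = [Thread(target=fly, args=(d, p)) for d, p in zip(drones, paths)]
for t in threads:
    t.start()
for t in threads:
    t.join()

for d in drones:
    d.shutdown()
rclpy.shutdown()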
                                                                        Fig. 10: Gate crossing real environment.

Fig. 9: Gate crossing simulation environment.

The real experiment was carried out in the CAR Robotics Arena, an area of 60 m² with a surrounding safety net. An OptiTrack motion capture (mocap) system provides the position of the drones and gates within the capture area. The UAVs used for this experiment were two Bitcraze Crazyflie 2.1 with IR markers for mocap localization. Due to the limited payload of these micro-UAVs, the computing was performed by the ground station computer.

Fig. 10: Gate crossing real environment.

Fig. 11: Trajectories performed by the drones during the gate crossing experiments: (a) simulation, (b) real.

Table IV shows a comparison between the components used in the simulation and real experiments. This shows that it
Module           Simulation                Real
   Mission Control      Python API             Python API
      Behaviors        MB, PB, TGB            MB, PB, TGB
   State Estimation    Ground Truth       Motion Capture System
   Motion Control      PID controller         PID controller
      Platforms       Gazebo Simulator          Crazyflie

TABLE IV: Comparison between Aerostack2 modules used
in simulation and real experiments. (MB) Motion behaviors,
(PB) Platform behaviors, (TGB) Trajectory Generator behav-
ior.

is only needed to change the platform and the state estimation
component (in this case, a plugin inside of it) to translate the
experiment from simulation to the real world.                      Fig. 13: Screen of the UI web monitoring the swarm during
B. Heterogeneous swarm                                             an outdoor flight. DJI M210 in red and Pixhawk F450 in
                                                                   blue.
                    Module           Component
                Mission Control       Web GUI
                   Behaviors           MB, PB                      the mission and controlling the drones. The computing was
                State Estimation        GPS                        performed onboard the drones, each drone having an instance
                Motion Control      PID controller                 of Aerostack2, and they communicated their position to
                   Platforms       DJI matrice, PX4
                                                                   the ground station for monitoring, allowing a high level of
TABLE V: Aerostack2 modules used in Gate crossing sim-             autonomy.
ulation experiment. (MB) Motion behaviors, (PB) Platform
                                                                   C. Use cases
behaviors.
                                                                     Aerostack2 has been used for several applications:
                                                                     • Wind turbine inspection: in the context of an industrial
   In this experiment, two different platforms are used at
the same time to perform a cooperative mission. These                  project that aims to increase the autonomy of drones
platforms, shown in Fig. 12, were:                                     in wind turbine inspection, some experiments have
                                                                       been carried out. Fig. 14 shows a drone controlled
   • Pixhawk F450: autopilot PX4 Pixhawk 4 mini on DJI
                                                                       by Aerostack2 in a simulated environment following
     F450 frame with Nvidia Jetson Xavier NX for onboard
                                                                       the movement of the wind turbine during inspection.
     computing, GPS, and Rangefinder as an altimeter.
                                                                       Subsequently, this simulated wind turbine was used to
   • DJI M210: DJI Matrice 210 RTK v2 with Nvidia Jetson
                                                                       give references to a real drone during a flight to simulate
     AGX Xavier for onboard computing.
                                                                       an inspection in the real world, as shown in Fig. 15. This
                                                                       shows the capability of Aerostack2 to be used to mix
                                                                       real-world flying with simulated information.

       (a) Pixhawk F450                    (b) DJI M210
Fig. 12: Aerial platform used during the heterogeneous
swarm experiment.

   This scenario represents the use of a swarm consisting of two drones to perform a simple aerial inspection. The mission was planned using the Web GUI, generating a path of GPS waypoints for both drones from an aerial image of the flying area. Once the mission was generated, both drones took off at the same time, and each one traveled along its path, landing when it finished.
   In this case, the use of two different models of drones makes no difference to Aerostack2 when planning the mission and controlling the drones. The computing was performed onboard the drones, each drone running its own instance of Aerostack2, and they communicated their position to the ground station for monitoring, allowing a high level of autonomy.

Fig. 13: Screenshot of the web GUI monitoring the swarm during an outdoor flight. DJI M210 in red and Pixhawk F450 in blue.
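   For illustration, a mission of this kind can be written as a short script on top of Aerostack2's Python mission interface. The sketch below is a minimal, hypothetical example: the module path, the DroneInterface constructor, and the takeoff/go_to/land method names and signatures are assumptions based on the public repository rather than details reported in this paper, and the GPS waypoints are placeholders.

# Hypothetical two-drone GPS waypoint mission sketched against the
# Aerostack2 Python mission API; class, module, and method names are
# assumptions drawn from the public repository, not from this paper.
import rclpy
from as2_python_api.drone_interface import DroneInterface  # assumed module path

# Placeholder GPS waypoints (lat, lon, alt) as produced by the Web GUI planner.
PATHS = {
    "drone0": [(40.4449, -3.6911, 30.0), (40.4452, -3.6915, 30.0)],
    "drone1": [(40.4450, -3.6920, 30.0), (40.4447, -3.6924, 30.0)],
}

def run_mission(drone: DroneInterface, path) -> None:
    """Arm, take off, follow the GPS path, and land a single drone."""
    drone.offboard()
    drone.arm()
    drone.takeoff(height=3.0, speed=1.0)                  # assumed signature
    for lat, lon, alt in path:
        drone.go_to.go_to_gps(lat, lon, alt, speed=2.0)   # assumed signature
    drone.land(speed=0.5)                                 # assumed signature

def main() -> None:
    rclpy.init()
    # One interface per aerial platform, addressed by its ROS 2 namespace;
    # each platform runs its own onboard Aerostack2 instance.
    drones = {ns: DroneInterface(drone_id=ns) for ns in PATHS}
    # For brevity the drones are commanded sequentially here; in the real
    # mission both paths were flown at the same time (e.g., one thread per drone).
    for ns, drone in drones.items():
        run_mission(drone, PATHS[ns])
    for drone in drones.values():
        drone.shutdown()
    rclpy.shutdown()

if __name__ == "__main__":
    main()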
C. Use cases

   Aerostack2 has been used for several applications:
   • Wind turbine inspection: in the context of an industrial project that aims to increase the autonomy of drones in wind turbine inspection, several experiments have been carried out. Fig. 14 shows a drone controlled by Aerostack2 in a simulated environment following the movement of the wind turbine during inspection. Subsequently, this simulated wind turbine was used to give references to a real drone during a flight, emulating an inspection in the real world, as shown in Fig. 15. This shows the capability of Aerostack2 to mix real-world flying with simulated information.
   • Photovoltaic plant inspection: as part of the development of an industrial project, the inspection of a photovoltaic plant was simulated with a swarm of drones. This simulation was made using Gazebo and the photorealistic simulator Flightmare [20]. The mission was planned with the Web GUI, defining the limits of the inspection area. Then, a path planning algorithm generates the path over the lines of panels that must be followed to inspect the area, based on the desired distance between survey points. After this, another algorithm divides this route into the individual paths that each drone must follow [21]; a minimal sketch of this partition step is given at the end of this subsection. This shows that Aerostack2 can be used for the development of industrial projects using a swarm of drones. Fig. 16 shows images taken during this experiment.
   • Gate crossing: the AI Nanocopter Challenge within IMAV'22 consists of flying a Bitcraze Crazyflie drone inside a small arena populated with obstacles and drone racing gates. Aerostack2 was able to pilot the UAV over Wi-Fi communication and, using a color segmentation algorithm (sketched at the end of this subsection), pass through several gates, achieving the highest number of gates passed among all participants. Fig. 17a shows the drone during this competition.
   • Package delivery: IMAV'22 also proposed a package delivery challenge consisting of flying a UAV carrying a package and delivering it to a given location. The maximum weight of the UAV with the payload was 5 kg. This required the construction of an ad hoc drone, using a Pixhawk autopilot on an F450 frame fitted with a package delivery device. In this scenario, Aerostack2 was able to plan the mission with the Web GUI and fly the drone in an outdoor environment using GPS references, as shown in Fig. 17b.
   • Maritime operation: the MBZIRC'23 competition proposes a maritime operation of search and evidence collection. This challenge consists of several drones searching for a specific boat in a vast maritime area, coordinated with an unmanned surface vehicle (USV) to take packages from this boat. The main difficulties of this scenario were the complete lack of GPS signal, the communication distances, and the coordinated maneuvers required to transport the packages. For this phase, the scenario was recreated in a realistic simulation environment built in Gazebo. Aerostack2 served as the framework for the trials of this competition, showing the capability to fly several coordinated drones, using custom localization and communication algorithms, and implementing neural networks for object detection, as shown in Fig. 18.

Fig. 14: Wind turbine inspection in simulation.

Fig. 15: Wind turbine simulated inspection in a real environment. References from a simulated moving wind turbine (left) were used to move a real drone (right).

(a) Web GUI mission monitoring    (b) Simulation environment    (c) Camera view from the drones
Fig. 16: Photovoltaic plant swarm simulated inspection.

(a) AI nanocopter    (b) Package delivery
Fig. 17: Using Aerostack2 during IMAV'22 in different challenges: AI nanocopter (left) and Package delivery (right).

Fig. 18: Maritime operation in the MBZIRC'23 competition.
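   As a companion to the photovoltaic plant use case, the following minimal sketch illustrates only the partition step, i.e., splitting an ordered coverage path into one sub-path per drone. It is a generic illustration with placeholder waypoints, not the planning algorithm of [21].

# Minimal sketch (not the algorithm of [21]): split an ordered panel-row
# coverage path into one contiguous sub-path per drone, balancing the
# number of survey points assigned to each.
from typing import List, Tuple

Waypoint = Tuple[float, float, float]  # (x, y, z) survey point in meters

def split_coverage_path(path: List[Waypoint], n_drones: int) -> List[List[Waypoint]]:
    """Divide an ordered coverage path into contiguous segments, one per
    drone, keeping the segment lengths as even as possible."""
    segments: List[List[Waypoint]] = []
    base, extra = divmod(len(path), n_drones)
    start = 0
    for i in range(n_drones):
        end = start + base + (1 if i < extra else 0)
        segments.append(path[start:end])
        start = end
    return segments

# Example: 10 survey points over two panel rows, shared among 3 drones.
points = [(float(i % 5) * 4.0, 0.0 if i < 5 else 8.0, 20.0) for i in range(10)]
for k, seg in enumerate(split_coverage_path(points, 3)):
    print(f"drone{k}: {len(seg)} waypoints")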
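   The color segmentation mentioned in the gate crossing use case can be pictured with the following minimal OpenCV sketch, which thresholds the onboard camera image in HSV space and returns the center of the largest blob as the gate candidate. The color range, the morphological filtering, and the assumption of a single-colored gate are illustrative placeholders, not the parameters used in the competition.

# Minimal color-segmentation sketch for gate detection; thresholds and the
# single-color-gate assumption are placeholders, not competition parameters.
import cv2
import numpy as np

def detect_gate_center(bgr_image: np.ndarray):
    """Return the (u, v) pixel center of the largest gate-colored region,
    or None if no candidate region is found."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Placeholder HSV range for an orange-ish gate.
    mask = cv2.inRange(hsv, np.array([5, 120, 120]), np.array([25, 255, 255]))
    # Remove small speckles before extracting blobs.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])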
   The use of Aerostack2 in these applications shows the capability of this framework to be used in very different scenarios, with completely different requirements.
              V. CONCLUSIONS AND FUTURE WORK

   This paper has presented an innovative open-source framework for the development of aerial robot systems. The framework's key capabilities, including multi-robot orientation, platform independence, versatility, and modularity, have been demonstrated through a series of experiments conducted in diverse scenarios, both in simulation and in the real world.
   Future work will involve the continued development of new behaviors and metacontrol capabilities, as well as expanding the number of behaviors, controllers, state estimators, and platforms supported by the system through collaboration with the aerial robotics community.
                    ACKNOWLEDGMENTS

   Funding. This work has been supported by the project COPILOT ref. Y2020\EMT6368 "Control, Monitoring and Operation of Photovoltaic Solar Power Plants by means of synergic integration of Drones, IoT and advanced communication technologies", funded by the Madrid Government under the R&D Synergic Projects Program.
   We acknowledge the support of the European Union through the Horizon Europe Project No. 101070254 CORESENSE.
   This work has also been supported by the project INSERTION ref. ID2021-127648OBC32, "UAV Perception, Control and Operation in Harsh Environments", funded by the Spanish Ministry of Science and Innovation under the program "Projects for Knowledge Generating". The work of the third author is supported by Grant FPU20/07198 of the Spanish Ministry for Universities. The work of the fifth author is supported by the Spanish Ministry of Science and Innovation under its Program for Technical Assistants PTA2021-020671.
   Data and Materials. All materials are open source and accessible at https://github.com/aerostack2/aerostack2 under the BSD-3-Clause license.

                        REFERENCES

 [1] Steve Macenski, Francisco Martín, Ruffin White, and Jonatan Ginés Clavero. The marathon 2: A navigation system. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020.
 [2] David Coleman, Ioan Sucan, Sachin Chitta, and Nikolaus Correll. Reducing the barrier to entry of complex robotic software: a MoveIt! case study, 2014.
 [3] Jose Luis Sanchez-Lopez, Ramón A. Suárez Fernández, Hriday Bavle, Carlos Sampedro, Martin Molina, Jesus Pestana, and Pascual Campoy. Aerostack: An architecture and open-source software framework for aerial robotics. In 2016 International Conference on Unmanned Aircraft Systems (ICUAS), pages 332–341. IEEE, 2016.
 [4] Steven Macenski, Tully Foote, Brian Gerkey, Chris Lalancette, and William Woodall. Robot Operating System 2: Design, architecture, and uses in the wild. Science Robotics, 7(66):eabm6074, 2022.
 [5] Emad Ebeid, Martin Skriver, Kristian Husum Terkildsen, Kjeld Jensen, and Ulrik Pagh Schultz. A survey of open-source UAV flight controllers and flight simulators. Microprocessors and Microsystems, 61:11–20, 2018.
 [6] Tomas Baca, Matej Petrlik, Matous Vrba, Vojtech Spurny, Robert Penicka, Daniel Hert, and Martin Saska. The MRS UAV system: Pushing the frontiers of reproducible research, real-world deployment, and education with autonomous unmanned aerial vehicles. Journal of Intelligent & Robotic Systems, 102(1):26, 2021.
 [7] Philipp Foehn, Elia Kaufmann, Angel Romero, Robert Penicka, Sihao Sun, Leonard Bauersfeld, Thomas Laengle, Giovanni Cioffi, Yunlong Song, Antonio Loquercio, et al. Agilicious: Open-source and open-hardware agile quadrotor for vision-based flight. Science Robotics, 7(67):eabl6259, 2022.
 [8] Kartik Mohta, Michael Watterson, Yash Mulgaonkar, Sikang Liu, Chao Qu, Anurag Makineni, Kelsey Saulnier, Ke Sun, Alex Zhu, Jeffrey Delmerico, Dinesh Thakur, Konstantinos Karydis, Nikolay Atanasov, Giuseppe Loianno, Davide Scaramuzza, Kostas Daniilidis, Camillo Jose Taylor, and Vijay Kumar. Fast, autonomous flight in GPS-denied and cluttered environments. Journal of Field Robotics, 35(1):101–120, 2018.
 [9] Lorenzo Pichierri, Andrea Testa, and Giuseppe Notarstefano. CrazyChoir: Flying swarms of Crazyflie quadrotors in ROS 2. arXiv preprint arXiv:2302.00716, 2023.
[10] Fran Real, Arturo Torres-González, Pablo Ramón Soria, Jesús Capitán, and Anibal Ollero. Unmanned aerial vehicle abstraction layer: An abstraction layer to operate unmanned aerial vehicles. International Journal of Advanced Robotic Systems, 17(4):1–13, 2020.
[11] Kun Xiao, Shaochang Tan, Guohui Wang, Xueyan An, Xiang Wang, and Xiangke Wang. XTDrone: A customizable multi-rotor UAVs simulation platform. In 2020 4th International Conference on Robotics and Automation Sciences (ICRAS), pages 55–61. IEEE, 2020.
[12] Fadri Furrer, Michael Burri, Markus Achtelik, and Roland Siegwart. Robot Operating System (ROS): The Complete Reference (Volume 1), chapter RotorS—A Modular Gazebo MAV Simulator Framework, pages 595–625. Springer International Publishing, Cham, 2016.
[13] Generalized autonomy aviation system.
[14] Joseph L. Jones. Robot Programming: A Practical Guide to Behavior-Based Robotics. McGraw-Hill, 2002.
[15] Rodney Brooks. A robust layered control system for a mobile robot. IEEE Journal on Robotics and Automation, 2(1):14–23, 1986.
[16] Ronald C. Arkin. Behavior-Based Robotics. MIT Press, 1998.
[17] François Michaud and Monica Nicolescu. Behavior-based systems. Springer Handbook of Robotics, pages 307–328, 2016.
[18] Michele Colledanchise and Petter Ögren. Behavior Trees in Robotics and AI. CRC Press, July 2018.
[19] Martin Molina and Pablo Santamaria. Behavior coordination for self-adaptive robots using constraint-based configuration. arXiv preprint arXiv:2103.13128, 2021.
[20] Yunlong Song, Selim Naji, Elia Kaufmann, Antonio Loquercio, and Davide Scaramuzza. Flightmare: A flexible quadrotor simulator. In Conference on Robot Learning, 2020.
[21] Marco Andrés Luna, Mohammad Sadeq Ale Isaac, Ahmed Refaat Ragab, Pascual Campoy, Pablo Flores Peña, and Martin Molina. Fast multi-UAV path planning for optimal area coverage in aerial sensing applications. Sensors, 22(6):2297, 2022.