Simulate and Deploy UAV Applications with SIL and HIL Workflows
Unmanned aerial vehicles (UAVs) are safety-critical systems where simulation and testing are essential for verifying control performance before conducting test flights. The latest developments in MathWorks tools let you integrate UAV onboard computers, ground control stations, and autopilots with plant models in Simulink® and scenario simulation using Unreal Engine® for various autonomous flight applications.
In this talk, you will learn about:
- Software-in-the-Loop (SIL) workflow to deploy UAV waypoint following on an onboard computer (NVIDIA® Jetson™) and test it with PX4 SIL simulation
- Hardware-in-the-loop (HIL) workflow using the PX4 Hardware Support Package and an onboard computer (NVIDIA Jetson)
- Integrating a Simulink plant model for UAV dynamics with HIL simulation and using depth images from Unreal Engine® to test the flight controllers for obstacle avoidance
- Deployment on Pixhawk running in HIL mode
- UAV scenario simulation with photorealistic scenes using Unreal Engine
- The new UAV Scenario Designer app, which lets you design and play back UAV trajectories based on OSM city maps
Published: 29 May 2022
All right. Hello, everyone. Welcome to this MATLAB Expo session on UAV applications with SIL and HIL workflows. I am Mihir Acharya, product manager for autonomous navigation and UAV applications at MathWorks. And I would like to introduce my colleagues Ronal George and Julia Antoniou from the application engineering team at MathWorks.
So our agenda for the talk today is to introduce you to the software-in-the-loop and hardware-in-the-loop capabilities that we offer using Simulink and PX4 integration. And I'm here to learn along with you from Ronal and Julia how we can simulate and deploy UAV algorithms in virtual scenarios. So let me be your guide today for this talk, and let's start by looking at what we want to achieve as our end goal.
So when I think about drones and unmanned aerial vehicles, there are various applications that come to my mind, such as that in agricultural fields for surveillance or inspection. But what excites me the most is to see a drone flying through the city block. And just imagine how fast a medical response would be in that case.
But flying a drone in a real-world scenario like this must go through a lot of safety and validation checks. And that's where simulation comes into the picture. So going one step prior to the deployment in real world, you would want to model the drone and design algorithms to simulate it in virtual scenarios, where you can also provide the simulation with a mission plan as an input.
And even then, deploying and flying the drone is still a bit far. We first deploy the flight controllers to a drone autopilot and test the drone behavior in various conditions where things can go south, for which we need to design the flight controller and simulate it along with the dynamics model or plant model of the drone. And we do this by planning the mission either using simulated waypoints or inputs from real-world maps.
Now, let's see how these building blocks are put together for in-the-loop simulations. So what we are going to learn from Ronal today are three variations of simulation with drone autopilot. The first one is modeled in the loop, where the flight controller and plant model are designed in Simulink and operating in a feedback loop within the host PC.
Now, we can also run the flight controller as a C++ executable using the autopilot's firmware on the host PC. And this is what we call a Software In the Loop, or SIL. And when we deploy the flight controller to the autopilot hardware board, we call it hardware in the loop.
Now, we can also deploy the plant model on the hardware. And this is what we refer to as full HIL workflow, where the actuators, sensors, and other drone peripherals are also in the loop. However, for this talk, our focus is what we call as the PX4 HIL workflow, where the flight controller is deployed on PX4 hardware board, and the plant model is in Simulink within the host PC.
Now, to see the things a little bit better, we would like to also visualize the behavior of the UAV plant model in a virtual scenario. And this is what Julia will walk us through, bringing in the cuboid scenarios or Unreal gaming engine interface with Simulink that gives us photorealistic simulations. And when we have a virtual scenario in place, we can simulate sensor models to collect simulated data to test autonomy algorithms. And this completes our workflow with deploying the autonomy algorithms on an embedded hardware board such as NVIDIA Jetson.
So we saw that we are using PX4 for autopilot, or Pixhawk here, to deploy the flight controller. So let's bring in Ronal now. Hey, Ronal, can you tell us a little bit about Pixhawk?
Sure, Mihir. So Pixhawk is actually an advanced autopilot. It's the little credit card-sized autopilot you see here. It's commonly used with PX4 and ArduPilot. It comes with multiple onboard sensors. You can see them listed there. It also has support for different communication protocols: serial, I2C, CAN.
You can attach a GPS to it. You can have commands come from a telemetry or RC transmitters. And it provides outputs to your motors.
Great. So I see there's a lot of good stuff in the Pixhawk. And I think we can interface with it using MATLAB. Can you tell us a little about how we support the interface?
Sure. As you can see, we provide a lot of support for the PX4 firmware. We provide blocks to read uORB messages, which is the internal communication protocol for PX4, and for reading from sensors.
Great. So can you tell us a little about how this relates to the PX4 architecture?
Absolutely. The uORB blocks let you subscribe or publish to topics on the PX4 message bus. Additionally, you can see that we have specific blocks to work with the sensor hub, the GPS, the RC input, the estimator, or the output driver. One of the key features we support is replacing the position controller, or the attitude and rate controllers, with controllers developed in Simulink. And today I'll be showing you some of these capabilities.
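To make the publish/subscribe pattern concrete, here is a minimal sketch of a topic bus like the one uORB provides inside PX4. This is an illustration of the concept only; the class, topic names, and message contents here are invented for the example and are not the actual uORB API.

```python
# Minimal publish/subscribe topic bus, illustrating the uORB pattern.
# Not the real PX4 API: names and message layout are illustrative.
from collections import defaultdict

class TopicBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic name -> callbacks
        self._latest = {}                      # topic name -> last message

    def subscribe(self, topic, callback):
        """Register a callback invoked on every publish to `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Store the latest message and notify all subscribers."""
        self._latest[topic] = message
        for cb in self._subscribers[topic]:
            cb(message)

    def latest(self, topic):
        """Poll the most recent message on a topic."""
        return self._latest.get(topic)

bus = TopicBus()
received = []
bus.subscribe("vehicle_attitude", received.append)
bus.publish("vehicle_attitude", {"roll": 0.01, "pitch": -0.02, "yaw": 1.57})
print(received[0]["yaw"])
```

A Simulink uORB read or write block plays the role of `subscribe` or `publish` here: each controller or driver stage only sees the topics it asks for.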
So let's take a step back and look at our larger workflow that Mihir showed. I'd like to walk you through each of these steps: running the controller as a model in the loop, software in the loop, and then hardware in the loop. So let's jump right in.
So to start, let's look at running the entire model, which includes the flight controller and the plant model, directly in Simulink. We'll be providing simulated commands to the flight controller to see the behavior of the plant. Great. So here you can see the entire model is running in Simulink.
I'm providing some commands, like changing the altitude and turning circular guidance on and off. And we get a really nice visualization of the actual plant. Now, I'll jump into the model here in a second.
Great. So our model is actually built of three subsystems. The first one is the commands. The second one is the controller. So the commands go to the controller, and from the controller, we provide actuator commands to the actual plant. The plant subsystem also has our visualization.
So let me jump in a little closer into our controller. So the first step is converting MAVLink messages to uORB messages before we send them to the actual flight controller. Within the flight controller, you have a subsystem to condition those signals, and then the actual control itself, which is an attitude, position, and altitude controller.
Now, this is going to generate our actuator outputs. They get converted back to MAVLink messages that get sent to the plant. Now, the plant model has to read those MAVLink messages and provide inputs to the quadcopter model. So let's take a look at that.
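To give a feel for the cascaded structure just described (an outer position/altitude loop feeding an inner attitude/rate loop), here is a minimal sketch. The gains, the hover-thrust value, and the one-axis simplification are all illustrative assumptions, not the PX4 or Simulink controller defaults.

```python
# Sketch of a cascaded flight-control structure: an altitude loop
# producing thrust, and an attitude->rate cascade producing torque.
# All gains and constants are illustrative, not PX4 defaults.

def altitude_controller(z_cmd, z, vz, kp=1.2, kd=0.8, hover_thrust=0.5):
    """Outer altitude loop: altitude error -> normalized thrust in [0, 1]."""
    thrust = hover_thrust + kp * (z_cmd - z) - kd * vz
    return min(max(thrust, 0.0), 1.0)

def attitude_rate_controller(angle_cmd, angle, rate, kp_angle=4.0, kp_rate=0.1):
    """Inner cascade: angle error -> rate command -> torque command."""
    rate_cmd = kp_angle * (angle_cmd - angle)
    return kp_rate * (rate_cmd - rate)

# Climb demand: 1 m below the commanded altitude, already rising slightly.
thrust = altitude_controller(z_cmd=10.0, z=9.0, vz=0.5)
torque = attitude_rate_controller(angle_cmd=0.1, angle=0.0, rate=0.0)
print(thrust, torque)
```

The actuator outputs of a controller like this are what get packed into messages and sent to the plant model.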
Here you see there's a quadcopter model and some other subsystems. Diving into the quadcopter model, there's a force and moments block. There's a 6DOF rigid body dynamics block. This is what's actually going to give us the state of our plant.
Now, outside this model, actually one subsystem inside, you can see we have three other blocks. These are our sensors. They actually generate MAVLink messages. So you can see, we use a MAVLink blank message to create our message bus.
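The forces-and-moments plus rigid-body step described above can be sketched as follows. To stay short, only the vertical axis is integrated here (the real block is full 6DOF), and the mass and thrust constants are illustrative assumptions.

```python
# Minimal forces-and-moments plus rigid-body integration step,
# sketching what a quadcopter plant block computes. Vertical axis
# only; mass and thrust gain are illustrative, not from the demo.

G = 9.81        # gravity, m/s^2
MASS = 1.5      # vehicle mass, kg (illustrative)
K_THRUST = 8.0  # thrust per unit motor command, N (illustrative)

def plant_step(state, motor_cmds, dt=0.01):
    """One Euler step of altitude z and vertical speed vz."""
    z, vz = state
    thrust = K_THRUST * sum(motor_cmds)  # total rotor force, N
    az = thrust / MASS - G               # net vertical acceleration
    return (z + vz * dt, vz + az * dt)

# Simulate 1 s with all four motors slightly above hover.
state = (0.0, 0.0)
for _ in range(100):
    state = plant_step(state, motor_cmds=[0.5, 0.5, 0.5, 0.5])
print(state)
```

The state this integration produces is what the simulated sensors would then wrap into MAVLink messages for the controller.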
Great. So putting all this together, you can run the entire flight controller and plant model in Simulink. I know that was a lot of information. But this is all that we have to do in Simulink. You can model your controller, you can model your plant, and run it all in Simulink.
Perfect. So people may ask, why build our plant model in Simulink? When building your plant model, you have the freedom to decide on the level of fidelity that you want. You can use a multibody model, which is a really high-fidelity physical model, move down to a medium-fidelity model, or finally, just a low-fidelity model. Here we can use our guidance models, which are an approximate representation of the plant. So you decide on the fidelity you need to represent your plant, whether you're building UAV platforms or designing UAV algorithms.
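As a rough sketch of what "low fidelity" means here: a guidance-style model can approximate the whole closed-loop response to a high-level command as a simple first-order lag, with no motors or rigid-body dynamics at all. The time constant and step size below are illustrative assumptions.

```python
# Sketch of a low-fidelity "guidance model" plant: the closed-loop
# response to a command is approximated as a first-order lag.
# Time constant and step size are illustrative.

def guidance_step(value, value_cmd, dt=0.02, tau=0.5):
    """The state moves toward the command with time constant tau."""
    return value + (value_cmd - value) * dt / tau

# Command a heading of 1.0 rad and let the lag settle over 5 s.
h = 0.0
for _ in range(250):
    h = guidance_step(h, value_cmd=1.0)
print(round(h, 3))
```

A model like this runs orders of magnitude faster than a multibody simulation, which is why it is useful early on when designing higher-level UAV algorithms.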
Great. So we talked about actually going through a model in the loop. Now we'll look at actually moving to software in the loop. So this is where we generate C++ code. So you can see that I'm actually running two models. On top is my flight controller. On the bottom is my plant.
Now, the flight controller is actually running as software in the loop. And I'll show you how we get to that in a second. The neat feature is that when we run software in the loop, we're actually running Monitor and Tune.
It's another way of communicating with the plant, so I still get to provide attitude and altitude commands. And I can enable and disable circular guidance. So you can still provide simulated inputs here. How cool is that?
Great. So I'm going to go through these models side by side. On the right, you have my flight controller. And going in, you see it's the same flight controller model that we saw earlier. Of course, it has PX4 interfaces and a PX4 output bus.
On the left, we have our plant. This is the same plant that we saw in our earlier step. It has the same quadcopter model. But it has UDP interfaces to actually communicate those MAVLink messages to our software in the loop.
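The UDP link between the plant model and the SIL executable can be pictured with a short sketch: one side packs a timestamped state sample into a datagram, the other side reads it back. The port number and payload layout below are illustrative assumptions, not the PX4 SIL wire format or the MAVLink encoding.

```python
# Sketch of the UDP exchange between a plant process and a SIL
# controller process. Port and payload layout are illustrative,
# not the actual PX4 SIL or MAVLink format.
import socket
import struct

PORT = 14560  # illustrative local port

# "Controller" side: a listening UDP socket.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))

# "Plant" side: pack a timestamp (usec) and x, y, z, then send.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
payload = struct.pack("<Qfff", 123456, 1.0, 2.0, -10.0)
tx.sendto(payload, ("127.0.0.1", PORT))

data, _ = rx.recvfrom(1024)
t_us, x, y, z = struct.unpack("<Qfff", data)
print(t_us, x, y, z)
rx.close(); tx.close()
```

In the actual demo, these datagrams carry MAVLink messages, but the mechanics of bind, send, and receive are the same.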
So I'm going to select Host Target as my hardware board. And I'm going to click Monitor and Tune. So what this does is it's going to actually generate C++ code. And it'll launch an executable that represents our flight controller software.
As soon as that code is being generated and the executable is launched, it'll give me a signal to say, hey, go ahead and run your plant. It'll happen here in a second. So as soon as it tells me to enable my plant, I can go ahead and run my plant. And my plant will start communicating with the C++ executable.
I know it looks like there's not a lot happening here. But as soon as this starts running, I'm also going to go and enable the animation. So here you have to note that the flight controller's running as software in the loop, and the plant is actually running in simulation in Simulink.
Let me go ahead and enable the animation so you can see the behavior. Great. So similar to what you saw before, I can provide commands as and when I need. Test the behavior, test aggressive maneuvers, and see if your plant responds the way you want it to. I can go in, quickly modify, and regenerate code. So now you can test the performance of your flight controller when running as C++ code.
Now we're going to look at deploying the flight controller directly to your PX4, which is connected via USB. Your flight controller will be running as hardware in the loop, and our plant will be simulated in Simulink. They communicate through a COM port.
Great. So here you'll see that I've just confirmed a mission from QGC. This mission gets pushed onto my PX4. And on the bottom, I have my flight controller model. But that's not running, because it's actually deployed.
On the right, I have my Simulink. You can see I can actually read states of my plant while running this. So it's really easy to monitor the messages that are coming to your plant in Simulink.
So I'll walk you through how we actually built this entire setup, piece by piece. We just wait for our entire mission to complete. And that's where we can confirm that QGC has completed the mission that we defined.
So let's take a closer look. Again, I have my flight controller, which is reading uORB messages, because that's how the internal protocol in Pixhawk works. It's the same flight controller in the center. And then I have output messages going to the output driver.
On the left-hand side, I actually have my plant model, which communicates through a COM port. Now, the plant model stays the same. And I have some sensors that I'm also simulating. When you run hardware in the loop, the sensors on the Pixhawk are disabled.
So you can see I have my Pixhawk connected. I'm going to select the right board. And I'm going to hit Deploy. So as soon as this happens, it's going to tell me that I have to reboot my Pixhawk. That's required when you have to actually deploy or upload a new firmware.
Once uploaded, QGC is going to launch. And it's going to wait for communication from the vehicle. So one thing that we have changed here is that QGC will only autoconnect via UDP. So the next step for me is to actually go ahead and run my plant model.
As soon as my plant model's ready to run, QGC has connected. This is because the plant model actually acts as a communication bridge. It takes in MAVLink messages and sends MAVLink messages to QGC.
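The "communication bridge" role just described can be sketched in a few lines: datagrams arriving from the autopilot side are relayed unchanged to the ground station side. The port numbers here are illustrative assumptions, and real MAVLink payloads are used in place of the placeholder bytes.

```python
# Sketch of the plant model's bridge role: relay datagrams from
# the autopilot side to the ground station side. Ports and the
# payload are illustrative.
import socket

AUTOPILOT_PORT = 14550  # illustrative
QGC_PORT = 14551        # illustrative

def forward_one(rx_sock, tx_sock, qgc_addr):
    """Receive one datagram from the autopilot side and relay it to QGC."""
    data, _ = rx_sock.recvfrom(2048)
    tx_sock.sendto(data, qgc_addr)
    return data

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", AUTOPILOT_PORT))
qgc = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
qgc.bind(("127.0.0.1", QGC_PORT))

# Simulate the autopilot sending one message, then relay it.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(b"HEARTBEAT", ("127.0.0.1", AUTOPILOT_PORT))
relayed = forward_one(rx, tx, ("127.0.0.1", QGC_PORT))
received, _ = qgc.recvfrom(2048)
print(received)
for s in (rx, qgc, tx):
    s.close()
```

Because the relay is transparent, QGC sees the deployed autopilot as if it were connected directly, which is why it autoconnects as soon as the plant model starts running.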
Great. So I'm going to go ahead and create a new mission. This includes a takeoff to a certain altitude. I'm also going to add a couple of waypoints to this mission.
And then I can upload this mission. And once ready, I can go ahead and confirm it. So just to reiterate: the flight controller is actually running on my hardware, connected via USB to my computer. The plant is running in Simulink. And of course, QGroundControl acts as my ground control station.
Thanks, Ronal. That was really interesting to see how we can simulate the drone and see it on a real-world map before flying it in the real world. But since I have the controller deployed on the PX4, can I just attach this PX4 to a drone now? I would be really excited to just go and fly. But do you think we are ready now?
I mean, in some situations, sure, if all you wanted to do is follow some waypoints, absolutely. But in most real-world cases, where you have to identify a target or avoid an obstacle, you may have to do more than that. Julia, do you have some use cases or examples from engineers you've worked with?
Sure. Yeah, so a lot of the engineers I work with in industry, their end goal is not necessarily just flying and controlling a UAV in a controlled environment, but making it do some kind of autonomous task, so for example, delivering a package in a city. But the more autonomous we make our UAV, the more algorithms we add to the system.
And all these algorithms are going to interact with one another. So for example, like shown in this diagram, our onboard autonomy could send commands to the flight controller based on the sensor data that we get. So like Mihir mentioned earlier, verifying these higher-level algorithms in simulation is going to be very important.
So taking our city navigation package delivery example, in reality, the environment the UAV is going to have to operate in will look a lot more like this. So our UAV needs to safely coexist with all these pedestrians, the vehicles, the buildings, the infrastructure. And I'm sure you'd all agree, if it were me designing these algorithms, I'd want to be as confident as possible that my system is going to perform as expected before I do any kind of tests in environments like this one.
And desktop simulation is a really great way to build that confidence, not just in your autonomy algorithms, but in how your entire system is performing as one cohesive unit. And MATLAB and Simulink make building this kind of full-system simulation easier. There's a lot of out-of-the-box functionality to simulate environments and sensor data, both for a cuboid environment, like you see here, and for connecting with photorealistic environments.
So I know I've talked to engineers who are hesitant to do environmental simulation because of the time investment it could require. So that's why we've been working on tools like the one you see here that help you minimize that time investment up front. This one's called the UAV Scenario Designer.
So with this app, we can really quickly build cuboid-style environments natively in MATLAB, which is really cool. So we can build simple environments by dragging and dropping elements onto the scenario canvas, or we can even build more complex environments like the one that you see here. We can import from files to do this, like DTED files or even STL files.
And once we have that environment designed, we can add our UAVs. We can outfit them with sensors, like LiDAR. And like you see here, we can visualize our UAV executing a path through our environment.
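Visualizing a UAV executing a path like this boils down to stepping the vehicle along its waypoint list over time. As a sketch of the idea, here is piecewise-linear interpolation between waypoints at constant speed; the waypoints and speed are illustrative, and this is not the UAV Toolbox API.

```python
# Sketch of stepping a UAV along a waypoint path: piecewise-linear
# interpolation at constant speed. Waypoints and speed are
# illustrative, not from the demo.

def position_at(waypoints, speed, t):
    """Position after flying t seconds along the waypoint list."""
    dist = speed * t
    for (x0, y0, z0), (x1, y1, z1) in zip(waypoints, waypoints[1:]):
        seg = ((x1 - x0)**2 + (y1 - y0)**2 + (z1 - z0)**2) ** 0.5
        if dist <= seg:
            f = dist / seg
            return (x0 + f*(x1-x0), y0 + f*(y1-y0), z0 + f*(z1-z0))
        dist -= seg
    return waypoints[-1]  # past the end: hold the final waypoint

wps = [(0, 0, 10), (100, 0, 10), (100, 50, 20)]
print(position_at(wps, speed=5.0, t=10.0))  # halfway along the first leg
```

Sampling this function at each simulation step gives the pose sequence that a scenario player animates, and that sensor models are evaluated against.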
So Simulink is a really natural place to, then, come and integrate all the different components of our system together. We already combined our UAV dynamics model and the controllers in Simulink for the SIL and HIL examples that we just saw before with Ronal. And now we can build on that model further.
So we can add our environment that we just made in that app, and our sensor models. And that workflow diagram we've been showing this entire time, this is really when it comes to life. Instead of a static diagram, it becomes an executable model that you can really simulate to see how your system performs.
And integrating that cuboid environment we built in the previous example is pretty easy using these off-the-shelf blocks from the UAV Toolbox Simulink library, like you see here. And when we use those blocks, we automatically get a visual of our UAV's flight, and it simulates sensor data as the model is running. So in this case, we actually get to see the LiDAR being projected onto the buildings in this simple city block scene.
And then when you get to the point in your project where you start to need those higher-fidelity photorealistic environments, we make it easy to connect with Unreal scenes too. So you can see here, we have a very similar library of blocks that connect with an Unreal scene. And same as before, once we start running our model, we can get a visual of our UAV's flight and the simulated sensor data. So in this case, we're modeling LiDAR and camera, which gives us the ability to do more complex perception algorithms, like obstacle detection and avoidance. And you'll also notice that when we get to the photorealistic level of fidelity, we can even start adding effects like rain to see if our algorithms will hold up in all types of circumstances.
Thanks, Julia. So that was interesting to see how we can visualize different scenarios. And we also learned earlier about the hardware-in-the-loop workflow. So Ronal, I'm really interested how we can actually get these two to work together.
Oh, great, question. So I know we've already talked about hardware-in-the-loop simulation. We can also include a scenario simulation as a part of this larger workflow. So let's take a look at that.
So here you can see I'm running in Unreal Engine to visualize my platform. That's on the bottom left. On the top right, you can see I actually get an image frame. This comes from the camera sensor that's attached to my platform.
Again, the flight controller's running in hardware in the loop. The plant is running in Simulink. And waypoints are coming from QGC. The state of the plant is given to Unreal, and we can read photorealistic sensor data. This can be cameras or LiDARs; whatever sensor information you want can be read from the Unreal scene.
So let's take a look at how we actually got here. So I have three models here. At the top left is my flight controller. To the right is my plant. You've seen both of these.
The new one is in the bottom. This is my Unreal simulation. I'm going to start with the flight controller and go ahead and get it deployed to the Pixhawk. Similarly, I have to unhook it and hook it again so I can reboot it.
If you're using the same controller as before, you can skip this step, as you've already deployed it to your controller, to your actual PX4. So as soon as that is done, you get QGC to launch. Next, let's actually look at our Unreal environment.
You'll see that I have a UDP block, which gives us the state of the plant. This gets sent to Unreal. There's also a camera sensor, which we visualize with a video viewer. And so that's where we get both our Unreal scenario and the image frames that we're seeing.
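The state sent to Unreal is essentially a pose: a translation plus an orientation. As a sketch, the pose could be packed as one fixed-layout datagram; the field order and encoding below are illustrative assumptions, not the actual Simulink/Unreal interface format.

```python
# Sketch of a pose datagram a plant could send to a visualization
# engine: position plus orientation quaternion, packed as seven
# little-endian doubles. Layout is illustrative.
import struct

def pack_pose(x, y, z, qw, qx, qy, qz):
    """Pack a pose as 7 little-endian doubles (56 bytes)."""
    return struct.pack("<7d", x, y, z, qw, qx, qy, qz)

def unpack_pose(data):
    """Recover the (x, y, z, qw, qx, qy, qz) tuple."""
    return struct.unpack("<7d", data)

# Identity orientation at 25 m altitude.
msg = pack_pose(12.0, -3.5, 25.0, 1.0, 0.0, 0.0, 0.0)
print(len(msg))
print(unpack_pose(msg))
```

A fixed binary layout like this is what lets two processes written in different tools agree on the data without any negotiation, which is the role the UDP block plays here.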
So last, coming to our plant model. So it's actually the same plant model, but we're going to add another UDP connection here. That's what's going to connect to our Unreal model.
So inside, it's what we've seen before: the plant dynamics with sensors in it. So the plant model actually acts as an interface to communicate and connect QGC, Unreal, and the PX4 together.
So as soon as I hit Run on this, you can see my plant gets connected. I can confirm my original mission. And I have this awesome visualization with actual sensor data that's really photorealistic. Nice. So the quadcopter model stays the same here entirely.
Now, you did see that I have this entire workflow. And I hope I was able to showcase how we can start by running our flight controller as a model in the loop and then going into software in the loop and then moving finally into deployed state with the hardware. We also quickly showed how we can do a scenario simulation and get photorealistic environmental details from Unreal Engine.
There's actually one more step that shows how you can deploy an autonomy algorithm onto an NVIDIA Jetson and stream some of that camera data, the camera frames, to the Jetson to do your onboard autonomy. There's a link here to an example from our UAV Toolbox that walks through all the steps I showed you today. So make sure to take a look.
That was awesome. So we all learned from this session that going out to fly a drone in a city is surely an exciting idea, but maybe not the best first approach. We saw how the SIL and HIL workflows help you keep it safe by testing the flight behavior in simulation.
And that's where MATLAB and Simulink can help you, with reference models to integrate an external autopilot such as PX4, generate C++ code for autonomy algorithms, and do scenario simulation. If you're interested in looking more into drone analysis with Simscape, there is a relevant workshop in the Expo that you can attend later. And all that we showed today, all the demos, are available out of the box in UAV Toolbox with MATLAB. So I really encourage you all to check out our resources, including the product page and the other resources we have listed down here.
And make sure when you share your MATLAB Expo experience on social media you use the #matlabexpo. And you can reach out to us via the LinkedIn profiles as well. So with that, we would like to thank you for attending this session.